# Putting differential privacy into practice to use data responsibly

Data can help businesses, organizations and societies solve difficult problems, but some of the most useful data contains personal information that can’t be used without compromising privacy. That’s why Microsoft Research spearheaded the [development of differential privacy](https://link.springer.com/chapter/10.1007%2F11681878_14), which safeguards the privacy of individuals while making useful data available for research and decision making. Today, I am excited to share some of what we’ve learned over the years and what we’re working toward, as well as to announce a new name for our open source platform for differential privacy – a major part of our commitment to collaborate around this important topic.
Differential privacy consists of two components: statistical noise and a privacy-loss budget. Statistical noise masks the contribution of individual data points within a dataset while keeping aggregate results accurate, and the privacy-loss budget tracks how much information has been revealed across queries, so that combinations of queries can’t inadvertently expose private information.
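As a rough illustration of those two components, here is a minimal Python sketch, assuming a Laplace mechanism for counting queries and a simple additive budget; the function and class names are illustrative, not SmartNoise APIs.

```python
import numpy as np

rng = np.random.default_rng()

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Release a noisy answer: Laplace noise scaled to sensitivity / epsilon."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

class PrivacyBudget:
    """Track cumulative privacy loss and refuse queries once the budget is spent."""
    def __init__(self, total_epsilon):
        self.total = total_epsilon
        self.spent = 0.0

    def charge(self, epsilon):
        if self.spent + epsilon > self.total:
            raise RuntimeError("privacy budget exhausted")
        self.spent += epsilon

# A counting query has sensitivity 1: one person changes the count by at most 1.
budget = PrivacyBudget(total_epsilon=1.0)
budget.charge(0.1)
noisy_count = laplace_mechanism(true_value=1200, sensitivity=1, epsilon=0.1)
```

Each query spends part of the budget; once it is exhausted, no further answers are released.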
Since differential privacy was created, Microsoft has conducted research and developed and deployed technologies with the goal of enabling more people to participate in, contribute to and benefit from differential privacy. Last year, we partnered with Harvard’s Institute for Quantitative Social Science (IQSS) and School of Engineering and Applied Sciences (SEAS) to announce the [OpenDP Initiative](https://projects.iq.harvard.edu/opendp), and earlier this year released the initial version of our open source platform. We chose to develop differential privacy technologies in the open to enable increased participation in the creation of tools that empower a larger group of people to benefit from differential privacy.
## Introducing SmartNoise

In June, [we announced](https://projects.iq.harvard.edu/opendp/blog/building-inclusive-community) that we would be renaming our open source platform to avoid any potential misunderstanding of our intentions for this project and the community. Language and symbols matter, especially when you are trying to build an inclusive community and responsibly enable AI systems.

I’m thrilled to share that this platform will be renamed SmartNoise. The SmartNoise Platform, [powered by OpenDP](https://privacytools.seas.harvard.edu/opendp), captures an essential step in the differential privacy process and follows best practices of renaming terms like whitelist and blacklist to allowlist and blocklist.

By using SmartNoise, researchers, data scientists and others will be able to derive new and deeper insights from datasets that have the potential to help solve the most difficult societal problems in health, the environment, economics and other areas.
## How we’re using SmartNoise and differential privacy today at Microsoft

As we apply differential privacy to our own products and begin to work with customers to do so, we’re learning a lot about what works and what we need to explore further.
Our first production use of differential privacy in reporting and analytics at Microsoft was [in Windows](https://www.microsoft.com/en-us/research/publication/collecting-telemetry-data-privately/), where we added noise to users’ telemetry data, enabling us to understand overall app usage without revealing information tied to a specific user. This aggregated data has been used to identify possible issues with applications and improve user experience.
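That deployment works in the local model: each device randomizes its own report before it ever leaves the machine. The mechanism described in the linked paper is more elaborate than this, but the classic randomized-response sketch below (Python, illustrative names only) shows the basic idea of noising per-device telemetry and then debiasing the aggregate.

```python
import math
import random

def randomized_response(uses_app: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1), otherwise flip it,
    so no single report reveals whether this user actually ran the app."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return uses_app if random.random() < p_truth else not uses_app

def estimate_usage_rate(reports, epsilon):
    """Debias the aggregated noisy reports to recover the overall usage fraction."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```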
Since then, we’ve applied differential privacy in similar ways to understand data that benefits our customers and helps us improve our products. We’ve learned that differential privacy works best when a query or dataset with a limited set of computations is refreshed on an ongoing basis: the up-front work of applying differential privacy pays off because you can optimize it once and then reuse that work. An example of this is the Insights for People Managers within [Workplace Analytics](https://docs.microsoft.com/en-us/workplace-analytics/privacy/differential-privacy#:~:text=Differential%20privacy%20offers%20a%20balance,privacy%20while%20simultaneously%20maintaining%20accuracy.). These insights enable managers to understand how the people in their team are doing and to learn how to drive change by using aggregated collaboration data, without sharing any information about individuals.
An application of differential privacy with limited parameters, but one that enables interactivity, is [advertiser queries on LinkedIn](https://arxiv.org/abs/2002.05839). Advertisers can get differentially private answers to their top-k queries (where k is a number representing how many answers the advertiser wants from the query). Each advertiser is allotted a limited number of queries, which helps to ensure that multiple queries can’t be combined to deduce private information. So, for example, an advertiser could find out which articles are being read by software engineers or employees of a particular company, but wouldn’t be able to determine which individual users were reading them.
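The LinkedIn system described in the paper uses its own carefully calibrated mechanism; as a simplified sketch of the two ideas in play, noisy top-k plus a per-advertiser query allowance, something like the following Python (illustrative names) captures the shape of it.

```python
import numpy as np

rng = np.random.default_rng()

def private_top_k(counts, k, epsilon):
    """Add Laplace noise to each candidate's count and return the k highest.
    Sensitivity is 1 when each member contributes to at most one candidate."""
    noisy = {item: c + rng.laplace(scale=1.0 / epsilon) for item, c in counts.items()}
    return sorted(noisy, key=noisy.get, reverse=True)[:k]

class QueryAllowance:
    """Cap how many queries one advertiser may run, so answers can't be stacked
    to single out individual members."""
    def __init__(self, max_queries):
        self.remaining = max_queries

    def top_k(self, counts, k, epsilon):
        if self.remaining <= 0:
            raise RuntimeError("query allowance exhausted")
        self.remaining -= 1
        return private_top_k(counts, k, epsilon)

allowance = QueryAllowance(max_queries=100)
print(allowance.top_k({"article A": 530, "article B": 210, "article C": 45}, k=2, epsilon=0.5))
```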
Another key application area for differential privacy is in machine learning, where the goal is to produce a machine learning model that protects the information about the individual data points in the training dataset.

For example, in [Office suggested replies](https://aka.ms/SuggestedRepliesMay2020), we use differential privacy to narrow the set of responses to ensure that the model doesn’t learn from any replies that might violate an individual user’s privacy.
During the training of a machine learning model, the training algorithm can add differentially private noise and manage the privacy budget across iterations. These algorithms often take longer to train, and often require tuning for accuracy, but this effort can be worth it for the more rigorous privacy guarantees that differential privacy enables.
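The post doesn’t name a specific training algorithm; the standard approach is DP-SGD, where each example’s gradient is clipped and Gaussian noise is added before the update, with the noise scale feeding into the budget accounting across iterations. A minimal sketch for logistic regression, with illustrative names, looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(weights, X_batch, y_batch, lr, clip_norm, noise_multiplier):
    """One DP-SGD step: clip each per-example gradient, sum, add Gaussian noise, average."""
    grads = []
    for x, y in zip(X_batch, y_batch):
        pred = 1.0 / (1.0 + np.exp(-x @ weights))          # logistic regression output
        g = (pred - y) * x                                  # per-example gradient
        g = g / max(1.0, np.linalg.norm(g) / clip_norm)     # clip to bound each person's influence
        grads.append(g)
    noise = rng.normal(scale=noise_multiplier * clip_norm, size=weights.shape)
    noisy_grad = (np.sum(grads, axis=0) + noise) / len(X_batch)
    return weights - lr * noisy_grad
```

The total privacy cost of a training run is then accounted across all steps, which is where the budget management mentioned above comes in.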
To take this scenario further, we are also exploring the potential for synthetic data in machine learning, which is currently only an option if we know the specific task or question the algorithm needs to understand. The idea behind synthetic data is that it preserves all the key statistical attributes of a dataset but doesn’t contain any actual private data. Using the original dataset, we would apply a differential privacy algorithm to generate synthetic data specifically for the machine learning task. This means the model creator doesn’t need access to the original dataset and can instead work directly with the synthetic dataset to develop their model. The synthetic data generation algorithm can use the privacy budget to preserve the key properties of the dataset while adding more noise in less essential places.
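Real synthesizers are far more sophisticated, but the simplest version of the idea, sketched here for a single categorical column with illustrative names, is to fit a noisy histogram under the budget and sample synthetic records from it, so the model creator never touches the original rows.

```python
import numpy as np

rng = np.random.default_rng()

def dp_synthetic_column(values, categories, epsilon, n_synthetic):
    """Build a Laplace-noised histogram over the original column, then draw
    synthetic values from it; the raw records never leave this function."""
    counts = np.array([np.sum(values == c) for c in categories], dtype=float)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    probs = np.clip(noisy, 0.0, None)
    probs = probs / probs.sum()          # assumes at least one positive noisy count
    return rng.choice(categories, size=n_synthetic, p=probs)

original = np.array(["health", "finance", "health", "education", "health"])
synthetic = dp_synthetic_column(original, ["health", "finance", "education"],
                                epsilon=1.0, n_synthetic=100)
```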
## SmartNoise and differential privacy going forward

We have learned so much about differential privacy, and we’re only scratching the surface of what’s possible – and starting to understand the barriers and limitations that exist.
We continue to make investments in our tools, develop new ones and innovate with new practices and research. On the technical side, there are a few areas we will pursue further. Most production applications are using a known limited set of computations, so we’ll have to go further in making differential privacy work well for a larger set of queries. We will further enable interactivity, which means dynamically optimizing so queries work well without hand-tuning. We will develop a robust budget tracking system that would allow many different people to use the data. And we will adopt security measures that would allow an untrusted analyst to query and use the data without having full access to the dataset.
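As a toy illustration of what a shared budget tracking system might look like (all names here are hypothetical, not SmartNoise APIs): one dataset-wide epsilon divided into per-analyst allocations, with every spend logged for audit.

```python
class SharedBudgetLedger:
    """Toy ledger: a dataset-wide epsilon split into per-analyst allocations,
    with every charge recorded so it can be audited later."""
    def __init__(self, total_epsilon, allocations):
        if sum(allocations.values()) > total_epsilon:
            raise ValueError("allocations exceed the dataset's total budget")
        self.remaining = dict(allocations)
        self.log = []

    def charge(self, analyst, epsilon, query_description):
        if self.remaining.get(analyst, 0.0) < epsilon:
            raise RuntimeError(f"{analyst} has insufficient budget remaining")
        self.remaining[analyst] -= epsilon
        self.log.append((analyst, epsilon, query_description))

ledger = SharedBudgetLedger(total_epsilon=1.0, allocations={"alice": 0.6, "bob": 0.4})
ledger.charge("alice", 0.1, "mean collaboration hours per week")
```

How those allocations should be decided in the first place is exactly the kind of policy question raised below.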
There are also policy, governance and compliance questions that need to be addressed. For example, if we are allocating budget for a dataset across a diverse set of users and potential projects, how do we decide how much budget each researcher accessing the data gets? Going forward, we will strive to answer these important questions with the help of the open source differential privacy community.

And synthetic data is a particularly exciting area for exploration because anyone could access and use it without privacy ramifications. However, there are still many research questions on how to effectively implement differential privacy – while still providing accurate results – when we don’t know what the analysis will look like in advance.

Many questions remain, and we know we will need help from the community to answer them. With the OpenDP Initiative and SmartNoise project, we announced our commitment to developing differential privacy technologies in the open to enable more people to participate and contribute, and we look forward to collaborating with and learning from all of you.

Gary King, director of the Institute for Quantitative Social Science at Harvard, had this to say: “We created OpenDP to build a far more secure foundation for efforts to ensure privacy for people around the world. We are proud to release SmartNoise with Microsoft and hope to build an active and vibrant community of researchers, academics, developers and others to fully unlock the power of data to help address some of the most pressing challenges we face.”

If you want to get involved in OpenDP and SmartNoise, find us on [GitHub](https://github.com/opendifferentialprivacy/). We will also continue to openly share our technical and non-technical learnings as we deploy differential privacy in production across the company.

*[Sarah Bird](https://www.microsoft.com/en-us/research/people/slbird/) is a principal program manager and the Responsible AI lead for Cognitive Services within Azure AI. Sarah’s work focuses on research and emerging technology strategy for AI products in Azure. Sarah works to accelerate the adoption and impact of AI by bringing together the latest innovations in research with the best of open source and product expertise to create new tools and technologies.*