
Communications of the ACM

Practice

Differential Privacy: The Pursuit of Protections by Default


Figure: digital human face in profile, illustration. Credit: Andrij Borys Associates, Shutterstock

Over the past decade, calls for better measures to protect sensitive, personally identifiable information have blossomed into what politicians like to call a "hot-button issue." Certainly, privacy violations have become rampant, and people have grown keenly aware of just how vulnerable they are. When it comes to potential remedies, however, proposals have varied widely, leading to bitter, politically charged arguments. To date, what has chiefly come of this is a set of bureaucratic policies that satisfy almost no one and infuriate many.

Now, into this muddled picture comes differential privacy. First formalized in 2006, it is an approach based on a mathematically rigorous definition of privacy, one that makes it possible to state and prove the guarantees a system offers against re-identification. While differential privacy has been accepted by theorists for some time, its implementation has turned out to be subtle and tricky, and practical applications are only now starting to become available. To date, differential privacy has been adopted by the U.S. Census Bureau, along with a number of technology companies, but what this means, and how these organizations have implemented their systems, remains a mystery to many.
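The formal definition referred to above is the standard one: a randomized mechanism M is ε-differentially private if, for any two datasets D and D' that differ in a single record and any set of possible outputs S, Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D') ∈ S]. As a minimal sketch of how such a guarantee can be met in practice, the following Python fragment implements the Laplace mechanism for a counting query; the dataset, function names, and choice of ε are illustrative assumptions, not taken from the article or from any particular deployment.

    import numpy as np

    def laplace_count(records, predicate, epsilon):
        # A counting query has sensitivity 1: adding or removing one record
        # changes the true count by at most 1, so Laplace noise with scale
        # 1/epsilon yields an epsilon-differentially private answer.
        true_count = sum(1 for r in records if predicate(r))
        noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
        return true_count + noise

    # Hypothetical example: privately estimate how many ages exceed 40.
    ages = [23, 45, 31, 67, 52, 38, 29, 41]
    print(laplace_count(ages, lambda age: age > 40, epsilon=0.5))

Smaller values of ε mean more noise and a stronger guarantee; much of the subtlety the article alludes to lies in managing that trade-off, and the cumulative privacy budget, across many queries over the same data.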


Comments


John Canessa

Looks interesting. That said, it appears that the problem remains: the data is still collected, stored, and sometimes made available to third parties for mutual business gain. Such data at rest could be compromised by cyberattacks. I fully understand that things become more complicated when using medical data; researchers in most cases need valid data without random injections. In the case of advertisements to potential customers, the data could be valuable when sold, and it seems the companies collecting it will not use the techniques described in the article, in order to better target people. BTW, I enjoyed reading the article. Hopefully research will continue, better techniques will be found, and they will become the norm for organizations that collect and use private data.


