Cracking the Crypto War


Ray Ozzie in his backyard, overlooking Manchester Bay in Massachusetts.

Ray Ozzie thinks he has an approach for accessing encrypted devices that satisfies both law enforcement and privacy purists.

Credit: Cole Wilson

On December 2, 2015, a man named Syed Rizwan Farook and his wife, Tashfeen Malik, opened fire on employees of the Department of Public Health in San Bernardino, California, killing 14 people and injuring 22 during what was supposed to be a staff meeting and holiday celebration. The shooters were tracked down and killed later in the day, and FBI agents wasted no time trying to understand the motivations of Farook and to get the fullest possible sense of his contacts and his network. But there was a problem: Farook's iPhone 5c was protected by Apple's default encryption system. Even when served with a warrant, Apple did not have the ability to extract the information from its own product.

The government obtained a court order demanding, essentially, that Apple create a new version of its operating system that would enable it to unlock that single iPhone. Apple defended itself, with CEO Tim Cook framing the request as a threat to individual liberty.

"We have a responsibility to help you protect your data and protect your privacy," he said in a press conference. Then-FBI chief James Comey reportedly warned that Cook's attitude could cost lives. "I just don't want to get to a day where people look at us with tears in their eyes and say, 'My daughter is missing and you have her cell phone—what do you mean you can't tell me who she was ­texting before she disappeared?' " The controversy over Farook's iPhone reignited a debate that was known in the 1990s as the Crypto Wars, when the government feared the world was "going dark" and tried—and ultimately failed—to impede the adoption of technologies that could encode people's information. Only this time, with super­computers in everybody's pockets and the endless war on terror, the stakes were higher than ever.

A few months after the San Bernardino shooting, President Obama sat for an interview at the South by Southwest conference and argued that government officials must be given some kind of shortcut—or what's known as exceptional access—to encrypted content during criminal and antiterrorism investigations. "My conclusion so far is that you cannot take an absolutist view on this," he said. "If the tech community says, 'Either we have strong, perfect encryption or else it's Big Brother and an Orwellian world'—what you'll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed, and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties."

In typical Obama fashion, the president was leaning toward a compromise, a grand bargain between those who insist that the NSA and FBI need all the information they can get to monitor potential terrorists or zero in on child abusers and those who believe building any sort of exceptional access into our phones would be a fast track to a totalitarian surveillance state. And like so many of Obama's proposed compromises, this one went nowhere. To many cryptographers, there was simply no way that companies like Apple and Google could provide the government with legal access to customer data without compromising personal privacy and even national security. Exceptional access was a form of technology, after all, and any of its inevitable glitches, flaws, or bugs could be exploited to catastrophic ends. To suggest otherwise, they argued, was flat wrong. Flat-Earth wrong. Which was, as any good engineer or designer knows, an open invitation for someone to prove them wrong.

 

From Wired