Privacy: Beyond the Four-Digit Passcode


Joey Mink demonstrates outside the Apple store on Fifth Avenue, Tuesday, Feb. 23, 2016, in New York. Protesters assembled in more than 30 cities around the world to lash out at the FBI for obtaining a court order that requires Apple to make it easier to unlock an encrypted iPhone used by a gunman in December’s mass murders in California. (AP Photo/Julie Jacobson)

December 2, 2015, marked the worst terrorist attack on American soil since 9/11. The attackers, married couple Syed Rizwan Farook and Tashfeen Malik, killed 14 people and wounded 22 in a shooting spree at the Inland Regional Center in San Bernardino, California, catalyzing unprecedented measures to strengthen national security. Investigators could only speculate that Farook had been radicalized by international terrorist groups rather than by one within the country. Because of this uncertainty about the couple’s motives, the government was understandably concerned about the still-unknown organizations or individuals who may have influenced the devastating attack.

The couple, killed in a shootout with police, was no longer a potential source of information. With few threads to follow, the FBI naturally hoped that Farook’s iPhone held a lead that could help uncover other terrorist threats. But after changing the iCloud password and staring blankly at a lock screen that allowed only 10 passcode guesses, what were they to do?

They asked the company that made the phone to crack it. Easy.

But it wasn’t that easy for Apple. It was an impressive realization: the biggest tech giant in the world had built security measures so strict into its devices that the company itself could not open the one small iPhone 5C now thrust into the national spotlight.

Privacy grows ever more important as technology improves, and technological solutions, such as encryption, are the only ones fast enough to combat technologically aided attacks on privacy. At this point, a smartphone simply doesn’t get more private than one protected by a unique 256-bit encryption key (think of two hundred fifty-six 0s and 1s in a random sequence) that scrambles and unscrambles the data on the iPhone. That is roughly 10^77 possible combinations. Technically, you can add even more 0s and 1s to the key for an even more impossible challenge, but as is, the encryption is powerful: not even all the world’s computers working together could guess the key for this one iPhone in a lifetime. Even worse (or, as I believe, even better), Apple itself keeps no record of the key. To expedite the process significantly, the FBI chose instead to guess the passcode, which has far fewer combinations by default. However, to avoid wiping the phone completely amid its blind guessing, the FBI demanded that Apple override its own security measures — eliminate the delays between guesses and ensure the device wouldn’t self-destruct. The company flatly refused, recognizing the impact of setting a dangerous precedent.
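To put those numbers in perspective, here is a rough back-of-the-envelope sketch in Python — my own illustration, not anything from the case itself; the guess rates are assumptions chosen purely to make the comparison concrete:

```python
# Why nobody brute-forces the 256-bit key, and why the FBI instead wanted
# the passcode protections disabled. Guess rates here are illustrative
# assumptions, not measured figures.

key_space = 2 ** 256        # possible 256-bit encryption keys, ~1.2 x 10^77
passcode_space = 10 ** 4    # possible 4-digit passcodes: 10,000

# Suppose a billion machines each try a billion keys per second.
guesses_per_second = 1e9 * 1e9
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust_keys = key_space / guesses_per_second / seconds_per_year

print(f"256-bit keys:       {key_space:.2e} possibilities")
print(f"4-digit passcodes:  {passcode_space:,} possibilities")
print(f"Years to try every key: {years_to_exhaust_keys:.1e}")  # ~3.7e+51
```

Even under these wildly generous assumptions, the key space is untouchable; the 10,000-passcode space, stripped of its delays and auto-wipe, would fall in seconds.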

Of course, this isn’t a matter of weighing privacy against security to see which prevails. Rather than the false dichotomy of choosing one over the other, it’s about achieving both without overstepping the bounds of government and inadvertently eroding civil liberties in the process.

However, the high-stakes court case was cut short by the FBI’s revelation on March 28, 2016, that it no longer needed Apple’s help to break into the phone. However anticlimactic the update (with no formal court ruling), the tensions between privacy and security have hardly dissipated. Perhaps the delay of a landmark decision that either protected or hindered American privacy is beneficial, considering the current dynamic of a Supreme Court in limbo. With the perspective of time, and with judgment hinging less upon immediate crises and terrorist threats, the reasonable decision to protect privacy can prevail. Technology evolves so quickly that laws can’t possibly catch up in real time. We should be prescient and anticipate the problems technology poses across all cases, not just the one at hand, all the while keeping privacy at the forefront of our discussions.

Certainly, this is not the first time the FBI has asked Apple for work of this nature. This time, the deliberate choice to turn the request into a national debate in light of a terrorist attack was strategic. If the FBI continues to keep secret the vulnerability in iPhone security it was able to exploit, that would be telling of where its priorities truly lie. Riana Pfefferkorn, cryptography fellow at the Stanford Center for Internet and Society, pointed out, “Now that the F.B.I. has accessed this iPhone, it should disclose the method for doing so to Apple…Apple ought to have the chance to fix that security issue, which likely affects many other iPhones.” The short-term gain for the federal government is bound to be overshadowed by the long-term harm of putting millions of iPhone users’ privacy at risk.

Obviously this case, despite its truncated lifetime, extends beyond iPhone-using terrorists. What does it have to do with the rest of us, who have “nothing to hide”?

Judging from how tech experts have written about encryption, the issue is broader than the valuable information on our devices; it extends to the limitless capacity of our minds and our right to intellectual privacy.

As Phil Zimmermann, the father of Pretty Good Privacy (PGP) encryption, wrote back in the early ’90s, “PGP empowers people to take their privacy into their own hands. There has been a growing social need for it. That’s why I wrote it.” He recognized that we were dealing with technology in a changed world, one in which we assume our email and text communications are held in confidence. Bills mandating backdoors in encryption and point-and-tap wiretapping were bound to be abused by the government, as in years past. If these emerging means of individual communication were not properly protected with a virtual padlock, he reasoned, our democracy would be threatened by a body with so much power scrutinizing our every word.

More than two decades later, the same message echoes in Tim Cook’s personal letter to Apple customers about his company’s stance: “Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.”

Even for those of us understandably skeptical of corporate interests, Apple is clearly thinking ahead about the dangerous precedent this would set. The logic is clear: though information from this particular iPhone could be gathered under a new operating system created by Apple, that new master key could be requested again and again by the government for other cases, each time playing the national security card. Furthermore, in the wrong hands, criminals could use it to crack targeted phones of their own choosing. The ease with which hackers could find this dangerous master key (if it existed) is frightening. Usually, with power comes responsibility, but with unrestricted access to information, accountability is swept aside. It is easy to overlook small breaches of privacy in the name of national security, and by the time we notice how much is gone, it will be too late to reclaim it.