Tim Cook’s biggest legacy: keeping encryption alive.
One of the most important precedents set by Apple during Tim Cook’s tenure was undoubtedly maintaining security and privacy as a pillar of the company. The lengths to which Apple goes to make sure its customers’ devices and information are safe and secure exceed what many other companies do.
Apple has set the bar, but that bar came under attack in one of the biggest showdowns between a tech company and the U.S. government:
The FBI vs. Apple incident of 2016
In December 2015, a terrorist attack in San Bernardino, California left 14 people dead and 22 injured. The attackers, Syed Rizwan Farook and his wife, were eventually killed in a shootout with U.S. authorities. Farook had been using an iPhone, and the FBI wanted to bypass its passcode to gather more information. After failing to unlock the device themselves, and fearing that too many unsuccessful attempts might trigger its auto-erase protection and wipe the data, they asked Apple to make an unprecedented move: create a custom backdoor to iOS that would give the FBI full access to the device.
In other words, break device encryption completely by creating an unsecured version of iOS.
Tim Cook declined to make a custom version of iOS with a “backdoor key” because it would fundamentally put every single iPhone user at risk. Once Apple set that precedent, all other phone makers would have to follow suit, leading to the demise of encryption, security, and privacy as we know it.
Tim Cook and Apple formally addressed this issue in a Customer Letter released on February 16, 2016:
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. […]
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them. […]
The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.
Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.
The question is: if the terrorist had used an Android phone, what would the outcome have been? Would Google have allowed a backdoor? I believe the answer is still not obvious today, even though privacy has become far more valued over the past decade; in 2016, Apple was the best company to have been in that position.
At the time, Sundar Pichai did support Apple’s decision, though with a series of tweets rather than a formal letter:
The long wait for Google's response, and the extremely careful wording of Pichai's statements, hint at the difficult position Google now finds itself in with this issue. You can almost hear the PR and legal departments laboring for hours on whether and how to respond.
In the end, Google chose to put out a statement in a series of semi-formal tweets from a top executive, rather than releasing an official press release, blog post or open letter similar to Cook's. Likewise, Microsoft, Facebook and other technology giants mostly stayed quiet throughout that first day and let an independent coalition they belong to speak on their behalf.
It wasn't until more than 24 hours later that Facebook and Twitter put out statements of their own -- and Twitter, like Google, only did so through its CEO's Twitter account.
Sundar’s Tweets:
1/5 Important post by @tim_cook. Forcing companies to enable hacking could compromise users’ privacy
2/5 We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism
3/5 We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders
4/5 But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent
5/5 Looking forward to a thoughtful and open discussion on this important issue
Apple has shipped a multitude of major products under Tim Cook, such as the Apple Watch, AirPods, and even Apple Silicon, all of which have changed the game. But none of that would matter if we didn’t have encryption, privacy, and security.