Every time consumers buy the latest iPhone model or update their devices with the latest iOS software, they become more insulated from attacks by any entity wishing to access their data. The user's passcode (a sequence of digits or alphanumeric characters) or Touch ID fingerprint (read by the sensor in the home button) is used in conjunction with software and hardware to create a mathematical lock on the data on the device. With every upgrade, the lock only gets stronger. Yet as the security on the iPhone better protects users from criminals, it also excels at keeping law enforcement from accessing the data.
This security evolution took center stage last week after a federal court ordered Apple to help the FBI unlock data on an iPhone 5c used by a suspected attacker in the mass shooting in San Bernardino, California. The older iPhone model is less secure than newer models, and it seems like it should be reasonably easy for Apple to hack into the device, but the company is resisting to avoid setting a precedent that would likely require it to hack into more devices.
Apple and the government are at loggerheads over the issue, but it won't really matter over the long term. Any precedent requiring Apple to hack its own devices will likely only last as long as Apple can hack its own devices. The reality is that iPhone security will only get stronger. Absent legislation, law enforcement and criminals alike will have little recourse. The real debate is about whether society wants legislation that weakens iPhone security for law enforcement, and if so, what that legislation should look like. The FBI has pushed for this approach, but the response from the White House and Congress has been tepid. As a society, we will have to weigh the costs and benefits. Besides, what may work in the United States may not work overseas, where Apple does two-thirds of its business.
When it comes to securing the iPhone, two important engineering axioms are at work: It is extremely difficult to secure a device from an attacker who has physical access to it, and what's good for security isn't necessarily good for modifiability.
Apple has gone to great lengths to overcome these engineering challenges. Advances over the last few years have not been contained solely within its iOS mobile operating system but include holistic design changes to the phone's architecture. During manufacturing, Apple lays down its so-called “chain of trust.” The chain of trust requires that chips within the device be embedded with information that cannot be changed and that iOS depends on to function. When the phone is powered on, each step of the boot process verifies the integrity of the next before handing off control, so any change that could indicate tampering is caught. As Apple notes, the chain of trust ensures that “the lowest levels of software are not tampered with and allows iOS to run only on validated Apple devices.”
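The idea behind such a boot-time chain of trust can be sketched in a few lines of code. This is a deliberately simplified illustration, not Apple's actual implementation: it assumes each stage's expected hash is recorded in immutable storage at manufacture, and boot halts the moment any stage fails verification.

```python
# Simplified sketch of a "chain of trust" boot check (hypothetical, not
# Apple's real design). Immutable read-only memory records the expected
# hash of each boot stage; a modified stage fails verification and the
# device refuses to boot.
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of a boot stage's code."""
    return hashlib.sha256(data).hexdigest()

# Factory-installed boot-stage images (stand-ins for real firmware).
bootloader = b"bootloader code"
kernel = b"operating system kernel code"

# Expected hashes, "burned in" at manufacture and unchangeable thereafter.
expected = {
    "bootloader": digest(bootloader),
    "kernel": digest(kernel),
}

def boot(stages: dict) -> bool:
    """Verify every stage against its recorded hash before booting."""
    for name, code in stages.items():
        if digest(code) != expected[name]:
            print(f"Tamper detected in {name}; refusing to boot")
            return False
    print("Chain of trust verified; booting")
    return True

# An untouched device boots; a device with a patched bootloader does not.
boot({"bootloader": bootloader, "kernel": kernel})
boot({"bootloader": b"patched bootloader", "kernel": kernel})
```

The essential property is that the hashes live somewhere software cannot rewrite, so an attacker who modifies any stage of the boot sequence cannot also forge the value it is checked against.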
Security, Meet Modifiability
If a device is well secured, it is hard to change its components, because tampering could compromise its data. Conversely, the easier a device's components are to change, the more modifiable it is. The friction between security and modifiability came to the fore Feb. 5 when Seattle-based law firm Pfau Cochran Vertetis Amala brought a class-action lawsuit against Apple over authorized third-party service providers. The firm said:
“We hope to find out why Apple implements a policy where end users aren't free to choose someone other than Apple to repair their devices. We believe that Apple may be intentionally forcing users to use their repair services, which cost much more than most third-party repair shops. Where you could get your screen replaced by a neighborhood repair facility for $50-80, Apple charges $129 or more. There is incentive for Apple to keep end users from finding alternative methods to fix their products.”
The iPhone users' perspective was nicely summed up by writer Adam Minter on Bloomberg View:
“Imagine if Ford remotely disabled the engine on your new F-150 pickup because you chose to have the door locks fixed at a corner garage rather than a dealership. Sound absurd? Not if you're Apple. Since 2014, the world's most profitable smartphone company has — without warning — permanently disabled some iPhones that had their home buttons replaced by repair shops in the course of fixing a shattered screen. Phones that underwent the same repair at Apple service centers, meanwhile, have continued working just fine. The message seems clear, at least to the multibillion-dollar independent repair industry: Your phone is yours until you decide to get it fixed. Then it's Apple's.”
A primary motivation for this lawsuit is that iPhone owners with broken Touch IDs or screens were visiting third-party repair shops to have the problem fixed. When the owners later updated their iPhone iOS software, their devices stopped working and displayed a mysterious “Error 53” code. The reason for this is obvious — replacing the Touch ID breaks the iPhone's “chain of trust,” so Apple's response was to completely disable the user's tampered iPhone.
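Why a screen or sensor swap would surface only at update time can be illustrated with a small sketch. The pairing check below is hypothetical (the identifiers and function are invented for illustration); it assumes the device records which Touch ID sensor it was paired with at the factory and re-validates that pairing when new software is installed.

```python
# Hypothetical sketch of an "Error 53"-style pairing check, invented for
# illustration. The device stores the identifier of the Touch ID sensor it
# was paired with at manufacture; a software update re-validates the pairing
# and rejects an unrecognized replacement sensor.

PAIRED_SENSOR_ID = "SN-ORIGINAL-0001"  # assumed factory-recorded value

def validate_on_update(installed_sensor_id: str) -> str:
    """Return 'ok' if the installed sensor matches the factory pairing,
    otherwise an error code indicating the chain of trust is broken."""
    if installed_sensor_id != PAIRED_SENSOR_ID:
        return "error 53"  # unverified replacement part detected
    return "ok"

print(validate_on_update("SN-ORIGINAL-0001"))    # original sensor passes
print(validate_on_update("SN-THIRDPARTY-9999"))  # third-party swap fails
```

The check does nothing day to day, which is why a repaired phone works fine until the next iOS update triggers the re-validation.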
For those iPhone owners without a nearby Apple store or authorized service provider — there are 481 stores in 18 countries — the only option was to purchase a new phone.
Apple's increased iPhone security means it must closely control how modifiable the device is. The company must assume that any tampering, particularly with security-critical components such as the Touch ID, can potentially compromise data on the device.
In the middle of the legal battle between Apple and the FBI, an interesting development occurred. On Feb. 18, Apple issued a patched version of iOS 9.2.1 that restores devices rendered useless by “Error 53” while disabling Touch ID's fingerprint capability. The sensor can still be used as a home button but no longer as a fingerprint reader to unlock the device. One speculation was that Apple was wary of appearing hypocritical, arguing for personal privacy through security while simultaneously using security to make users pay high prices for repairs, or even to buy new iPhones. Apple has also offered to pay for out-of-warranty device replacement “based on an error 53 issue.”
The auto industry has long been dealing with the issue of “right to repair,” which would allow car owners to get their vehicles repaired wherever they want, not only at a car dealer. In 2014, an agreement was reached that all auto companies would make their diagnostic codes and repair data available in a common format by the 2018 model year, according to Automotive News. In return, lobbying groups for repair shops and parts retailers are supposed to refrain from pursuing state-by-state legislation.
Sharing diagnostic codes is one thing, but sharing encryption codes? Impossible.
Modifiability, Meet Security
As manufacturers continue to connect devices to the Internet, security, which has mostly been an afterthought, is improving. The same privacy protections Apple is now encumbered with regarding the iPhone will organically extend to other devices that also collect personal data, such as automobiles, refrigerators, televisions and Internet routers. Will all of these device manufacturers (many of them not based in the U.S.) require their own version of the Apple Store, or will the right to repair win out? Can a balance be struck? Striking a balance through technology alone is challenging, as Apple is undoubtedly learning.
Bradley Wilson is an information systems analyst at the nonprofit, nonpartisan RAND Corporation.
This commentary originally appeared on U.S. News & World Report on February 25, 2016. Commentary gives RAND researchers a platform to convey insights based on their professional expertise and often on their peer-reviewed research and analysis.