A recent court order against Apple compels the company to help the FBI gain access to a locked iPhone 5C used by San Bernardino terrorist Syed Farook.
While Apple has helped law enforcement access data on iPhones on at least 70 prior occasions, new security protections make it increasingly difficult to do so on newer models like Farook’s. To strengthen the security of its devices, Apple designed newer iPhones with encryption that no one can break, including Apple itself, because the passcode that unlocks the device is known only to the user.
New iPhones have features designed to foil thieves and cybercriminals, such as forcing increasingly long delays between failed passcode attempts and an auto-erase function that can wipe the device clean after 10 failed attempts.
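To make these two protections concrete, here is a minimal toy model in Python. The delay schedule (doubling after each failure) and the class name are invented for illustration; Apple's actual implementation is enforced in hardware and firmware and is not public in this form.

```python
import time

class LockedDevice:
    """Toy model of a passcode-protected phone: escalating delays
    between failed attempts, and a wipe after 10 failures.
    (Hypothetical illustration, not Apple's real mechanism.)"""

    def __init__(self, passcode):
        self._passcode = passcode
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess, sleep=time.sleep):
        if self.wiped:
            return False          # data is gone; no guess can succeed
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= 10:
            self.wiped = True     # auto-erase after 10 failed attempts
        else:
            # delay doubles with each failure (invented schedule)
            sleep(2 ** self._failures)
        return False
```

Even this crude version shows why guessing is hopeless: the delays alone make each attempt slower and slower, and the tenth wrong guess destroys the data outright.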
The court order would require Apple to create a new program, which Apple says does not exist, to be uploaded to the iPhone to disable the auto-erase function and other passcode protections.
This means that the FBI could then try every passcode combination (1111, 1112, 1113, etc.) in rapid succession until it guesses the correct passcode, a process known as “brute-forcing” the code.
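A brute-force search is simple to sketch. The snippet below (an illustration, not the FBI's actual tooling) tries every 4-digit code in order; with the delay and auto-erase protections disabled, there are only 10^4 = 10,000 possibilities, so the search finishes almost instantly. The `check_passcode` callback stands in for the device's passcode check.

```python
import itertools

def brute_force(check_passcode, length=4):
    """Try every numeric passcode of the given length in order.
    Assumes no delays or auto-erase, i.e. the protections the
    court order asks Apple to disable."""
    for digits in itertools.product("0123456789", repeat=length):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

secret = "7294"                              # example passcode
found = brute_force(lambda g: g == secret)   # recovered in a fraction of a second
```

Against the intact protections, the same loop would trigger the auto-erase on its tenth guess, which is exactly why the FBI needs them switched off first.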
The ‘Going Dark’ Problem
Encryption is a great thing for ordinary consumers. Everything on your iPhone (emails, texts, health and credit information, photos, and so on) stays secure unless the correct passcode is entered.
But this strong encryption can also be used for ill. Criminals and terrorists enjoy the same protections that innocent consumers benefit from, and they use them to shield their information and communications from the law enforcement and intelligence agencies trying to stop them. This is part of the so-called “going dark” problem, and it is why the Department of Justice has asked that Apple be forced to help the FBI access Farook’s iPhone.
If only it were this simple. Unfortunately, there are several unintended consequences of such a policy.
First, if the U.S. can force Apple to do this, other countries, including less friendly ones, will start demanding that phone makers provide them with a way around encryption as well, and in some of those countries the workaround will not be used for good.
Second, what happens if this new passcode-cracking software falls into the wrong hands? This software (and subsequent versions for newer iPhones or other devices) could be used to disable protections on many devices. Millions of devices go from being very secure to very vulnerable. Consumer cybersecurity and privacy would take a huge blow, as would the reputation of Apple and other tech companies.
This leads to the third side effect: an encryption arms race. If Apple creates a program for the FBI that breaks down passcode protections, nothing stops Apple from making those protections harder, or even impossible, to remove in its next operating system update or its next iPhone. Indeed, the passcode-delay protection appears impossible to remove on the newest Apple products. Alternatively, bad actors could switch to third-party apps like Telegram that encrypt their communications, and the companies behind those apps are unlikely to help the FBI, especially since many are foreign.
(As a quick aside, some have pointed out that Apple isn’t always being consistent or noble in what it is arguing. While this may be the case, it does not alter the concerns laid out here.)
Moving forward, Apple is challenging the order, which was worded in such a way that it is clear the judge recognized how controversial this issue is and all but told Apple that it should appeal.
What Should We Do?
While some are quick to bash the FBI or Apple, it is important to realize that two important security priorities are being debated here. Essentially, we want to stop terrorists, but we also want Americans’ data to be secure.
At The Heritage Foundation, we catalog all the Islamist terror plots and attacks against the U.S. homeland. Together with threat assessments developed by the U.S. intelligence community noting the severity of the threat to the U.S. homeland, it is clear that stopping terrorism is and must be a priority.
But since current efforts to bypass or backdoor encryption carry major negative consequences, Congress should not support them unless a solution can be developed that gives law enforcement access to information without introducing serious vulnerabilities into Americans’ computers and phones.
It is not clear that this is technically feasible, but Congress, the technology community, and the administration should have an honest conversation about the security risks of special access and workarounds, so that everyone is working from the same set of technical facts.
From that starting point, Congress should look closely at different alternatives for how law enforcement might safely gain access to encrypted data. While the United States should set a high bar for special access or a workaround, we should also provide law enforcement and the intelligence community with all other lawful tools to track and prevent threats to the United States.
David Inserra specializes in cyber and homeland security policy, including protection of critical infrastructure, as policy analyst in The Heritage Foundation’s Allison Center for Foreign Policy Studies. Copyright The Daily Signal. This article was previously published on The Daily Signal.
Views expressed in this article are the opinions of the author and do not necessarily reflect the views of The Epoch Times.