Can the government force private firms to design their products to conform to the needs of law enforcement? And if it does have this power, how much can companies really do to help? These questions are in play in the current legal fight between Apple and the FBI over access to the iPhone issued to the dead San Bernardino terrorist Syed Rizwan Farook by his employer, the San Bernardino County Department of Public Health.
It appears the courts will not resolve these questions in the immediate case. After maintaining that only Apple could penetrate the iPhone, the FBI suspended the proceedings because a third party offered assistance. This is embarrassing to Apple because it suggests the iPhone isn't as secure as believed. But the (possible) vulnerability of the iPhone is also perfectly consistent with the way security and privacy work in computing.
Engineers typically design products to be as secure as they can be, consistent with cost and ease of use. There are many ways to strike this balance, but it's impossible to design a computing device to be absolutely secure. Every device depends on software, and software always has bugs and vulnerabilities.
Device manufacturers such as Apple, Google and Microsoft are also in no position to hand law enforcement, or anyone else, tools that circumvent every form of encryption available to criminals. As a recent Open Technology Institute report (“An Illustrative Inventory of Widely Available Encryption Applications”) shows, many encryption apps are developed independently of the device and operating-system makers.
Device manufacturers are no more helpful to law enforcement in breaking app-based encryption than are the kids who run the local lemonade stand. Moreover, many crypto apps are developed outside the United States, so any attempt by our lawmakers to restrict their use is bound to fail.
The only option for law enforcement is to up its game by developing the skills hackers use to find and exploit vulnerabilities in the apps themselves. But as long as the FBI can force firms such as Apple to do its job for it, it has little incentive to build those skills.
The nation has confronted these questions before. During the Clinton administration, the White House proposed the Clipper Chip, an encryption chipset with a built-in key escrow that would have given law enforcement access to encrypted communications. The proposal was abandoned after cryptographers demonstrated vulnerabilities in the escrow scheme. But we seem doomed to revisit these questions every generation.
Smart people such as Michael Hayden, the former director of both the CIA and the NSA, have pointed out that the nation's interests are aligned with those of computer manufacturers who want to sell secure devices. The nation is only as secure as each of us is in our own sphere of personal privacy. Any legal requirement for backdoors or extraordinary cooperation with the state security apparatus makes us all less secure.
We keep myriad details of our personal lives on our smartphones, and firms hold vast troves of confidential and trade-secret information on their computers. Each of us possesses information we don't want falling into the hands of rivals, enemies and mischief-makers, whether leakers like Edward Snowden or the pesky members of the “Anonymous” collective.
Government should devote more resources to keeping private information secure and fewer to covering up its own shortcomings. The FBI may hope to deflect blame by making Apple look bad, but its erratic handling of the Farook case casts doubt on its competence.
Richard Bennett is a visiting fellow at the American Enterprise Institute's Center for Internet, Communications, and Technology Policy and co-inventor of Wi-Fi.