The past couple of days have led to an escalation of the debate over government powers and encryption. A judge ordered Apple to help the FBI unlock the iPhone belonging to one of the San Bernardino shooters. In response, Tim Cook, CEO of Apple, wrote an open letter stating that,
the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Reading up on the details that are available, it looks like Cook is technically wrong (some online commentators have gone so far as to label his letter “bombastic”). This arstechnica.com article gives an excellent explanation of why the FBI is not asking for an encryption backdoor, which is what a lot of people assume the latest ruckus is about. Rather, the FBI is asking that Apple get rid of the rate-limiting and data wipe features found on iPhones. Apparently, the process or software for doing so does not currently exist – but, the implication is, only because Apple won’t create it.
Rate-limiting is a general term for controlling how often something can be done. Twitter, for example, has a “per hour limit” for calls to their API. Same goes for Google Maps and their API – you can only look up so many locations before Google cuts you off. When it comes to iPhones, you’ve probably noticed that after several wrong passcodes, you have to wait a minute before you can punch in your passcode again. This waiting period increases with subsequent wrong tries.
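As a rough illustration, that kind of escalating lockout can be sketched in a few lines of Python. The delay schedule below is made up for the example – Apple’s actual timings and thresholds differ:

```python
import time

# Hypothetical lockout schedule: after N failed attempts, wait this many
# seconds before the next try is allowed. Illustrative numbers only.
LOCKOUT_SCHEDULE = {4: 60, 5: 300, 6: 3600}

class RateLimiter:
    """Toy passcode rate limiter with escalating delays."""

    def __init__(self):
        self.failed_attempts = 0
        self.locked_until = 0.0

    def record_failure(self, now=None):
        """Register a wrong passcode; may start a lockout period."""
        now = time.time() if now is None else now
        self.failed_attempts += 1
        delay = LOCKOUT_SCHEDULE.get(self.failed_attempts)
        if delay:
            self.locked_until = now + delay

    def can_try(self, now=None):
        """True if the lockout period (if any) has elapsed."""
        now = time.time() if now is None else now
        return now >= self.locked_until
```

The point of the design is that the cost of guessing grows with each failure: three wrong tries cost nothing, the fourth costs a minute, and so on – which is exactly what makes exhaustive guessing impractical on the device itself.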
The data wipe, assuming it’s turned on, will delete an iPhone’s entire contents after the wrong password is punched in ten times. It’s similar to what AlertBoot does for their laptop full disk encryption: delete the encryption key, making the contents of the device impossible to recover.
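This “crypto-erase” trick – destroy the key, not the data – can be shown with a toy example. The stream cipher below (SHA-256 in counter mode over a random key) is for demonstration only and is not real production cryptography, and all the names are made up:

```python
import os
from hashlib import sha256

class EncryptedStore:
    """Toy store that encrypts data and can wipe itself by key destruction."""

    def __init__(self):
        self.key = os.urandom(32)  # random 256-bit key
        self.ciphertext = b""

    def _keystream(self, n):
        # Derive a keystream of n bytes from the key (toy counter mode).
        out = b""
        counter = 0
        while len(out) < n:
            out += sha256(self.key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return out[:n]

    def write(self, plaintext):
        self.ciphertext = bytes(
            a ^ b for a, b in zip(plaintext, self._keystream(len(plaintext)))
        )

    def read(self):
        return bytes(
            a ^ b for a, b in zip(self.ciphertext, self._keystream(len(self.ciphertext)))
        )

    def wipe(self):
        # Destroy only the key. The ciphertext is untouched, but without
        # the key it is unrecoverable -- a fast, effective "data wipe".
        self.key = None
```

Note that `wipe()` is instantaneous no matter how much data is stored, which is why key destruction is the standard way to implement remote or tamper-triggered wipes.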
With these two security features removed, the FBI can go about business as usual when faced with encrypted data and an uncooperative suspect: brute force the password – that is, for a four-digit PIN, start from 0000 and go all the way to 9999. For proper passwords, it takes much longer, but the concept is the same.
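With no rate limit and no wipe, the attack is a simple loop. A minimal sketch, assuming the “phone” stores a salted PBKDF2 hash of the PIN (the function names, salt, and iteration count are all invented for the example – the count is kept low here so the demo runs quickly):

```python
from hashlib import pbkdf2_hmac

ITERATIONS = 100  # deliberately low for the demo; real systems use far more

def derive(pin, salt):
    """Hash a PIN with PBKDF2-HMAC-SHA256 (toy parameters)."""
    return pbkdf2_hmac("sha256", pin.encode(), salt, ITERATIONS)

def brute_force(target_hash, salt):
    """Try every four-digit PIN until one matches the stored hash."""
    for candidate in range(10_000):
        pin = f"{candidate:04d}"
        if derive(pin, salt) == target_hash:
            return pin
    return None  # no four-digit PIN matched
```

Ten thousand candidates is nothing for a computer – which is exactly why the rate limit and the ten-strikes wipe exist in the first place.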
Why isn’t this an encryption backdoor? Because it doesn’t affect the encryption itself. Fort Knox’s gold bullion vaults do not become less secure just because you call off the two guys guarding them – but the overall security does.
And in that sense, Tim Cook is not so off the mark in calling the FBI’s requests a backdoor. It’s just another way of saying that the FBI has asked for Apple to compromise the security of their products.
Incidentally, there are reports – educated guesses, really – that what the FBI is asking for wouldn’t work on iPhone 5s models and newer. (The San Bernardino shooter’s smartphone is an iPhone 5c). But the heart of the matter, as many security professionals have pointed out, is not backdoors or the rate limit or the data wipe.
The courts ordering Apple to help the FBI sets a troubling precedent, not only in the US but abroad as well. These concerns have been covered here and there:
- Apple is “being forced to become an agent of law enforcement” (nytimes.com). Basically, the FBI is marshaling Apple’s resources for their own ends.
- The existence of a tool that creates a backdoor means other countries will compel companies to do the same. Or perhaps even more. We’ve already seen this happen with Blackberry in Pakistan.
- The government is trying to use the All Writs Act as catchall legislation. There’s a reason why general warrants are not allowed in the US. Hint: it was one of the reasons for the 13 colonies revolting.
There are other concerns as well, just as troublesome but apolitical. Apple could lose customers if they carry out the order. A big part of the “Apple halo” comes from the fact that their systems are much more secure than the competition’s. And the software Apple creates for the FBI could eventually make its way out into the public – remember, hacks of government networks are old news.
You Gotta Hand It to the FBI
Come what may, you’ve got to admit that the FBI is making great legal inroads into the nexus of the digital and physical. I was looking over the Jay Michaud case from earlier this month, which involved a Washington state public school employee arrested for possessing child porn. The FBI ultimately tracked him down by hacking his Tor browser and getting his real IP address. Michaud’s lawyers unsuccessfully argued that the warrant wasn’t valid.
What strikes me most about the Michaud case is how it compares to an earlier, similar case from 2013. When the two cases are compared, the underlying challenges to the FBI are nearly identical. Yet in the 2013 case, the judges wouldn’t sign off on the warrant; in Michaud, not only was the warrant granted, a judge concluded after the fact that everything was above board (well, almost everything: he noted some technical violations, but ruled that, taken as a whole, the FBI wasn’t in the wrong).
Without going into details, in the 2013 case, the FBI was asking for pretty much all data inside a computer plus the ability to turn on a computer’s video camera without knowing where it was or who was using it. It was a ham-fisted overreach and egregious privacy intrusion if there ever was one.
The FBI got smart. In the Michaud case, they asked for a warrant to hack any computers that accessed a particular server. Only people who logged in and clicked on certain hyperlinks were infected with malware – clicking specific links was the trigger. The FBI collected IP addresses and other data associated with a suspect’s computer. As far as I can tell, anything that would traditionally be deemed personal was not collected. They narrowed the possibility of intrusion.
Likewise in the Apple situation. Instead of asking for an actual backdoor, the FBI is asking for nothing more than the ability to do what they’ve always done. It’s less of an uphill battle this way, and if they win, they’ll have a foothold from which to mount other attacks.
It could also turn out to be an empty victory in the long run. The existence of the internet ensures that whatever measures law enforcement may enact today, countermeasures will spring up tomorrow.
Say Apple loses in court and the FBI gets what it wishes. An easy way to counter a weakened iPhone is to use a proper password on it. After all, the encryption itself is not compromised.
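The arithmetic behind that advice is stark. A back-of-the-envelope comparison, assuming a made-up guess rate of a million tries per second (real rates depend on hardware and the key-derivation function in use):

```python
# Hypothetical offline guess rate -- an assumption for illustration only.
GUESSES_PER_SECOND = 1_000_000

def seconds_to_exhaust(keyspace):
    """Time to try every candidate at the assumed guess rate."""
    return keyspace / GUESSES_PER_SECOND

pin_space = 10 ** 4        # four-digit PIN: 10,000 candidates
password_space = 62 ** 10  # 10 characters of upper/lower/digits

print(f"PIN: {seconds_to_exhaust(pin_space)} seconds")
print(f"Password: {seconds_to_exhaust(password_space) / (3600 * 24 * 365):,.0f} years")
```

A four-digit PIN falls in a fraction of a second, while a ten-character mixed password holds out for tens of thousands of years at the same rate – which is why removing the rate limit only matters when the secret is short.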
If the law is changed so that a backdoor to encryption is required in the US, one could get a foreign phone. Or use an app (developed overseas and potentially outside the legal reach of the US) that encrypts the contents of a smartphone. Apple, with its control issues, wouldn’t let that happen? Switch to Android. Or jailbreak your phone. Or use a dumb phone. Or engage in misinformation; plant red herrings in your smartphone.
Meanwhile, ordinary people would see their own personal privacy and data security eroded.
Related Articles and Sites: