Rethinking Encryption: Former Anti-Strong-Encryption FBI Guy Changes His Mind.

A couple of weeks ago, Jim Baker published a long article on why the US needs strong encryption. One could call this position surprising, seeing how Baker, as the FBI's general counsel, was part of the Bureau's 2016 effort to hamstring Apple's iPhone encryption in the wake of the San Bernardino terrorist attack.

Baker does a good job of showing why he’s decided that legally requiring encryption backdoors, or backdoor-like vulnerabilities, is a bad idea. He also, perhaps unwittingly, reveals why the FBI continues to argue for weakening encryption despite the many reasons for not doing so.

#1 There Must Be a Technical Solution

Baker notes:

[Current Attorney General William Barr] rejects the notion that a technical solution to the problem does not exist and, instead, proposes what strikes me as a risk-based model where society accepts increased cybersecurity risks resulting from enhanced government access to communications in exchange for decreased risks to society from other types of threat actors.

For a very long time, AGs have kept repeating a mantra to the effect that “surely there must be some kind of technical solution that allows the installation of encryption backdoors that will allow law enforcement to go after criminals but won’t have a negative impact on society at large.”

Such a solution does not exist. It'd be like arguing "surely there is a knife out there that won't harm innocent people but will easily cut through the flesh of animals." Who but an idiot would believe this? And yet, when it comes to encryption and backdoors, plenty of people seem to believe it, even though the parallel is exact.

Maybe it’s because encryption is amorphous as compared to physical objects. Or perhaps it’s because encryption is basically advanced math, so it takes a little thinking to understand what it’s doing and what it can (and cannot) do. Regardless, when it comes to encryption, a lot of people don’t seem to “get it.”
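To make the point concrete, here is a toy sketch of what "exceptional access" boils down to. It is purely illustrative, not anything Barr, Baker, or the FBI has actually specified; the library choice (the third-party Python "cryptography" package) and the key names are assumptions made just for this demo. The point it shows: once a second, escrowed key can read a message, the math has no way of knowing who is holding that key.

# Toy key-escrow sketch, illustrative only.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()     # the key the user controls
escrow_key = Fernet.generate_key()   # the mandated "exceptional access" key

message = b"meet at the usual place"

# A backdoored design typically means every message is readable under a
# second key: one copy for the user, one copy under the escrow key held
# by some third party.
for_user = Fernet(user_key).encrypt(message)
for_escrow = Fernet(escrow_key).encrypt(message)

# The user can read their copy...
assert Fernet(user_key).decrypt(for_user) == message

# ...and so can ANYONE who holds the escrow key: an agent with a warrant,
# an insider who leaks it, or a foreign service that steals it. The math
# cannot tell the difference.
assert Fernet(escrow_key).decrypt(for_escrow) == message

However the scheme is dressed up, that escrow key becomes a single point of failure, and protecting it at the scale of every phone and messaging app in the country is precisely the part nobody has shown how to do.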

And so, if you truly believe in your heart of hearts that there must be some kind of fix that the experts are keeping hidden, then of course you're going to keep holding out for it. Heck, you may even try to force the issue of backdoors so the experts have no choice but to bring out the goods: the fantastic encryption that's only weakened for the wrong guys.

Whatever that means.

#2 It’s an Acceptable Risk

Then again, perhaps the FBI doesn't really believe that there is a technical fix. The "risk-based model" that Baker mentions would imply that Barr and others at the Justice Department and the Bureau know they need to shift how people think about the issue, because a technical fix is not coming. Why make the risk argument if a technical fix exists somewhere?

Which brings up the issue: is weakened cybersecurity for all an acceptable trade-off for getting the bad guys? (Keep in mind that there's no guarantee the bad guys will be caught.)

People engage in trade-offs all the time. Car crashes are one of the leading causes of death in the US, and yet almost nobody argues for banning motorized four-wheeled vehicles. That's because most people have accepted a trade-off: a limited number of deaths nationwide in exchange for hundreds of millions of people being able to go where they need or want to be. We juggle risks and trade-offs every day, even dangerous ones.

Baker, for one, notes that weakened cybersecurity is not an acceptable price for what the FBI is pursuing. Among other things, he points to China's ascendancy both as a global power and as a manufacturer of networking equipment, as well as to past and ongoing espionage attempts that would only be made easier by China's role as the OEM of so much of the world's computing and communications hardware.

Spying is always an issue (even among allies, China notwithstanding). This is probably why the NSA and the CIA publicly called weakened encryption a bad idea during the Apple vs. FBI circus in 2016. About the only body saying it’s an acceptable risk seems to be the FBI.

When you're the lone holdout, that ought to tell you something about yourself.

Criminals Have a Choice, Too

The biggest problem with the FBI's stance could be that, even if the Bureau manages to get backdoors (or similar weaknesses) built into legitimate encryption, there's nothing forcing criminals to use that legitimate yet crippled encryption. Why would they?

This ought to pose a conundrum for the FBI. The Bureau argues that "criminals going dark" – its poetic way of saying that criminals are using encryption that cannot be broken – is a growing obstacle to resolving cases, including some involving the most heinous crimes society is not willing, and should not be willing, to accept: terrorism, child pornography, and organized crime, among others. If data protection software and hardware is preventing the resolution of such cases, the argument goes, then strong encryption cannot and should not be tolerated.

But what we've seen in the news time and time again is that participants in such criminal activities take great pains to protect themselves. If the US mandates backdoors in encryption, it's pretty obvious that criminals will source uncrippled encryption from elsewhere. In fact, there is already a ready market offering such goods and services. At the end of the day, top-tier criminals don't enjoy getting pinched any more than the next guy, and they know the full force of the US government is coming after them because of the nature of their crimes.

So, the purported raison d'être for weakened encryption – facilitating the apprehension of terrorists, child pornographers, and organized criminals – falls flat on its face while, by the FBI's own admission, "society accepts an increased cybersecurity risk." There are roughly 330 million people in the US. That's a lot of societal risk, especially for a tactic that's bound to be short-lived.

But then again, perhaps the FBI's endgame lies elsewhere. The Bureau also deals with crime that is, shall we say, of the lower-profile variety. Crime that is perhaps not as heinous as terrorism, or perhaps not heinous at all (art theft, financial crimes), but that still requires resolution. Crime that is not necessarily pre-planned or well thought out but ends up protected by strong encryption anyway. Weakened encryption would be a boon in solving such cases.

If that's the case, though, why not say so? Is it because the FBI believes that society will not accept an increased cybersecurity risk as a trade-off for less attention-grabbing crimes? And if it does believe that, should it be attempting what is essentially a bait-and-switch?

Related Articles and Sites:
https://www.lawfareblog.com/rethinking-encryption


