Attorney General William Barr – the same one who concluded that Special Counsel Robert Mueller’s report on collusion and obstruction of justice “exonerates” the sitting president (in quotation marks because there’s a wee bit of controversy over whether the report actually does so; plus, Barr never literally uttered the word) – gave a speech on the dangers posed by encryption at the International Conference on Cyber Security.
Like many others before him, he touched on the subject of criminals “going dark” and the need for some kind of legal access to encrypted data. He didn’t come out and say it, but he was implying that a backdoor was necessary.
A Long Tradition (Broken?)
Barr is not the only one who has raised the issue of encryption being so effective, so strong, that it impedes investigations. The same criticism has been voiced year after year by other attorneys general, at the federal and state levels, as well as by the FBI, other law enforcement officials, and government representatives. It’s been something of an annual tradition since the early 1990s, supposedly.
However, Barr appears to have broken new ground. Per lawfareblog.com:
Barr’s words are remarkable. As far as I can tell, this is the first time that the U.S. Department of Justice has acknowledged that the U.S. government is willing to ask the public to accept a lower level of cybersecurity and a higher degree of risk as the price for exceptional access.
What was it that Barr said to cause such a reaction? Basically, that having backdoors in encryption is an acceptable risk, especially in certain cases:
The risk [of backdoors], [Barr] said, was acceptable because “we are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications,” and “not talking about protecting the nation’s nuclear launch codes.” [techcrunch.com]
In other words, the argument appears to be something akin to “why would you need unbreakable encryption to secure lolcat memes, the inane e-mailings of your grandmother, and other forms of banality? Unbreakable encryption is for nukes and stuff.” It may be unfair to Barr, but that’s essentially his argument.
It should be noted that Barr’s argument for backdoors excludes the usual scenarios where strong encryption is unequivocally necessary for consumers, even when nuclear codes are not involved, and thus sidesteps the usual criticisms that backdoors significantly increase the risk to society at large. So, your iPhone would have a backdoor (because in the grand scheme of things, you’re a nobody vs., say, an Attorney General) but your pacemaker (which can be hacked) wouldn’t, despite your status as a plebeian. Your laptop’s encryption would have a backdoor, but the encryption securing your banking app or your e-commerce site wouldn’t.
For those instances where encryption has been purposefully hamstrung, people’s privacy would be protected by the law, the same thing that has protected US citizens since the late 1780s or thereabouts. If it’s good enough to secure your body, house, and physical belongings, it should be good enough in the digital arena as well.
Of course, many would argue that the law doesn’t quite work as well as advanced encryption does. And, had the US government not found itself crossing boundaries time and time again, perhaps then it would find a more receptive audience to its arguments to “accept a lower level of cybersecurity.”
How disastrous is encryption to law enforcement’s objectives?
Barr gives the example of violent drug cartels using WhatsApp to coordinate narcotics smuggling as well as to plan and execute assassinations. The cartels used WhatsApp because it features end-to-end encryption: without a backdoor, it’s nearly impossible for outside parties (like the government) to get even a hint of what’s going on. The authorities’ only hope would be to exploit an undocumented flaw, assuming one exists. Obviously, things would be different if a backdoor existed.
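To make the mechanics concrete, here is a toy sketch of end-to-end encryption and of “exceptional access” via an escrowed key. This is a deliberately insecure XOR construction for illustration only – real messengers use vetted protocols (e.g., the Signal protocol) – and every key, name, and message below is made up:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic keystream from a key (toy construction, NOT secure)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """Encrypt or decrypt by XOR-ing with the keystream (XOR is its own inverse)."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, len(data))))

# --- End-to-end encryption: only the two endpoints hold the session key ---
session_key = b"shared-only-by-the-two-endpoints"   # hypothetical key
ciphertext = xor_crypt(session_key, b"meet at the border at dawn")

# The relay server (or any eavesdropper) sees only ciphertext;
# without session_key there is nothing useful to read.
assert xor_crypt(session_key, ciphertext) == b"meet at the border at dawn"

# --- "Exceptional access": the session key is also wrapped under an escrow key ---
escrow_key = b"held-by-a-third-party"               # hypothetical escrowed key
wrapped_key = xor_crypt(escrow_key, session_key)    # shipped alongside the message

# Whoever holds escrow_key -- lawfully, via leak, or via theft -- reads everything:
recovered_key = xor_crypt(escrow_key, wrapped_key)
assert xor_crypt(recovered_key, ciphertext) == b"meet at the border at dawn"
```

The second half is the crux of the debate: a single escrowed key decrypts every user’s traffic, so anyone who obtains it – not just law enforcement – gets the same access.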
Or so you’d think. The biggest flaw in this logic is that it assumes criminals will use products with backdoors. But why would they? One supposes that the cartels chose WhatsApp for its secure communications – its end-to-end encryption – and not, say, its emoji offerings.
Let us assume that WhatsApp kowtows to the government’s wishes and implements a backdoor in its app. Furthermore, assume that this change is kept hidden from the public. The cartels and others would continue using WhatsApp, and for a time, the US government would finally get what it wants. Crimes would be prevented. People would be arrested. Justice would be served. But for how long?
Your guess is as good as any, but one supposes that it’d be ten years, more or less*. The fact that a secret backdoor exists is going to get out, one way or another. It could be in the patterns that criminals notice (all of a sudden, the cops are very good at thwarting their plans), or in the implications that surface in the courtroom as evidence is presented, even if parallel construction is used. Maybe a government whistleblower. Or perhaps someone in the know who’s been bought off and feeds information to the bad guys.
The only logical final outcome would be for criminals to create their own encrypted communications apps. They have the money. They may not necessarily have the skills, but they can buy them. And, as the world of ransomware, cryptocurrency exchange hacks, illegal personal data resale, and other forms of digital crime shows, there are plenty of people who can and will supply those skills.
So, while there could be limited benefits to an encryption backdoor in the short term, in the long run consumers will have compromised encryption and criminals will not**, and the Department of Justice will be back to square one – except that it won’t have legitimate tech companies to kick around and point fingers at. Certainly, this is speculation, but it’s closer to certainty than not.
Cartels have already been busted for kidnapping engineers to build private telecommunications networks and for building submarines to evade the Coast Guard. You think they’d balk at creating an app with end-to-end encryption – assuming a dozen such apps aren’t out in the wild already?
And at that point, what will law enforcement do to pursue their investigations? Probably whatever it is they did to figure out that WhatsApp was being used by the cartels to plan drug smuggling and assassinations… which, supposedly, they could not have known, because encryption lets criminals “go dark” and so law enforcement couldn’t possibly know about it.
Interesting, isn’t it, how they know of it and yet say that encryption prevents them from knowing about it?
1. * For example, Stingrays – devices that impersonate cell towers so that nearby phones route their traffic through them – were supposedly used for about 15 years, starting in the early 1990s, before civil rights organizations became wise to them. The PRISM surveillance program’s secrecy lasted about 6 years. Perhaps the maturity of the internet has accelerated forced transparency?
2. ** Of course, this does not mean that all criminals will be using strong encryption. There are plenty of crimes that are spur-of-the-moment or badly planned, and law enforcement will benefit from encryption backdoors there. However, such crimes generally pale in comparison to terrorist activities, the global trade in illegal narcotics, human trafficking, etc., where the depravity of the crime reflects the depth of one’s purse, and thus an uncompromised encrypted communications medium is affordable to the criminals involved. Asking Americans to “accept a lower level of cybersecurity and a higher degree of risk” to solve a case of road rage gone tragic is bound to be a tough sell, which is why the stories always involve something worthy of a headline… even if, in the long run, the backdoor will feature more prominently in crimes that are humdrum, relatively speaking.