Facebook Alerts Users to Possible Data Breach, Unwittingly Interferes With Investigation.

According to morningstar.com, Facebook inadvertently interfered with a terrorist investigation when it recently alerted users that their smartphones had been “hacked by an ‘advanced cyber actor’” via WhatsApp. As it turns out, the hackers were not criminals but part of a governmental “elite surveillance team” that used the spyware to track suspects. At least one terrorist stopped using his phone immediately; whether any other users were also being tracked is unknown.

Kept in the Dark, Obligation to Notify

Of course, no one really blames (or, at least, should blame) Facebook for what happened: the company was unaware that an investigation was being conducted. Indeed, you could almost argue that Facebook should be commended for (a) identifying a bug, (b) finding out that it was being exploited by unknown parties, and (c) speedily alerting those who were affected. After all, isn’t this essentially what a data breach notification law demands? Isn’t this what’s expected of the companies that hold our data?

The three steps above are not just about legal compliance; they’re good business practice, too. Do this over and over again, and you have a product or service that keeps people’s privacy and safety as secure as they can be. Other companies in a similar situation would do well to do the same.

Collateral damage is expected, of course, if law enforcement decides to surreptitiously piggyback on vulnerabilities. A simple way to ensure continued access would be to reveal a vulnerability to the affected companies, explain how it’s being used, and ask that it be left open. In a sense, that would be a backdoor, albeit an accidental one. Simple as it sounds, governments cannot realistically do this.

Among other reasons, some of the biggest companies have vowed, and made it policy, to patch any vulnerability they become aware of. Furthermore, even if tech companies weren’t so resolute about fixing problems that affect privacy, the government must be wary: the more people who know about something, the higher the probability of a leak. Hence, it really is in the government’s best interest to keep any vulnerabilities it uncovers hidden.

Which is ironic when you consider how often you hear pleas for a known (but secured) backdoor or equivalent (a.k.a. the “not a backdoor” backdoor). A legally mandated backdoor, whatever you call it or however you dress it up, is a known vulnerability made public. If people can find a vulnerability that nobody knows is there, imagine how much more impetus there would be to find one that everyone knows exists.
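
To make that concrete, here is a minimal sketch of the kind of key escrow a backdoor mandate implies, written in Python with the open-source cryptography package. The names (escrow_key, send) and the scheme itself are our illustration, not any real proposal; the point is simply that every message becomes readable to whoever holds one master secret.

```python
# Minimal key-escrow sketch (illustrative names, not a real proposal).
# Each message gets a fresh session key; a copy of that key is "wrapped"
# under a single escrow key, creating a second way into every message.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()        # the mandated "golden key"
escrow = Fernet(escrow_key)

def send(plaintext: bytes):
    session_key = Fernet.generate_key()   # fresh key per message
    ciphertext = Fernet(session_key).encrypt(plaintext)
    wrapped_key = escrow.encrypt(session_key)  # copy for "lawful access"
    return ciphertext, wrapped_key

ciphertext, wrapped_key = send(b"meet at noon")

# Whoever obtains escrow_key -- by warrant, leak, or theft -- recovers
# the session key and, with it, the message. The system's security now
# rests on one secret that everyone knows exists.
stolen_session_key = escrow.decrypt(wrapped_key)
print(Fernet(stolen_session_key).decrypt(ciphertext))  # b'meet at noon'
```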

FBI vs. Apple II

In a separate story, the FBI has again asked Apple for access to an encrypted iPhone, according to the New York Times. Observers are noting the parallels to 2016, when the two duked it out all the way to court. The matter remains unresolved because the FBI dropped the lawsuit at the last minute, saying it had found another way around the encryption on the suspect’s iPhone.

However, subsequent iPhone models were released with strengthened security, and the FBI is again unable to access a smartphone. The only way in, it is argued (again), is for Apple to do something about it.

The NYT story also notes that, unlike in 2016, Apple could be facing greater pressure from an administration that is, ahem, a little more willing to play dirty. Will Apple cave?

A Backdoor Is Still a Bad Idea

The argument has been made again and again, even before 2016: a secure backdoor is a fantasy. It cannot exist, mathematically. Hence, creating one is a bad idea, especially when you consider the millions upon millions of people who are protected by encryption. But let’s set aside, for a minute, this argument that has been beaten to death (though it will take more beatings still in the foreseeable future).

Let’s take a look at the FBI’s argument that a backdoor is necessary in the face of the “going dark” problem: that law enforcement cannot shine a light on criminals because of encryption. (It should be noted that certain three-letter government agencies, arguably with more experience when it comes to encryption, have argued against installing backdoors.)

The FBI claims, sometimes quite pointedly, that Apple and other tech companies are enabling criminals: terrorists, pedophiles, drug dealers, and the like.

And this is true. Because tech companies refuse to build backdoors and work to give devices and apps the strongest security possible, the FBI and other intelligence agencies are stymied in their intelligence-gathering efforts. And there’s no way to prevent criminals from buying or using secure devices. By design or not (one would argue not, given the sheer number of innocent, law-abiding people who use these products legally), criminals come out on top. They are enabled.

But, if these same criminals were to know that a backdoor exists, what would happen? Would life actually become easier for law enforcement? Just look at what happened when Facebook sent out a warning about an app being compromised: the terrorist stopped using it. This, despite the fact that it was compromised by an “advanced cyber actor,” a term that does not evoke the FBI, CIA, NSA, or any “Western” intelligence unit. (Indeed, could the terrorist be blamed if, after reading the warning, he thought to himself: advanced cyber actor? Is that a reference to one of us?)

It’s only natural that criminals would not use a compromised communications medium. Consequently, with backdoors, the FBI and others would still have to deal with the “going dark” problem, but millions upon millions would have their security, if not compromised, at least undermined – this, in a digital environment that’s becoming ever more dangerous.

Furthermore, it could be argued that, with a backdoor in place, Apple and other tech companies would still be enabling criminals: what are the chances that hackers won’t find a way in via the backdoor? (Granted, it would be a different breed of criminal from the ones listed earlier, but how does switching from one class of criminal to another really help society?)

The overall result: the FBI still has its problem, but everyone’s security suffers, with criminals potentially being the least affected. For example, criminals could opt to build their own encrypted communications app, or even their own communications network or phone company. However, certain branches of law enforcement imply, or seem to believe, that with a backdoor, crime would take a bigger, faster hit than law-abiding people would. Where’s the evidence supporting this, though?

Perhaps it’s encrypted and they have problems accessing it.
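
And the roll-your-own-app scenario is not far-fetched. As a rough sketch, assuming Python and its open-source cryptography package (the alice, bob, and channel_key names are purely illustrative), a serviceable end-to-end encrypted channel fits in about twenty lines; the primitives are free, public, and impossible to un-publish:

```python
# DIY end-to-end encrypted channel, sketched with open-source primitives:
# X25519 key agreement, HKDF key derivation, ChaCha20-Poly1305 encryption.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

def channel_key(my_private, their_public) -> bytes:
    # X25519 key agreement, then HKDF down to a 256-bit AEAD key
    shared = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"diy-chat-v1").derive(shared)

key = channel_key(alice, bob.public_key())
nonce = os.urandom(12)                  # fresh 96-bit nonce per message
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"gone dark", None)

# Bob independently derives the same key and reads the message
assert channel_key(bob, alice.public_key()) == key
print(ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None))
```

A real messenger would add identity verification and forward secrecy, but nothing in a backdoor mandate on mainstream vendors stops anyone from writing this.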

Related Articles and Sites:
https://www.morningstar.com/news/dow-jones/202001026663/police-tracked-a-terror-suspect-until-his-phone-went-dark-after-a-facebook-warning
https://www.nytimes.com/2020/01/07/technology/apple-fbi-iphone-encryption.html


