Micky Tripathi, CEO of the Massachusetts eHealth Collaborative, recounts the eight lessons he learned when his company suffered a data breach: a laptop that was not protected with the likes of AlertBoot hard drive encryption was stolen.
First Hand Account – An Excellent and Insightful Read
Tripathi submitted a first-hand account of his thoughts and actions to histalkpractice.com. He starts off by noting that most might find the “details fascinating… because you realize through hard experience that protecting privacy and security is about incredible attention to the small stuff.”
In keeping with that statement, he has penned a very long (but extremely worthwhile) article with lots of details. If you're into CliffsNotes, govinforsecurity.com has a summary, though I'd still suggest reading the original article over it.
My own concise summary (just the facts, ma'am): a laptop computer was stolen from an employee's car while the employee was having lunch. The breach affected approximately 14,314 patients (of whom approximately 1,000 had to be notified under the "significant risk of harm" clause, which is still in effect under the HITECH Interim Final Rule) and cost nearly $300,000 to resolve. Security software had been implemented; encryption, however, was not part of it.
As an "implementation services company," they normally wouldn't have patient data on their machines, except that they also have to deal with what Tripathi termed "kick-outs": patient information that was rejected by a system. The company, as a consultancy, helps clients figure out why the data is getting kicked out, which means patient data is transferred to their machines. The rest, as they say, is history.
Unsurprisingly, his #2 lesson learned is "assume that your portable devices contain sensitive information." That assumption is more often correct than wrong.
As Tripathi himself noted, the company wouldn't have had to deal with the situation had the computer been encrypted. Granted, with the security software already in place, the odds of some random thief actually accessing the data were marginal at best. Regardless:
And yet … the files were no longer in our control and, without encryption, were indisputably vulnerable. I’d heard the term “my knees weakened” before, but had never experienced it myself … up until that moment, that is.
Without encryption, data is indisputably vulnerable. That's why most state, federal, and international laws that offer safe harbor grant it only when encryption is used. Exceptions are rarely made for other data security solutions, and when they are, they tend to be dropped later in favor of encryption.
You know what’s really frustrating to me? This:
The bad news kept on coming. In April 2010, we had instituted a company-wide policy requiring encryption of any files containing patient information. If the laptop or the files had been appropriately encrypted, this theft would not have been a breach issue. Turns out that we had been shopping around for whole disk encryption options to reinforce our security policy, but regrettably we hadn’t yet implemented a solution at the time of this incident.
Cases like these, where a data breach occurs while you're still weighing options, are not unusual. But still reviewing options nearly two years later? (The breach occurred in December 2011.) Well, that's a bit unusual.
Tripathi sounds like a very smart, conscientious guy, so what gives? My guess is that he failed the way most people fail when it comes to such issues: out of sight, out of mind. He himself notes that he doesn't deal with "practice-level data" (read: protected health information), so my guess is that he simply assumed encryption was already in place for any employees who did deal with practice-level data on a day-to-day basis. After all, the company began looking into encryption around April 2010. Why would anyone assume it still wasn't being used nearly two years later?