A doctor, in violation of policies at Saint Francis Hospital, copied patient files to an unencrypted USB key, which was subsequently lost. While I’d like to blame the doctor for the breach (he is the one who caused it, after all), I have to question the hospital, which had a policy in writing only. It looks like they needed something like AlertBoot’s own disk encryption, and who’s in a better position to deploy it, oversee it, and monitor it?
The hospital, of course.
10-Year-Old Data
One of the striking yet not so unusual aspects of this latest data breach is that the doctor in question didn’t even realize he had a data breach. The doctor only became aware of the issue when someone mailed him the missing USB key (question: how did the sender know his address?).
A total of 474 former patients were affected. The patients “had participated in a prenatal and maternity care program a decade ago,” which raises the question of why such old data went missing only recently. The only logical answer appears to be that the data was being used in some kind of research, which dovetails nicely with the thumb drive containing names and medical information, but not SSNs, addresses, phone numbers, or babies’ names.
Won’t Happen Again, Technological Solution
A hospital spokeswoman said,
that type of breach won’t occur again because the hospital is considering employing a technology that automatically encrypts data files that are transferred from a hospital’s computer systems to a thumb drive. That initiative should be ready within one month.
We will enforce the proper ways to protect data now. [delawareoneline.com]
Huh? What? They’re going to enforce proper ways to protect data now? Now that they’ve had a data breach? I honestly wonder what prompts people to think in this reactionary manner. It’s like promising not to play with guns after you’ve killed someone. Shouldn’t common sense apply?
Like I said at the beginning, the hospital is in a better position to ensure data security. While I’m not going to say that doctors can’t handle technology (there are those who can and those who can’t), data security is really best served when those who are in the know are in charge of implementing it. This is especially true when it comes to encryption software and HIPAA, because compliance comes with conditions attached.
For example, one of the prime drivers behind encryption in the medical arena is not HIPAA, but HITECH Act amendments to it: the offer of safe harbor if PHI is properly encrypted (safe harbor from sending notifications, that is). Mind you, the key word here is “properly.”
Neither HITECH nor HIPAA (nor the HHS, which is charged with implementing the rules) has defined what proper encryption is. Instead, covered entities are pointed to a couple of publications by the National Institute of Standards and Technology (NIST). In a nutshell, for medical disk encryption, it’s safe to say that nothing weaker than AES-128 or equivalent is currently acceptable.
If encryption weaker than that is used, safe harbor does not apply. With such potential minefields in place, it only makes sense for an administrative body to ensure policies are followed, rather than leaving individuals to guarantee compliance on their own.
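The safe-harbor floor described above can be sketched as a toy rule check. This is purely illustrative: the function name, the strength catalog, and the treatment of unknown algorithms are my own assumptions for demonstration, not an official NIST or HHS mapping.

```python
# Illustrative sketch (not official guidance): does a given algorithm
# meet the AES-128-or-equivalent floor discussed above?

# Assumed effective symmetric strengths, in bits, for a few algorithms.
EFFECTIVE_STRENGTH_BITS = {
    "AES-128": 128,
    "AES-256": 256,
    "3DES": 112,   # falls below the AES-128 floor
    "DES": 56,
}

def qualifies_for_safe_harbor(algorithm: str, floor_bits: int = 128) -> bool:
    """Return True if the algorithm meets or exceeds the strength floor.

    Unknown algorithms are treated as non-qualifying: under safe harbor,
    the burden is on the covered entity to show encryption was "proper."
    """
    return EFFECTIVE_STRENGTH_BITS.get(algorithm, 0) >= floor_bits

print(qualifies_for_safe_harbor("AES-256"))  # True
print(qualifies_for_safe_harbor("3DES"))     # False
```

The conservative default (anything unrecognized fails the check) mirrors the point above: when in doubt, assume notification obligations still apply.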