Emory Healthcare announced that the theft of 10 computer disks has affected approximately 315,000 people who received treatment at Emory University Hospital, Emory University Hospital Midtown (formerly Crawford Long Hospital), and the Emory Clinic Ambulatory Surgery Center. It wasn’t disclosed whether the information was protected with data encryption software like AlertBoot.
However, Emory is almost certainly a HIPAA covered entity, and HIPAA regulations require the public announcement of PHI breaches involving 500 or more people when the data is not secured with encryption software. Since Emory made the announcement, it appears that encryption was not used.
17 Years’ Worth of Data, Including SSNs for 228,000 People
If you have computer disks that store sensitive patient PHI collected over a period of nearly 17 years, perhaps some due diligence is in order regarding the use of medical info encryption software. I mean, HIPAA does provide a carrot for using encryption to protect PHI: under the breach notification rules, encrypted data that goes missing doesn't have to be publicly reported.
According to various sources:
10 computer disks that contained backup data went missing from storage at Emory Hospital between February 7 and February 20, 2012
Approximately 315,000 people are affected by this latest PHI data breach
Information on the patients was collected between September 1990 and April 2007
Information included patients’ names, dates of surgery, diagnoses, surgical procedures, device implant information, and surgeons’ names
Approximately 228,000 patients also had their Social Security numbers breached
Updated: The CEO admitted that “the discs were not stored according to protocol.” (ajc.com)
Also, several sources are reporting that the disks were backups. It sounds like Emory is pretty confident that the breach will be contained because “the data and discs are not readable on a computer and are only compatible with a system the group no longer uses” (patch.com). The system was deactivated in 2007 (11alive.com).
At least one site (ajc.com) is reporting that “discs had been stored in an office” although most are reporting that the data was taken from a “storage location.” The latter sounds a little more secure than the former, eh?
Questions, Questions, Questions
I’ve taken a look at the HIPAA Wall of Shame, and it looks like Emory Healthcare will rank #15 among all data breaches reported to the OCR since 2009, making the incident not only a major data breach but a historic one as well.
Which leaves one wondering “what were they thinking?” In fact, the story puzzles me in many ways:
Were these disks or discs? The former generally refers to magnetic media (floppy disks, internal hard disks, external hard disks, etc.) whereas the latter is used for optical media like CDs and DVDs. Emory’s own press release uses “discs,” so I’ll assume that’s what they were, but I’ve seen it reported otherwise. (Personally, I can’t advise anyone to use hard disks as a long-term backup solution, but I’ve seen it happen, so I’m left wondering.)
Why weren’t they using encryption on the backed up data?
How secure was this office / storage location? This ties in with the previous bullet on encryption. While HIPAA does not require the use of encryption, it does require that the data be reasonably secured. It wasn’t. Why?
Why is the information not readable?
Regarding the last question, I get the feeling that the information may have been compressed — as backup data usually is — before being burned to a CD. The easiest way to protect this data would have been to encrypt the information after it was compressed but before it was burned to a disc. (Always compress and then encrypt, as opposed to the other way around. Since encryption randomizes data, doing the encryption first means you won’t get efficient data compression later.)
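The compress-then-encrypt ordering is easy to demonstrate. Here's a minimal Python sketch; the XOR "cipher" below is a toy stand-in for real encryption software (use a vetted library in practice), but it produces random-looking output, which is all the demonstration needs:

```python
import os
import zlib

def toy_encrypt(data: bytes) -> bytes:
    """XOR with a random keystream -- a toy stand-in for real
    encryption, used only to show that ciphertext looks random."""
    keystream = os.urandom(len(data))
    return bytes(a ^ b for a, b in zip(data, keystream))

# Hypothetical backup record: highly repetitive, so very compressible.
backup = b"name,surgery_date,diagnosis,procedure,surgeon\n" * 5000

# Right order: compress first, then encrypt the compressed bytes.
right = toy_encrypt(zlib.compress(backup))

# Wrong order: encrypt first -- the random-looking ciphertext
# barely compresses at all (zlib may even expand it slightly).
wrong = zlib.compress(toy_encrypt(backup))

print(len(backup), len(right), len(wrong))
```

Running this shows the compressed-then-encrypted copy shrinking to a tiny fraction of the original, while the encrypted-then-compressed copy ends up roughly the size of the original, or a hair larger.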
A further criticism: they got rid of the system that was used to read the data, but they did not get rid of the data itself? Why?
Update: I found answers to a couple of the questions above:
The information on the discs was not encrypted because it was associated with such an outdated system, Fox said. Encryption would have made viewing the data more difficult. But Fox said the information on the discs could not be easily viewed without the software needed to read it, which was not on the discs. [ajc.com, my emphases]
At the risk of sounding snarky, encryption is meant to make viewing data more difficult. Yeah, I know what the CEO meant. Still, there isn’t any merit to the given explanation. After all, it made viewing the data difficult and not impossible. It’s a system that’s been deactivated: nothing about accessing its data is meant to be easy.
Which brings us to the supposed “deactivated system” that is no longer used:
The investigation has determined that the discs were removed sometime between February 7, 2012, and February 20, 2012. They contained data files from an obsolete software system that was deactivated in 2007. This deactivated system was accessed very infrequently and only as requested by either patients or their physicians. The last time data were accessed was in 2010. [emory.edu]
Arguably, this is not a deactivated system. It most certainly is not “not used anymore.” It’s more like it’s on stand-by. Regardless, it helps explain why the discs were kept around even as the system was disabled and “no longer being used.”
This is possibly the worst risk-reward scenario ever: the reward of a little bit of convenience for one IT guy restoring the data, once in a while (possibly every five years or whatever), against the risk of breaching the information of over 300,000 people (who’ll probably be irate) and possibly HIPAA fines and (definitely) a HIPAA breach investigation.
The HIPAA Breach Angle
The case also leaves me wondering how the BlueCross BlueShield of Tennessee settlement applies here. The BCBS data breach involved over 1 million people and ended in a $1.5 million penalty, which works out to roughly $1.50 per record at most. At about $1 per record breached, Emory could be facing a penalty in the neighborhood of $315,000.
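As a back-of-the-envelope check (my own assumption of a linear per-record rate, which is not anything the HIPAA rules actually prescribe), the arithmetic works out like this:

```python
# Rough per-record benchmark from the BCBS of Tennessee settlement.
# Assumes penalties scale linearly with records breached -- a guess,
# not a HIPAA formula.
bcbs_penalty = 1_500_000          # dollars
bcbs_records = 1_000_000          # "over 1 million"; rounded down

per_record = bcbs_penalty / bcbs_records   # at most $1.50 per record
emory_records = 315_000

low = emory_records * 1.00        # the conservative $1/record figure
high = emory_records * per_record

print(f"${low:,.0f} to ${high:,.0f}")  # prints "$315,000 to $472,500"
```

Either way, the $1-per-record figure is a floor, not a prediction; OCR penalties also weigh factors like willful neglect and cooperation.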
The fact that the information was not easily readable wouldn’t be a factor, I think. For starters, neither was BCBS’s data, since it was mostly in audio format.
Plus, so what if the data is not easily accessible? Someone took CDs full of data from a storage location that, arguably, was used to store PHI. It stands to reason that the motive behind the act was the patient data.
Related Articles and Sites: