Is the use of encryption a silver bullet for HIPAA covered entities that are looking to gain safe harbor from the notification policies found under the HITECH Breach Notification Rule? Generally, yes. There is a caveat, however, as Amedisys’s recent breach notification shows: you must be able to prove that the encrypted data remains secure after the data breach. Otherwise, what’s the point of using HIPAA-grade encryption software for laptops?
Inventory Check Raises Issues
What happened at Amedisys? On March 2nd, the hospice care provider revealed that it was unable to account for 142 encrypted computers and laptops, which is unusual for a number of reasons:
- That’s a lot of devices. Was the company not doing regular checks, say, every 12 months or so? That many devices could only go missing unnoticed if audits were pretty rare.
- These devices were encrypted. Although there’s always room for mistakes and paranoia, if the company determined that all of these were encrypted when they went missing, there’s really no reason to notify anyone about it. (It should be noted that Amedisys issued a press release, which means they elected to notify basically everyone with an internet connection.)
- While the ratio of missing desktops to laptops was not given, it’s hard to imagine that it took an inventory check to notice that desktop computers were missing. Even one missing desktop tends to raise alarms in a way missing laptops do not.
The company further stated that the following personal information could have been stored on these devices: “name, address, Social Security number, date of birth, Medicare and insurance ID numbers, medical records and other personally identifiable data.”
Amedisys revealed that a total of 6,909 patients were affected. Where did these laptops and desktops go?
As it turns out, these devices were “assigned to Amedisys clinicians and other team members who left the company between 2011 and 2014.” And that’s a problem in many ways.
The last time I checked, computing hardware still costs a bit of money. That these devices were essentially given away when people left employment means either that Amedisys had poor controls or is (was?) a very generous company. (On second thought, it could also mean the devices were so subpar technologically that management decided giving them away would be cheaper than collecting them.)
Also, the software installed on these unaccounted-for machines can be costly. For example, the cost of AlertBoot’s full disk encryption is on a “per machine” basis, regardless of how many logins are tied to each of those machines. Let’s say that Amedisys was using AlertBoot. Unless the licenses were retrieved from the missing devices, the company would be footing the bill for machines it no longer controlled. Admittedly, we cannot exclude the possibility that the company was using free software like the recently-deceased TrueCrypt, which would make such losses impact-free from a financial perspective.
The biggest problem, though, and the one that touches on the HIPAA encryption caveat I mentioned at the top of this post, is that patient information can be breached despite the use of strong encryption: the clinicians and other team members who kept the devices also have the passwords and can access the data.
Attack from Within
One of the rising problems in medical data breaches centers on employees: while most can be trusted, there is a small faction with a bent towards malfeasance. If we assume that around 2% of employees engage in activities like stealing medical IDs for resale on digital black markets, and that each missing Amedisys device represents one former employee, then about 3 people could have made use of the fact that they conveniently hold the passwords to encrypted data.
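The back-of-the-envelope estimate above can be written out explicitly. To be clear, the 2% malfeasance rate and the one-device-per-person assumption are the post's own hypotheticals, not audited statistics:

```python
# Rough estimate of potential insider misuse of the missing devices.
missing_devices = 142      # devices Amedisys could not account for
malfeasance_rate = 0.02    # assumed share of employees acting in bad faith

# Assuming one former employee per missing device:
likely_bad_actors = round(missing_devices * malfeasance_rate)
print(likely_bad_actors)  # prints 3
```

Even under these charitable assumptions, a handful of people would still hold working passwords to encrypted patient data.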
(One way of preventing such scenarios, assuming that devices cannot be collected, is to trigger a remote wipe of the data, if the encryption solution has such a capability built in the way AlertBoot does.)