The HIPAA Final Omnibus Rule incorporated a number of changes that look small on paper but have huge ramifications, such as the striking of the “harm threshold” rule and requiring business associates to know and comply with HIPAA rules. In essence, the changes strengthened the existing HIPAA legislation, closing any loopholes and ambiguities that were open for interpretation and further solidifying the focus on patient data security.
The importance of these changes is easily illustrated by stories like the one below.
Milwaukee Files HIPAA Complaint Against Dynacare
According to phiprivacy.net, the City Attorney for Milwaukee has filed a statement with the Office for Civil Rights at the Department of Health and Human Services, bringing public attention to the fact that 9,000 city employees were affected by a HIPAA data breach. The blame has been assigned to Dynacare Laboratories.
Specifically, an unencrypted USB disk with personal data of Milwaukee employees, and their spouses or domestic partners, was lost when a Dynacare employee’s car was stolen.
The thing is, Milwaukee never signed a contract with Dynacare. Rather, Dynacare is a subcontractor to Froedtert Community Health/Workforce Health. In other words, Dynacare is a business associate (and BAs account for a not insignificant number of breaches).
What Can Covered Entities Do to Check on Business Associates?
As HIPAA rules currently stand, and based on some of the conclusions the Office for Civil Rights (OCR) at the Department of Health and Human Services has arrived at, it behooves covered entities (CEs) to be aware of how their business associates (BAs) are approaching ePHI concerns.
For example, it might not be enough to formulate an agreement that forces the BA to use encryption software and protect ePHI on laptops, smartphones, and other electronic devices. CEs should perform due diligence to verify that BAs have stuck to the agreement. But doing so is a tall order.
For who has the extra time and resources to see whether a business partner has really encrypted their laptops? Furthermore, would the BAs send in their laptops to be checked? Or would personnel from the CE’s office go to the BA’s venue to check that laptops are indeed encrypted? What if the associate is not in the US? Remote sessions could work, but we’ve seen exacting companies that want to slave a hard disk to check whether the encryption is really working or not.
Off-topic: interestingly enough, this “slave and check” move is not paranoid at all. I’ve come across one instance (which I assume is a rarity solely based on the craziness of the story) where someone was looking for an FDE solution that wouldn’t prompt the endpoint user for any type of “password” at all, be it an actual password, a token, a biometric scan, what have you. These people were looking to comply with the letter of the law (FDE must be installed) but not with the spirit of it (can’t be bothered to really protect anything because passwords are an inconvenience).
So, what can a CE or a BA do when it comes to checking or proving that disk encryption is deployed and used in the workplace, and thus ensuring safe harbor from the HIPAA Breach Notification Rule should something go awry?
Use a Report from an Independent 3rd Party
Well, you could run a report. Take AlertBoot for instance.
Reporting is central to the cloud-based endpoint security solution because the deployment and installation of FDE on laptops and MDM on smartphones is done remotely. The lack of direct physical oversight (not that it’s really necessary with AlertBoot) means a substitute method is necessary for ensuring that the deployment and installation processes completed correctly.
Hence, a dedicated report engine was designed and incorporated into the AlertBoot process. Fully customizable, the report engine’s output can be tailored to show any information that’s necessary. And because it’s accessed over the internet, a BA can easily show a CE (or the OCR) that they are complying with any contractual or legal responsibilities.
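To make the idea concrete, here is a minimal sketch of how a CE (or a BA preparing evidence for one) might audit a machine-readable encryption report. The CSV columns, values, and device names below are hypothetical illustrations, not AlertBoot’s actual report schema; the point is simply that an exportable report lets you flag unencrypted endpoints without shipping laptops around or slaving hard disks.

```python
import csv
import io

# Hypothetical compliance-report export. The column names and values are
# illustrative assumptions, not the format of any real vendor's report.
REPORT_CSV = """\
device_id,owner,fde_installed,fde_status,last_check_in
LT-0041,j.doe,yes,encrypted,2013-08-01
LT-0042,a.smith,yes,encrypting,2013-07-30
LT-0043,b.jones,no,none,2013-06-15
"""

def noncompliant_devices(report_text):
    """Return rows where full-disk encryption is missing or incomplete."""
    reader = csv.DictReader(io.StringIO(report_text))
    return [
        row for row in reader
        if row["fde_installed"] != "yes" or row["fde_status"] != "encrypted"
    ]

# Any device that is not fully encrypted gets flagged for follow-up.
flagged = noncompliant_devices(REPORT_CSV)
for row in flagged:
    print(f"{row['device_id']} ({row['owner']}): status={row['fde_status']}")
```

A report like this, produced and timestamped by an independent third party rather than by the BA itself, is also far more persuasive to an auditor than a self-attestation.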