Here’s Why All Data Should Be Encrypted – Regardless Of What The Official Policies Are.

I’ve been revisiting some security breach cases from earlier this year, and one of the more confounding ones is the data breach that happened at the VA in Birmingham, Alabama.  A hard drive, with some of the files encrypted but most of them not, was stolen from the premises.  The VA Office of Inspector General released a report in June, so I’ve been reading up on its findings and conclusions.


This particular case stands out for a number of reasons. 


First, the hard drive was stolen from inside the VA’s premises.  The drive was found to be missing, and the subsequent investigation showed no sign of forced entry into the premises or into the safe where the drive was supposed to be secured.  It almost sounds like an inside job.  The FBI and other investigators definitely tried to assess the possibility: for an eye-popping account of the lengths they went to, take a look at page 22 of the report (link: ).  After reading it, I think most people would curtail their diatribes about the government not taking such cases seriously.  As to whether it was an inside job, who knows?  To this day no one’s been arrested.


Second, the employee who was directly responsible for the data on the hard disk lied to investigators about the contents of the computer and deleted files on his computer in an effort to cover up the extent of the damage and, I suppose, the fact that he was lying.  The stolen drive was being used as a backup to the computer he was using.  I guess he didn’t realize the government would come out in full force, including forensics experts from the FBI.  They confronted the IT guy with evidence, and he capitulated and confessed to deleting and encrypting files in a moment of panic.


Third, subsequent investigations showed that the employee shouldn’t have had access to the files to begin with.  The employee was an IT specialist, although I can’t find any details on what his actual job was.  Either way, it’s no wonder the guy lied: the extent of the damage could potentially affect over 250,000 veterans as well as 1.3 million medical providers.  And it seems the IT specialist knew it.  He deleted the incriminating files the same afternoon he reported the drive missing.


Of course, the fact that the drive is missing is not as big an issue for those affected as the fact that the data is “somewhere out there” (the government is probably focused on how the drive got stolen, though).  There had been other VA security breaches before, so why wasn’t the hard drive encrypted?  It wasn’t for lack of policy: the government had already issued directives to encrypt portable devices containing sensitive data (VISN 7 Automated Information System Operational Security Policy Memorandum 10N7-115, August 7, 2006).


Apparently, the same IT guy who’s in trouble told the person in charge that the VA had not approved encryption software for external drives (it’s a long story, but it sounds like the regulations and directives in place were nebulous at the time).  And his supervisor decided it was okay not to encrypt the drive because it was not to be taken out of the premises, as per the regulations in place.  (We all know that regulations are followed without exceptions, right?)  How could they not consider theft an issue when the only barriers to the office were the front door and the office door?  Other mistakes were made as well: a slew of additional regulations were broken, including giving the IT specialist access to patient data he shouldn’t have had.


Could AlertBoot have helped prevent this scenario?  Yes.


There are a number of ways, but to begin with, port control might have been useful, and might have prevented this entire fiasco.  Port control allows an administrator to specify which devices can connect to computers; whether they have read and/or write ability; and whether a person has the authorization to connect the device, including USB-connected ones.  Since the missing external hard drive was purchased independently by the department, it wouldn’t have worked until higher-level administrators were contacted to authorize it on the government-issued computers.  And at that point, if the administrators knew what they were doing, they could have devised other solutions.  Based on the report, it sounds like the administrators would have picked up on the security issue.
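To make the idea concrete, here’s a minimal sketch of the policy logic behind port control.  This is purely illustrative, not AlertBoot’s actual implementation; the device IDs and policy table are hypothetical.

```python
# Illustrative sketch of port control: an administrator-maintained whitelist
# that decides whether a connected device may be used, and in what mode.
# Device IDs and the policy table below are hypothetical.

# Approved (vendor_id, product_id) pairs mapped to their permitted access modes.
APPROVED_DEVICES = {
    (0x0781, 0x5567): {"read"},           # read-only access for this model
    (0x04E8, 0x61F5): {"read", "write"},  # full access for an approved drive
}

def check_device(vendor_id, product_id, requested_mode):
    """Return True if the connected device may be used in the requested mode."""
    allowed = APPROVED_DEVICES.get((vendor_id, product_id))
    if allowed is None:
        return False  # unknown device: block it (and, ideally, alert an admin)
    return requested_mode in allowed

# An independently purchased drive wouldn't appear in the policy table,
# so it would simply be refused:
print(check_device(0x1234, 0xABCD, "write"))  # → False
```

Under a scheme like this, the department’s off-the-books drive would have been dead on arrival until someone with authority added it to the policy, which is exactly the checkpoint the VA was missing.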


Aside from the above, device encryption would’ve come in handy, since the entire hard drive would have been encrypted and secured with a password; this would have made the drive compliant with policies calling for full drive encryption for portable devices.  Also, the data would have been unreadable by the perp or perps who removed the drive from the premises.  Plus, there are ways to disable the password and lock people out of the device in the event that the drive is not recovered (and was taken by someone who knew the password).
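The core of the password protection is key derivation: the decryption key is stretched out of the password, so whoever has the drive but not the password can’t reconstruct the key.  Here’s a toy sketch of that one step using Python’s standard library (real full-disk encryption pairs this with a vetted cipher such as AES; the password and salt below are made up for the demo).

```python
# Toy illustration of the password-protection idea behind full-drive
# encryption.  Only the key-derivation step is shown; real products feed
# the derived key into a vetted cipher (e.g. AES).
import hashlib

def derive_key(password: str, salt: bytes) -> bytes:
    # PBKDF2 deliberately makes each guess expensive, so brute-forcing
    # the password from the stolen drive is slow.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)

salt = b"fixed-salt-for-demo"  # real systems store a random per-drive salt
key_right = derive_key("correct horse battery", salt)
key_wrong = derive_key("guess", salt)

# Without the right password, the thief cannot reproduce the key that
# unlocks the drive's contents:
print(key_right != key_wrong)  # → True
```

The salt and iteration count live on the drive in the clear; only the password stays in the user’s head, which is why a stolen-but-encrypted drive is a nuisance rather than a catastrophe.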


Content encryption, the encryption of individual files as opposed to the encryption of the entire device itself, would also have been necessary to make sure that the personnel involved in this case were not in violation of the HIPAA privacy rules.  Also, it works as a risk mitigator, since there’s always a chance that the files might be sent to someone without authorization to view them, or copied to a non-secure device.


On a remotely related note, this incident also shows the lengths employees might go to during a security breach.  That’s probably why most security experts recommend that IT administrators not be in charge of auditing compliance: there’s always the temptation to hide the evidence if something goes wrong.  A separate body should be in charge of reviewing audits and should work with IT administrators to develop a course of action based on the audits.
