The Gap, the San Francisco-based clothing retailer, has just issued a press release announcing that a laptop containing the information of 800,000 job applicants has been stolen. In all fairness, this is not the fault of The Gap but of one of the vendors it hired to manage the job applicant data.
Just when I thought the week was winding down. (Hmmmm… companies tend to issue bad news before the weekend so that their stock won’t get hit. Maybe I’m just being too cynical? Although, TJX took a heavy hit in the stock market when they released the news that customers’ credit cards were compromised, if I’m not mistaken.)
Speaking of which, since The Gap hasn’t released much information as of yet (including who exactly lost the laptop), a lot of news agencies and bloggers are trying to tie this incident to the TJX news from earlier this week. I guess the points in common are that both companies sell clothes and that there was a lack of encryption.
Except the situations are completely different. Earlier this week (and I have blogged about this) the Canadian government released its findings on the TJX fiasco. Among those findings: TJX was not using the proper wireless encryption standard, WPA, but an easily broken one, WEP. Essentially, someone was reading the data right out of the airwaves. There was also criticism that TJX was collecting personal information above and beyond what was necessary, and that it was keeping that information stored on its servers for an inordinately long period. I don’t recall any criticism regarding a failure to encrypt stored data.
With The Gap, the vendor had its laptop stolen, and it sounds like it was taken from the vendor’s own offices. Per the agreement with The Gap, the laptop was supposed to be encrypted. For an as-yet-unknown reason, it was not.
I’ve already read several commentaries where security professionals (real and wannabe, the latter probably quoting the former) ask the question “what was the data doing on a laptop?” I find this question reactionary and irrelevant: let’s face it, more and more people and companies are adopting laptops as permanent or semi-permanent “desktop machines.” If they’re implying that desktops are somehow physically more secure machines, I can tell you from personal experience that they’re not. In fact, I’ve seen a very security-minded company issue laptops as workplace computers and lock them in strongboxes at the end of the day. Security-wise, I’ll take that over unsecured desktops any day. Plus, take a look at desktop computers nowadays. One of my co-workers has a Dell machine so tiny that a six-year-old could steal it after unhooking the LCD monitor and pulling the plugs on the USB keyboard and mouse. I’ve seen hamburgers bigger than that Dell. My guess is that any pros were quoted out of context. Laptops might imply mobility, but any machine is mobile. It might be harder to steal an IBM mainframe, but is it really impossible? I, for one, remember the case of a pair of drunk guys stealing an ATM in its entirety; it weighed over two tons, if I’m not mistaken.
The correct question is “why was there no encryption?”, which is what the pros tend to ask after asking the first question above. In fact, I believe this question doesn’t get asked enough about laptops, desktops, and any other device that isn’t physically chained to the floor.
The fact that The Gap had an agreement requiring the vendor to encrypt its data implies, at least to me, that The Gap was doing its homework. It must have had an auditing process as well, which is also part of the homework. And seeing how the vendor is still handling data for The Gap (no word on whether ties will be severed), I have to assume the vendor passed those audits. So, why? How?
Time will tell (I wish I could trademark those words). But if my research is any clue, it’s probably because most encryption programs are challenging to implement and enforce enterprise-wide and, more importantly, to maintain effectively on an ongoing basis. The impact on workflow once a security measure is in place can be more than substantial.
The beauty of AlertBoot is that laptop encryption (or any other type of device encryption) is actually a very easy process to implement. Plus, you get a comprehensive report suite to audit the encryption status of all your devices. And if you buy new devices, the process of implementing encryption takes a matter of minutes.
My guess, though, is that content encryption, which protects individual files, would have been the best solution in this case. In AlertBoot, an administrator can specify that a certain file type always be encrypted; as an end user creates new files by copying data from one file to another, these are encrypted without any further effort on the end user’s part. And if those files happen to get passed around the office for work-related purposes (which, let’s face it, will always happen regardless of corporate policy), it won’t matter as much if they end up on an unencrypted laptop: no password means no access to the data, which is the idea behind encryption.
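For the curious, the “always encrypt files of this type” idea can be sketched in a few lines of Python. To be clear, this is my own illustration, not AlertBoot’s actual interface: the extension list, function names, and the use of the Fernet cipher (from the third-party cryptography package) are all stand-ins for whatever a real product does under the hood.

```python
# A minimal sketch of policy-based content encryption: an administrator
# defines which file types must always be encrypted, and any file
# matching the policy gets its contents encrypted before it leaves the
# machine. Hypothetical names throughout; illustrative only.
from pathlib import Path
from cryptography.fernet import Fernet

# Hypothetical admin policy: these file types must always be encrypted.
POLICY_EXTENSIONS = {".xls", ".csv", ".doc"}

def must_encrypt(path: str) -> bool:
    """Return True if the policy covers this file type (case-insensitive)."""
    return Path(path).suffix.lower() in POLICY_EXTENSIONS

def encrypt_if_required(path: str, key: bytes) -> bytes:
    """Encrypt the file's contents when the policy applies; else pass through."""
    data = Path(path).read_bytes()
    if must_encrypt(path):
        return Fernet(key).encrypt(data)
    return data
```

The point of the sketch is the last paragraph’s “no password means no access” idea: a thief who copies the encrypted bytes off a stolen, otherwise-unprotected laptop gets nothing useful without the key.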
The Gap has to alert affected job applicants because it’s the law: even if a third-party vendor screwed up, the data still belongs to The Gap. As is usually the case in such situations, whichever law enforcement agency was contacted about the theft probably asked the companies involved not to announce specifics of the data breach. However, The Gap uses multiple vendors to handle job applicant data, so I guess it can get away with announcing the breach as quickly as possible (winning points in the customer-PR department) without revealing the third party’s identity while the investigation continues.