I recently upgraded the OS on my Apple laptop. As I was completing the installation, some questions on the screen asked me if I was connecting to iCloud.
I have been reluctantly storing a lot more of my data in cloud storage systems. I like the cloud because it is convenient; I dislike it because of data security concerns. I'd feel better if I could understand exactly what happens to my data on the way to the cloud, through the cloud, and on the way back. This isn't magic - there are real systems involved in routing and parsing the data.
Companies could do a better job describing what they do with our data. (Ok, Box.net does. If there are others, let me know.)
On the screens during the upgrade, Apple wrote that the data would be encrypted and stored. But what does that actually mean to me? I know my data will pass through servers, storage, and backup systems along the way.
- Where will this data live?
- How will it get from place to place?
- Will it be encrypted?
- Will it ever be unencrypted?
- Is there a way to hack into this?
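The questions above come down to one distinction: *who holds the key?* With server-side encryption, the provider encrypts your data on its servers and can also decrypt it; with client-side (end-to-end) encryption, the data is encrypted before it ever leaves your laptop, so the provider only ever sees ciphertext. The sketch below illustrates the idea. The XOR-keystream construction is a deliberately simplified toy for illustration only - real systems should use a vetted library such as `cryptography` - and the key, nonce, and file contents are made up for the example.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream from the key and a nonce.
    # Toy construction for illustration only -- use a vetted crypto
    # library for real data.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream byte by byte.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # XOR is its own inverse

# Client-side encryption: the file is encrypted *before* upload,
# so the cloud provider only ever stores ciphertext and cannot
# read it without your key.
key, nonce = b"my-secret-key", b"unique-nonce"
ciphertext = encrypt(key, nonce, b"private file contents")
# ...only `ciphertext` travels to the cloud...
assert decrypt(key, nonce, ciphertext) == b"private file contents"
```

If the provider encrypts your data with keys *it* controls, the answer to "will it ever be unencrypted?" is yes - on their servers, whenever they (or anyone who compromises them) decide to decrypt it.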
I mean, a bunch of celebrities had their photos stolen from iCloud. Why couldn't my info be hacked too?
I think we trust most cloud companies simply because they claim they are the "experts." Most of us don't understand the intricacies of data security and protection, and it can be overwhelming. Rather than trying to learn the information, we will often accept what an "expert" tells us because it is easier - mentally and emotionally - although this approach comes with great risk.
From childhood, we are trained to believe "experts" when a subject is too complicated. However, these experts can often oversimplify processes and downplay risks and problems. This happens frequently when corporations try to sell a product. They try to make that product simple and easy to understand, but some key details can be left out of the discussion - details which, if included, could raise some concerns.
If you don't know what you don't know, you don't ask - which means you still don't know. You don't even know enough to learn what the right questions are.
Health insurance is a great example of this.
Health insurance is complicated. In one study, 3 out of 4 people said they felt confident they understood their plan, when in fact only 1 in 5 could accurately calculate the cost of a doctor's visit under that plan. (I'm not kidding!) Transparency could fix the problem, but the industry is full of jargon, generally intimidating, and, frankly, not entirely trustworthy. We take them at their word because, again, you don't know what you don't know, so you can't ask and you can't learn any different.
But we all know what happens when we "trust" anyone with our data, money, or health. It doesn't always work out for the best. (Remember the mortgage crisis, the dot-bomb crisis, etc.)
The devil is in the details; we need to understand the details in order to understand the best way to work with a system.
Data security is probably the most important element of the Internet of Things movement.
(Personally, I hate the term Internet of Things the same way I hate the term "evergreen content." It's like a new generation of Web-Folk stumbled upon an idea that has existed for years, gave it a name, and now it's a new, exciting movement. What is up with that?)
The vision of the Internet of Things has existed for years - remember the Internet fridge from LG in 2001? The only reason it's so hot now is that it has become more technically feasible and accessible with the rise of mobile devices and wireless access. It's a truly wonderful vision, but if you get into the details, it quickly becomes a scary (honestly, apocalyptic) one if data security isn't fully addressed.
Here's a great example I found at Tech Target about how much the details matter in data security with the Internet of Things:
...consider what happened to Affinity Health Plan of New York. In 2010, the CBS Evening News acquired a network-connected photocopier that had been previously leased by the company. The copier was equipped with an internal hard disk and contained protected health information for over 300,000 people. The Department of Health and Human Services fined the company $1.2 million for allowing the data to be exposed.
Although there is probably no need for backing up the contents of a digital copier, this incident serves to illustrate the fact that network-connected devices can sometimes contain substantial amounts of data. For example, some security systems are equipped with hard disks and store everything from video data to employee door-access logs. IT pros must identify the devices that store data internally and then determine if that data needs to be backed up, and how to do so.
--Brien Posey, How the Internet of Things will impact backing up data, TechTarget
That copier should have been identified as a data storage device, and the data should have been protected and then wiped when the machine was decommissioned. Any one of us could be one of those individuals with exposed health data.
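"Wiped" means more than "deleted": deleting a file only removes the directory entry, while the bytes stay on disk until something overwrites them. A minimal sketch of the overwrite-before-disposal step is below; `wipe_file` is a hypothetical helper name, and note that on SSDs and devices with wear-leveling (like that copier's internal drive), full-disk encryption or the vendor's secure-erase function is more reliable than overwriting in place.

```python
import os

def wipe_file(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before deleting it.

    Illustrative sketch only: on magnetic disks this frustrates
    simple recovery; on SSDs, wear-leveling may leave old copies
    behind, so prefer encryption or hardware secure erase.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random data
            f.flush()
            os.fsync(f.fileno())       # force the overwrite to disk
    os.remove(path)
```

Had Affinity's copier vendor done the moral equivalent of this to the copier's hard disk before releasing it, CBS would have bought an empty machine.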
Imagine if some copy clerk was able to see if/when you had chlamydia, cancer, or any other private illness. Sure, he doesn't know who you are on the street, but he has seen your personal information. It's an anonymous violation of your privacy.
In the wrong hands, this isn't just a problem. This is a disaster.
All it takes is a snippet of your private, personal health records to replicate your identity. The research tools to find your complete identity are public and the work to accomplish identity theft is mainly in the research. This is why data security is so important - just a little bit of information goes a long way. It is truly frightening to learn what's easily accessible.
As an example, think about how you may go online to trace who called you using only a phone number. Or how you may Google a guy (or girl) before a date to make sure he is who he says he is. These are harmless identity searches, but identity searches nevertheless.
Now, think about identity searches using the mind of a criminal who wants to use your information to benefit himself.
All he needs is one piece of personal info. The next thing you know, you have maxed out credit cards you never applied for and have a foreclosed house in the Bahamas, even though you can barely pay rent. And no one believes that you didn't do it. Heck, the identity thief even knew about your parrot, Larry, those red heels you bought last week, and the real reason you were on penicillin 5 years ago.
This brings us back to the discussion: why we should care about security (but don't). More tomorrow.