I've wanted to get a Fitbit to track my workouts - from gym to belly dancing to walking. And I really want to get one so I can participate at the UnderArmour Record site. But I'm reluctant to buy one. Why? Because I wonder where my data goes when Fitbit gets it.
(The irony - I'm fine putting this data on the UA site, but not giving it to Fitbit. I think it's the social sharing reason more than anything else - there is a larger purpose to sharing the data. It's almost as if sharing negates the risk of exposing personal data.)
Sure, if I were to use a Fitbit, the data would go to their systems, and according to their privacy policy, they would store it and wouldn't use it elsewhere. I'm sure they wouldn't misuse it, but I'm paranoid about hacking - and about whether that data could be used to create a profile of me that could serve as a type of identity.
The questions that race through my head: Can they be hacked? What types of servers will have my data? How many? What will happen to my data on servers that are decommissioned?
Although we live in a world where our data naivete has been shattered by Edward Snowden, Julian Assange/Wikileaks, and Anonymous, it seems as if there is still an apathetic view towards data privacy and security. The NSA has been rummaging through our personal data without proper authorization in the name of security. Identity theft is rampant - check out a hacker news site. We all know at least one person who has had a credit card number stolen from an online store (if it hasn't happened to you already). With all of this going on with our data, one would think that we would be more paranoid about what happens when a company gets it.
It seems that no one cares about privacy and security anymore. Privacy policies are important, but people seem to care as much about them as they do about the terms-and-conditions legalese they click through when downloading software. They have become irrelevant.
Privacy policies were created during a time when companies were collecting names, addresses, phone numbers, and emails as leads. For that purpose, telling you that they were storing this info and not sharing it with third parties was being transparent enough.
Fast forward 20 years. Today, companies collect much more personal, sensitive information from users - credit card and other payment information, health information, pictures (which, depending on the content, can be incriminating), identity information, and more. Some cloud companies now also store other companies' electronic documents, and some of that information may be competitive and confidential.
Most users don't know what really happens to their data once they type it into a browser and submit it to a company. The journey of your data can be fascinating: depending on the infrastructure, it may cross countries, be encrypted, and be replicated. Most users assume they don't need to understand this - that it can stay a mystery. But what if the company you were sharing with didn't follow best practices for transporting and storing data? How would you know you were exposed to serious personal risk?
At the Digital Dallas Summit in December, mobile leaders predicted that if we really want to make a go of these apps that store sensitive, personal data, we need to elevate awareness about security and privacy. I strongly agree with them and wrote about this in the 2015 trends article at UX Magazine.
Box.net has started addressing this problem with an extended version of its privacy policy. They tell you the journey of the data you entrust to them: how it gets processed along the way and how they protect it. It's awesome!
I think what they are exploring is exactly what users need to understand in today's data world. Simply knowing the legal use of data isn't enough anymore - users need greater protections and education about what is happening to their data when they give it to a company.
Here are five key changes that need to happen before we can move to a new era of data usage - one where people are more willing and open to share their data:
- The industry needs to communicate best practices for backup and recovery systems - and teach users what they should minimally expect from a vendor
- The industry needs to communicate best practices to prevent hacking - and educate users about what they should minimally expect from a vendor so their data isn't compromised
- Companies need to educate users about how data travels through their systems, from data entry on a form to long-term storage in a data center
- Companies need to openly tell users the risks of sharing data with them - what could be used for identity theft, what could happen to an account if it's hacked, and so on
- Companies need to educate users about their responsibilities to keep their data safe and protected beyond passwords
How do these ideas contribute to a new type of privacy policy?
- Include diagrams: be transparent and show users where their data travels in a simple, easy-to-understand way. All users need to understand where their data is going. As technologists, we often believe users won't understand this, but they will if we explain it properly. Sometimes we want to cut corners, do something non-standard, and withhold that information from the user, as if that makes it OK. The user is trusting a company's system to store their data, and if that system isn't using industry-standard security protocols, the user should learn that during account creation and accept the risk.
- Educate users about industry data practices. Educating users about data security is like educating people about the FDIC, banking, and what insures their money. Users should understand what HTTPS is, how backup systems work, and why data protection and encryption are needed. In today's world, this is part of our baseline knowledge - it shouldn't be a magical mystery anymore.
- Have surgeon-general-style warnings about the risks of sharing data. People smoke even though they know it may kill them. Users may share data with an insecure system, but they need to understand the risk of doing so. It's only fair - it is their data. Most users don't think like a hacker and don't easily see how their data could be used against them. It is a company's responsibility to expose that risk to them.
- Tell users what happens to their data when servers and appliances are decommissioned. A laptop will hold deleted data as a shadow on the hard drive - which is why you always want to salvage your drive and keep it safe. Personal data on that drive can often be reconstructed through computer forensics. What happens to your data on a machine that is replaced? How is it wiped before the hardware is discarded? Understanding that policy is key to understanding the risk if someone steals old equipment.
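To make the decommissioning point concrete, here is a minimal sketch (in Python, with a made-up file name) of the overwrite-before-delete idea behind data sanitization. Real decommissioning works at the device level and is far more thorough - on SSDs and journaling filesystems, overwriting a file in place does not guarantee the old bytes are gone - but the principle is the same: replace the bytes before the medium leaves your hands.

```python
import os
import secrets

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Illustration only: proper media sanitization (e.g. per NIST SP 800-88)
    operates on the whole device, not individual files.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace every byte
            f.flush()
            os.fsync(f.fileno())  # push the overwrite to disk
    os.remove(path)

# A throwaway file standing in for "personal data"
with open("demo_secret.txt", "wb") as f:
    f.write(b"credit card: 4111 1111 1111 1111")

overwrite_and_delete("demo_secret.txt")
print(os.path.exists("demo_secret.txt"))  # False - file removed after wiping
```

Simply calling `os.remove` alone would leave the original bytes on disk as exactly the kind of "shadow" described above; the overwrite passes are what reduce that forensic residue.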
In this new world driven by data, it is about time that users had a clear understanding of where their data goes, what happens to it, and the risks of giving it to a company. When we invest money, we are told about the risks. When we mail a package, we are told the risks. When we cross the street, we understand the risks. However, when users give data to a company, they are only told that the company won't share their information with third parties. That's not enough. Users should know how their data is secured.
On the flip side, for a user to claim it's OK not to understand how their data flows through a company's Web site is like saying they don't need to understand how to manage their money. If you know where your dollars go, you should understand where your data goes. It is that simple.
Hi Robert! Great to hear from you and glad you liked it :) Thanks!
The apathy part of security is a tough one. At the conference I attended, some of the mobile leaders were saying that, sadly, a major security breach of personal information may have to happen before anyone takes it seriously. I hope that doesn't happen. I wonder if educating people more about where their data goes - rather than just saying, "it's safe and encrypted and we won't see your data" - would help shift that perception.
And for opt-out - I agree. Users should know what happens to their data in that situation too. We know what happens because we are in the business; I think if most people knew what we know, they wouldn't be so trusting of systems and would stay on the cynical side.
Posted by: Mary Brodie | Sunday, January 18, 2015 at 11:11 PM
Nicely done, Mary! But how do you overcome apathy? Also, even in an opt-out scenario, there is still the need to collect "essential" data as part of basic operations
Posted by: Robert Hitomi | Saturday, January 17, 2015 at 04:25 PM