Last week Pew Research published the results of a study on how privacy and sharing personal information influence users’ adoption of different technologies such as social media, smart thermostats and retail loyalty cards. Users were asked about specific privacy and data trade-offs they would make in order to use a specific device or service. The results were interesting in their negativity and also in the suspicions users had about what data was being collected and how it was being used.
One of the scenarios explored was auto insurance and the use of a tracking device. Users were asked whether they would use a device that monitors driving speed and location and collects data about personal driving habits, in return for possible discounts on insurance coverage. 37% of users said that trade-off was acceptable, but 45% said it was not. The acceptance numbers are even lower in real life. Progressive claims that while 80% of its customers could potentially benefit from using its tracking device, only about 25% participate, even though customers need to use the tracker for only six months. Likewise, Allstate says the adoption rate of its newer, smartphone-based tracker is around 30%. Part of the reason for this slow adoption, speculates the Wall Street Journal, is a fear that insurance companies will use the data to raise rates. In fact, although until 2014 Progressive did not penalize drivers whose behavior it deemed risky, it now says it could raise their rates by as much as 10%. Finally, Progressive’s CEO Glenn Renwick admits, “Insurance is not something where people say, ‘I trust you,’” and drivers just don’t want insurance companies having that data.
The interesting takeaway from both the research and the tracking-device adoption rates is that in both cases the trade-off between giving up personal data and receiving a benefit or service was made clear to the user, who then made an informed decision. The problem, also reflected in the Pew report, is that most of the time users are not aware of the hidden privacy costs of a given product or service, and they are becoming wary of both existing and new offerings. From the report: “One of the most unsettling aspects of privacy issues to many of the focus group participants is how hard they feel it is to get information about what is collected and uncertainty about who is collecting the data.”
This isn’t surprising. Upon signing up for a new service or installing a new smart device, users are asked to agree to a privacy statement. This is invariably a long legal document in which the “what data is collected” section is usually broad and vague, and what is done with that data is vaguer still. There is usually no option to use the device or service without providing access to all of that data. For example, Nest’s thermostat privacy statement says it collects “data from several sensors built into the Nest Learning Thermostat. These sensors collect data such as current temperature, humidity and ambient light in the room. They can also sense whether something in the room is moving.” Users cannot opt out of just the motion sensor; they have to accept them all. Interestingly, 55% of the Pew Research respondents to this scenario said it was an unacceptable trade-off.
There is also a sense of powerlessness when devices update at will, without user approval. Staying with Nest, a recent software glitch caused many customers’ devices to turn off, leaving houses unpleasantly, and in some cases dangerously, cold. Were customers aware that by allowing automatic updates they could end up with an inoperable thermostat? The benefit they wanted was a learning thermostat with remote control, not a motion tracker that could shut down at inopportune moments.
Finally, there is also a question of cost. Users are more likely to agree to a privacy trade-off when a service is free. Social media and email are just two examples given by users where they find it more acceptable to provide personal information in return for a service. In comparison, the Nest thermostat costs $249 and nowhere on the purchase page (aside from the tiny privacy link on the bottom) does it say that users will need to agree to the Privacy Statement to use it or that they’ll need to download an app that requires access to Identity and Contacts. Is it an intentional omission or an innocent assumption that users are aware of these requirements?
What are the implications for product managers? First, there is a real need to make these trade-offs clearer to users. Hiding them in a statement users rarely read may make lawyers happy, but it doesn’t promote trust. The trade-offs also need to be made clear during the purchasing process, not presented as a fait accompli during setup. Second, collect only the data necessary to make the product work, and explain why that data is collected. Third, provide options to disable certain features if users are uncomfortable with the data needed to power them.
The Pew research has shown that a significant number of users are uncomfortable with data trade-offs they are required to make in order to use different products and services. In the end, if not addressed, this unease, whether warranted or not, will slow down adoption of new products.