
Privacy: the challenge of putting the customer in control

Disclosure of personal data is increasingly seen as a part of modern life. In Europe, 74% of people regard disclosing personal information as necessary to access online services, particularly online shopping (79%) and social networking and sharing sites (61%).

However, 72% of internet users in Europe still worry that they are being asked for too much personal data online (see Special Eurobarometer 359, Attitudes on Data Protection and Electronic Identity in the European Union).

They feel they are not in control of their data. Putting the customer in control is therefore a key priority for regulators (the proposed new European Regulation on Data Protection) and, increasingly, for companies whose business depends, or will depend, on customer data.

Last year, at Telefónica I+D we conducted ethnographic research to test two basic tenets of the current, widely accepted approach to online privacy(*):

  • Giving users transparency and control over their personal data will increase their trust in services and service providers
  • Increased trust will motivate users to accept new ways of sharing data, while reducing churn and reputational risk

What we found is that increased awareness by itself does not represent added value for users, nor does it necessarily result in higher confidence in the service. Quite the contrary:

  • Awareness can increase the perception of risk. In fact, it decreases trust in the service and the service provider when users cannot answer basic questions for themselves: what data is collected, how, why and for what purpose. Trust drops even further when users discover unexpected uses of their information.
  • Control can increase the perception of work. Managing data is not necessarily seen as a valuable feature. When using a service, users seek to maximize its benefits, not to manage the data associated with it. This is particularly challenging for the average user, who, even when presented with answers, lacks the basic technical knowledge to interpret them.

Users perceive extra value and higher trust only if they clearly understand the causes, consequences and compensation of allowing the use of their data. Cause and effect must be clear to users so they can recognize the value of, for example, personalized content. A perceived risk can be outweighed by a perceived high value: if users think the benefit is worth the chance, they may ignore the risk, or willingly adopt behaviours they know to be risky.

Users are more willing to share data when they expect to:

  • receive free or discounted services
  • receive a faster delivery of service
  • increase social reputation
  • socialize and connect

 

On the other hand, users are more reluctant to share data whenever they perceive a risk to their:

  • personal or group intimacy
  • economic welfare
  • physical safety
  • social reputation

 

The research also identified a set of variables that describe privacy attitudes more accurately than standard demographic profiles:

  • Need to exchange data (contextual, work-related, social)
  • Frequency of use of digital services
  • Level of awareness
  • Perception of control
  • Type of reaction to threat/risk
  • Perception of risk
  • Level of media influence

 

Using these variables, the research team was able to define four distinct profiles according to their attitudes toward privacy and security in relation to digital services:

  • Connected: Intensive users of digital services, they use a wide range of them, both social and professional. They are the most aware profile concerning privacy and security in the digital world, with more or less sophisticated mental models of how data, privacy and security work, and specific strategies to avoid problems or minimize risk. They feel in control and work on reducing risks, while at the same time accepting the inherent risk of the digital world.
  • Digital Life Beginners: Users without long experience of digital services, typically accessing them only from a PC. The main services they use are social networks and discount services. They have only a basic understanding of how privacy and security work in the digital world, which leaves them unsure about the consequences. They have no strategies to avoid problems or minimize risk, but ask friends or relatives for help when privacy or security could be compromised.
  • Conventionals: Users with few skills and little experience in the digital world, who make basic use of the Internet, such as checking their bank accounts or email. They have no clear mental model of how the digital world works and they perceive security and privacy risks; above all, they fear economic damage. They use very rudimentary protection strategies, need friends to help them, and delegate complex tasks to those helpers.
  • Entertainment Seekers: Users who focus their Internet activity on watching videos and playing videogames. They have very little awareness and are very carefree: they do not weigh the risks or consequences for privacy and security. This disregard of consequences leads them to adopt risky behaviour patterns and to lack, or fail to develop, protection strategies.

 

Customer segmentation is not at all new to business and economics, but it is an acute need when dealing with digital services today, and specifically with data-based services. Given that the window of opportunity to become a player in customer data exploitation is now, understanding customer attitudes toward privacy and security in the digital world is a must.

 

There is an additional finding that poses what we consider a particularly relevant challenge, especially in light of the new European Data Protection Regulation recently put forward by the European Commission.

 

One of the Commission's proposals obliges companies and data controllers to notify data breaches without undue delay (where feasible, within 24 hours) to both the data protection authorities and the individuals concerned. This sounds sensible, but what users want is to feel, and to know, that their data is safe when they use digital services. And whenever unexpected consequences arise, users should be able not only to understand them but also to know what to do to correct them, whatever their skill level.

 

(*) Special thanks to the UX team for their contribution to this work; we would particularly like to mention Pamela Mead, Carlos González de Herrero and Ricardo Márquez.

 

 

Francisco José Jariego Fente

Enablers & Technology Director

Telefónica Digital

Twitter: @fjjariego

 

 

José Enrique López García

Product & Biz Specialist

Telefónica Digital

Twitter: @jelg0303

 

 

Silvia Cabanillas

Leader of the initiative Customer in Control - User Modelling

Telefónica Digital

Twitter: @scabanillas
