The Role of Control and Trust in Privacy Compliance
Posted by Tim Walters (The Content Advisory), October 12, 2018

Dude, Where's My Car? The Role of Trust in DX

The GDPR aims to put consumers ("data subjects") in control of their personal data. Yet, the companies that use that data are called "data controllers." How can we reconcile the two, and what is the role of trust in the transfer of control from the consumer to the company?

The GDPR In Ten Words Or Less

Right at the outset of the GDPR, Recital 7 states:

"Natural persons should have control of their own personal data."

That's ten words that can easily be reduced to just eight, since "natural persons" are just people and the "own" in "their own personal data" is redundant. So: People should have control of their personal data.

I consider this to be the fundamental principle and prime directive of the GDPR. In a sense, the remainder of the GDPR (from page five of the 261-page English PDF) serves to articulate and explain the implications of this principle - i.e., if you agree that people should have control of their personal data (and, oh, by the way, you have to agree if you want to be compliant with the GDPR), here is what you should do.

In a speech at the beginning of 2017, Elizabeth Denham, commissioner of the ICO (the UK's data protection authority), noted, "People feel that keeping control of their most important information used to be simple, but that over the years, their sense of power over their personal data has slipped its moorings." Indeed, already in 2014, the Pew Research Center reported that 91% of US adults agreed that "consumers have lost control over their data." The GDPR intends to return the power to the people.

Dude, Where's My Car?

As I noted in my June Crownpeak post, control over one's personal data is all about communication and choice. The analogy I've found useful is borrowing someone's car. Normally, the owner wants to be fully and clearly informed about how the car will be used - how far it will be driven, for what purpose, when it will be returned, and so on. The borrower, in turn, takes on a number of obligations, such as taking extra care not to damage the car and ensuring that it remains safe from theft. (And of course, the borrower should not represent the car as his own property or sell it to someone else - but this is where the analogy with prevailing data processing practices breaks down!) Finally, the owner must be able to freely choose whether or not to loan their car - coerced or forced agreement doesn't count - and to have their car returned at any time.

Similarly, the GDPR includes requirements for transparent notifications and requests to access data, free and fully informed choice, and myriad data subject rights that serve to restrict or revoke permission to process personal data.

From Control to Controller

Nevertheless, whether it's my car or my data, at some point I hand it over to a person or entity and I am no longer entirely in control. Once I give you the keys or submit the online form, you have my stuff and can, practically speaking, do with it as you please. No matter how transparently you communicated your intentions, I can no longer control whether you drive my car twice as far as agreed or share my data with third parties you've never mentioned.

[A couple of provisos. First, it's true that the personal data I provide is an infinitely reproducible copy, rather than a unique object like a car. But that reproducibility in fact only increases the opportunity and danger of abuse. Second, there are numerous initiatives underway that hope to grant consumers direct control of their data even after they provide it to others and through the entire data processing lifecycle. For example, Doc Searls is promoting a number of efforts to provide "personal agency" by making each of us a first party rather than a second party when we deal with sites and apps.]

In apparent recognition of this transfer of control, the GDPR refers to the entities that determine what data is collected and how/why it will be processed as data controllers.*

In other words, despite the prime directive and the emphasis on consumers' control of their data, most data processing activities - i.e., doing stuff with my data - entail that I surrender control to a "controller." Or again: my control of my data encompasses my right to make a free, fully informed decision about whether to loan it to you, as well as my right to revoke that decision at (almost) any time. But while you have it, I will in most instances have no way of controlling (or knowing) whether you're using it in a manner that is contrary to my interests and to which I would not have agreed.

The Indispensable Role Of Trust

Precisely these two elements - uncertainty about the actions of another and potential vulnerability as a result of (some of) those actions - form the basis of trust. As one scholar puts it:

"Trust is a state of mind that enables its possessor to be willing to make herslef vulnerable to another - that is, to rely on another despite a positive risk that the other will act in a way that can harm the truster."

Consider how these elements apply to the relationship between a consumer and an organization that wants to process his or her personal data. The company can reduce uncertainty by transparently communicating how the data will be used, shared, and secured. Still, uncertainty cannot be eliminated entirely; the consumer has to trust the organization now to do in the future (only) what it says it will do. Moreover, allowing access to personal data makes one extremely vulnerable. Abuse of the consumer's trust could expose them to financial harm, reputational damage, or even the theft of their very identity.

When I loan personal data to a data controller, trust fills in for my (necessary) loss of control. In this sense, the exchange of personal data is impossible without trust.

The GDPR seems to make exactly this point at the beginning of Recital 7, when it notes "the importance of creating trust that will allow the digital economy to develop" - and the very next sentence is the prime directive on consumer control of data. This implies that control builds trust - that is, companies should grant (more) control to consumers so that consumers will (en)trust them with more personal data. But the dynamic also has to work in the other direction - namely, companies should strive to earn trust, so that consumers will grant them control over their data for the purposes of processing.

As consumers assert control and ignite a competition for their personal data, they will consider more than just whether firms will use it properly and securely. The additional and more important factor will be the value they receive in exchange for parting with their data. Consumers will use these value-for-data propositions to differentiate and evaluate the numerous requests for personal information. The perceived trustworthiness of the requester - i.e., the consumer's calculation of the firm's ability to keep its promise and deliver the stated value in the future - then becomes the decisive element that determines which firms gain access to more personal data earlier in the engagement lifecycle and begin to leverage the positive feedback loop between data and experience.

[Infographic: consumer trust over data protection]

In the traditional customer lifecycle, trust typically appears near the end - after the product or service has been purchased and has proven worthy. But in the emerging "beg data" era, creating desirable customer experiences will depend on securing consumers' permission to access increasingly scarce personal data. Those companies that can get access to more data earlier in the lifecycle will enjoy a significant competitive advantage.

The obvious question, then: how can you move trust "upstream" in the lifecycle? How can you give consumers a reason to trust you before they've used your stuff - indeed, possibly before a sales conversation has even started? It's an excellent question - and the subject of my next Crownpeak blog post.

*Eduardo Ustaran, author of the masterful The Future of Privacy, pointed out that in the UK's Data Protection Act 1984, such entities were called data users.

By: Tim Walters, Ph.D., Privacy Lead at The Content Advisory