Companies must contribute to a digitally sustainable society by ensuring fair use of data rather than focusing on tick-the-box consent

Lokke Moerel

As a globally renowned expert on data privacy and cybersecurity, Lokke Moerel spends much of her time helping some of the world’s most complex multinational organisations to confront their global privacy and ethical challenges when implementing new technologies and digital business models. Besides her role as a private practice lawyer in the global privacy & security team at Morrison & Foerster, she is also a professor of global ICT law at Tilburg University and a member of the Dutch Cyber Security Council. Here, Lokke explains why recent proposals to compensate consumers for the use of their data undermine digital sustainability.

Concerns about the excesses of the new data economy are widespread and lead to calls for new privacy legislation. The growing sentiment is that Big Tech companies disproportionately profit from consumers’ data and that consumers should ‘share the wealth’ that these companies generate from their data. To address the yawning economic gap, especially in the US, proposals are being launched to grant consumers a ‘data dividend’ and to create ‘data unions’ to negotiate payment terms on their behalf.

You strongly believe that these proposals are not the right solutions. Why?

The thinking behind ‘sharing the wealth’ is that giving individuals control over their data (by introducing privacy consent requirements) will enable them to leverage this control to gain a better economic return on their data. Although these attempts are commendable – after all, who can be against fair compensation? – the proposals are actually counterproductive. They will not address the excesses of the current data economy. The remedy here is worse than the ailment.

To illustrate the underlying issue, let’s take the example of misleading advertising and unfair trade practices. If an advertisement is misleading or a trade practice unfair, it is intuitively understood that companies should not be able to remedy this simply by obtaining the consumer’s consent. In the same vein, if companies generate large revenues with unfair data processing practices, the solution is not to ensure consumers get their fair share of the illicitly obtained revenues. That would just sustain those practices!

You also believe that putting a ‘price tag’ on personal data is ultimately to the detriment of consumers. Why?

The ‘share the wealth’ proposals measure the value of data by its value to the company. But this value may have little correlation with the privacy risks to the consumer. The value to consumers is based on the potential impact if their data is lost or misused. For example, the search history of someone still exploring their gender identity may have no value to a company, but it could cause significant distress to that person if this data becomes public.

The impact also depends on the combination of elements: an email address in itself is not sensitive, but if it is leaked in combination with a password it may lead to fraud or identity theft, especially because many people still use the same password for multiple accounts. And in one recent case, email addresses suddenly became very sensitive when they were leaked from a website that helped students to cheat in their exams.

But even settling on a single method for valuing data to a company has proven impossible – and believe me, the experts have tried! The value of data to a company depends on the relevant use, which may well differ between companies. Health data will be relevant for a pharmaceutical company, but less so for a company that wants to sell new TVs.

Furthermore, the real value is not in the raw data, but in the insights derived from analysing combinations of data elements or from aggregating insights across all customers. For example, analysis of a consumer’s food purchases can lead to predictions of who will get diabetes in 10 years’ time – but how do you price that? The bottom line is that there are too many variables. There are so many combinations of data elements, so many use cases, so many potential future use cases and so many possibilities of potential harm that any price tag on data is bound to be to the detriment of the consumer. All of this illustrates that privacy protection should not be thought of as a right that can be traded or sold, and that consumers will be short-changed by any proposals to compensate them for the use of their data.

Isn’t that why the GDPR was introduced in Europe? To ensure that data is only used with the individual’s consent?

Yes, that was the idea. However, the realisation has set in that the consent-based model of privacy protection, including under the GDPR in Europe, also has its flaws. The assumption of the consent model is that as long as you tell me what data you collect from me and for what purposes, I can make an informed decision.

The underlying logic of data processing operations and the purposes for which they are used have now become so complex that they can only be described by means of intricate privacy policies that are simply not comprehensible to the average citizen. Furthermore, the incentives for online service providers to obtain your data are so high that they use all kinds of tricks to ‘encourage’ you to give consent (or make it difficult to opt out).

I think we’ve all clicked on ‘accept all cookies’ because it is simply too much effort to find out how to reject them (which is bizarre when you consider that the company needs your consent – so why should you have to reject it in the first place?). Many of these techniques can be seen as unfair trade practices. And it’s not just rogue traders doing this – even very reputable companies are ‘tricking’ people into consenting to the use of their data in a similar manner. It’s as if all morals have flown out of the window in the digital environment!

So how can we create a fair digital society?

Our data protection laws have resulted in what I have coined ‘mechanical proceduralism’; organisations go through the mechanics of notice and consent without any reflection on whether the relevant use of data is actually legitimate. We even see this reflected in the highest EU court having to decide whether a pre-ticked box constitutes consent (surprise: it does not).

Privacy legislation needs to regain its role of determining what is and what is not permissible. We need to have the difficult discussion around where the red line for data use should be, rather than passing the responsibility for a fair digital society to individuals to make choices they cannot fully comprehend. That’s the essence of digital sustainability.

How can digital sustainability be beneficial for companies as well as for consumers?

The media industry is a good example. For a long time we have heard that the end of cookie walls (which force users to accept tracking cookies before they can access content) would be the end of ‘free’ online news, which has been financed by personalised advertising facilitated by the global ad-tech ecosystem.

Now that cookie walls are no longer allowed in the EU, we suddenly see that the online news sites are shifting to contextual advertising instead of personalised profiling. This means that the advertising is aligned with the content you are reading rather than your personal profile. And guess what? The Dutch public broadcaster, for example, actually saw its advertising income increase by 30%. To be honest, that doesn’t surprise me.
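To make the contrast concrete, here is a minimal sketch of contextual ad selection – in Python, with an invented ad inventory and a deliberately naive keyword matcher. Nothing here reflects any particular publisher’s system; the point is simply that the only input is the content on the page, so no cookie, user ID or browsing history is needed.

```python
from collections import Counter

# Hypothetical ad inventory: each ad is tagged with topic keywords.
ADS = {
    "bicycle-insurance": {"cycling", "commute", "insurance"},
    "cookbook-promo": {"recipe", "cooking", "food"},
    "travel-deals": {"holiday", "travel", "flight"},
}

def extract_keywords(article_text: str) -> Counter:
    """Naive keyword extraction: lowercased tokens counted by frequency."""
    tokens = (word.strip(".,!?:;").lower() for word in article_text.split())
    return Counter(token for token in tokens if len(token) > 3)

def select_ad(article_text: str) -> str:
    """Pick the ad whose topic tags overlap most with the article's words.

    Note what is NOT consulted: no cookie, no user profile, no browsing
    history. The page content itself is the only signal.
    """
    keywords = extract_keywords(article_text)
    return max(ADS, key=lambda ad: sum(keywords[tag] for tag in ADS[ad]))

article = "A new recipe trend: slow cooking is changing how we think about food."
print(select_ad(article))  # -> cookbook-promo
```

A production system would use trained content classifiers rather than keyword counts, but the privacy property is the same: the match is made to the article, not to the reader.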

The boundaries between private life and business life are becoming increasingly blurred, and most of us have probably experienced seeing online advertisements based on our search history in unexpected and sometimes inappropriate settings. That makes us feel uneasy. Is that the basis on which you want to build trust among your customers in today’s data economy? I think not! Privacy really pays.

Do any examples spring to mind of organisations that are already putting digital sustainability into practice successfully?

A number of high-profile organisations – such as the Dutch newspaper NRC – have changed their business models and shifted away from tracking cookies to AI-supported contextual advertising. This didn’t happen overnight: they had grown used to having their advertising delivered automatically by the ad-tech system, so they first needed to develop their own technology, build new sales expertise and regain their closeness to the market.

But it’s definitely more beneficial in the long run and is more digitally sustainable too, because they have cut out the middleman and are no longer dependent on third-party data. In fact, NRC is now generating more advertising revenue than before! Again, this demonstrates that the end of tracking cookies doesn’t necessarily mean the end of free news – it just means that companies have to be more creative. And that actually holds true for all companies, not just media businesses.

A digital sustainability mindset means building that all-important trust by being open and transparent about your intentions and ensuring users share the benefits. Users can spot self-interest from a mile away, which undermines trust. Even in the face of new European regulations and initiatives such as the EU Digital ID Wallet, which will make it harder for businesses to gather and combine data, digitally sustainable companies will succeed in obtaining consent if the opportunities they create are really also in the interest of the customer.
