Companies should ask themselves how they could go a step further to truly embody the values of GDPR
Julia Janssen is an artist, designer, public speaker and tutor at ArtEZ University of the Arts in Arnhem, the Netherlands. Through her creative installations and performance art, she strives to make the digitalisation of society – including concepts such as digital identity and data sovereignty – tangible in the physical world. Here, she explains why and how businesses can benefit by taking proactive steps to improve their digital sustainability.
You describe your mission as “to inspire and empower people to access better fundamental rights in the digital world”. How do you go about that?
Since my days at art school, I’ve been fascinated by the relationship between humans and technology. Why are people willing to give away so much information about themselves on the internet? Why do certain fundamental principles from the physical world not seem to apply in the digital world? For example, there are ingredient lists, cooking instructions, health warnings and disclaimers on food to keep us safe, but in the digital world there are no warnings about the potential risks and impact of sharing your data. So how can we safeguard concepts such as fairness, equality, autonomy, freedom and democracy in a data-driven society?
First of all, it’s important to make people aware of all these issues. But the beauty of art is that it also helps you to create movement. So for me, it’s not just about helping people to understand what’s going on, but also about making them want to become part of the movement towards change… towards enjoying the benefits of the internet and connectivity without having to abandon our autonomy. Therefore, it’s important for my creations to have multiple layers; the overall message needs to be clear, accessible and easy to grasp for a broad audience in a matter of seconds. But it must also be possible for consumers, professionals and politicians to dive deeper and think more closely about the underlying implications if they want to.
Privacy and data sovereignty are pretty dry and dense topics, so I try to introduce a touch of humour or irony into my work. And of course an interactive element helps people to experience the digital world in a real-world context. For example, I discovered that one click on ‘Accept all’ (taking just 0.0146 seconds) was enough to accept 835 different privacy policies. So I’ve printed them all out in a huge book and I ask members of the public to read them out loud in interactive performances – reading the whole book would take over 400 hours.
The contracts are endless sentences of legal jargon that are almost impossible to understand, which makes a mockery of the idea of ‘informed consent’. In another interactive performance at a conference, I sold the book with a note next to it saying ‘By buying this book, you automatically agree to being tracked for the rest of the day’, but none of the customers seemed to care – until they realised I had hired people as ‘Trackers’ to literally follow them around all day, taking notes and photos. It was a great way of giving a physical impression of today’s very opaque online consent systems, and of showing that we tolerate things in the digital world that we would never tolerate in the physical world.
What does ‘digital sustainability’ mean to you? And how does that fit in with your work?
I think there are several angles to digital sustainability. Firstly, there seems to be an unstoppable hunger for data because organisations think that more data always means better results or insights. This is resulting in an ever-growing mountain of data which, on a very basic level, has an environmental impact due to all the servers needed to store it. As an extension of this, the infrastructure for processing and sharing all that data is overly complex and inefficient, so lots of data is collected unnecessarily or wasted, which of course doesn’t fit with the ‘sustainable’ use of resources.
On another level, I think that today’s debate about people’s online rights is too often reduced to ‘privacy’, but for me that’s not the right word to describe the real issue. It’s not just about whether a person has anything to hide, but also about associated values such as autonomy, freedom of choice, equal opportunities and democracy.
So digital sustainability is about ensuring that the collection, processing and exchange of data does not interfere with any of these values. Much of the problem stems from how data is combined to build detailed profiles of people for personal targeting purposes – and then algorithms are let loose which can influence people’s choices at an individual, commercial and societal level… to the extent that even elections can be manipulated. The problem is that the system is so opaque that you don’t know what information the algorithms are preventing you from seeing.
We tend to think that artificial intelligence is inherently objective and neutral, but the fact is that algorithms are trained on traditional or historical patterns – and humans are part of that process, so human bias can be built into them. For example, LinkedIn uses an algorithm to target job vacancies towards potentially suitable candidates, based on personal profiling as a predictor of future success. If the algorithm has been trained on one-sided datasets, people who don’t have the ‘right’ profile – whether due to their age, gender, appearance, background or whatever – won’t even get to see the vacancy.
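To make that mechanism concrete, here is a minimal sketch – with purely hypothetical data, field names and thresholds, and in no way LinkedIn’s actual system – of how a ‘predictor of future success’ trained only on one-sided historical outcomes ends up hiding a vacancy from anyone outside the historically favoured group:

```python
# Illustrative only: a toy "candidate matching" model trained on one-sided
# historical data. All records, fields and thresholds are hypothetical.
from collections import Counter

# Hypothetical historical outcomes: who was hired for similar vacancies before.
historical_hires = [
    {"age_band": "25-34", "hired": True},
    {"age_band": "25-34", "hired": True},
    {"age_band": "35-44", "hired": True},
    {"age_band": "45-54", "hired": False},
    {"age_band": "55-64", "hired": False},
]

# "Training" reduced to its simplest form: hire rates per group.
hires = Counter()
totals = Counter()
for record in historical_hires:
    totals[record["age_band"]] += 1
    hires[record["age_band"]] += record["hired"]

def predicted_success(age_band: str) -> float:
    """Score a candidate group purely from past outcomes."""
    return hires[age_band] / totals[age_band] if totals[age_band] else 0.0

def show_vacancy(candidate: dict, threshold: float = 0.5) -> bool:
    """The vacancy is only shown if the historical pattern says 'likely hire'."""
    return predicted_success(candidate["age_band"]) >= threshold

# A candidate in the 55-64 band never even sees the ad, because the past data
# contains no successful hires in that band - the old bias is now automated.
print(show_vacancy({"age_band": "55-64"}))  # False
print(show_vacancy({"age_band": "25-34"}))  # True
```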
The LinkedIn case is just one example of how a personalised reality can result in hidden discrimination that can have huge consequences for a person’s future. Facebook is another example; it drags endless amounts of data into its system from way beyond its own platform – by placing tracking pixels, through third-party authentication, or simply by buying up companies, as it did with Instagram.
To make people more aware of this, I’m working on a new art installation called ‘Dear Data, how do you decide my future?’, based on my analysis of Facebook’s profiling structure. I’m visualising this infrastructure using over 3,500 ping-pong balls, each one representing one item of data. The whole installation will be at least 4 metres long when it’s finished, and I will display it at the Lowlands music festival later this summer. It will be part of an interactive installation where people can also play games relating to how the categorisation and classification of datasets can influence life choices and chances.
On a third level, most websites and apps are designed to maximise the amount of time people spend on them, so that they can gather more data, build richer profiles and target people more specifically. That’s bad for people’s physical health, but I also feel that it’s making us mentally obedient to the technology; we’re becoming less mindful of our own autonomy and less able to think for ourselves. Humanity is degrading as a species, which of course is not a sustainable outlook for the human race as a whole.
What is the general level of awareness of digital sustainability and how is this changing?
I found it interesting to see that the government-led COVID-19 apps sparked such concerns about privacy and surveillance issues. It’s great that people are starting to think about these issues, but it was as if they didn’t realise that they should have the same concerns about every app they’ve been using for years. And it’s not so much the government they should be concerned about, but the capitalist Big Tech companies that are harvesting data purely to satisfy their own commercial interests.
But awareness definitely seems to be growing; more and more people tell me they are uneasy about simply clicking ‘Accept’ but feel trapped into giving their consent because the only alternative is not to use the service. There are some options, such as using Signal instead of WhatsApp; it’s not really the answer, but if enough people switch it will send a clear message to the industry about what they want.
And on the upside, most people I meet – whether consumers or professional organisations – are eager to enter into discussion about the themes expressed through my work. It’s good that the European Commission is now taking steps to introduce legislation giving people more control over their data, but this will take some time and isn’t the only solution. In fact, I believe that companies really stand to benefit by showing that they are taking a different approach and proactively offering fairer and more transparent alternatives. I think they will win people’s loyalty and gain a big fan base.
What are the biggest challenges for companies and organisations that want to take a more digitally sustainable approach and how can they overcome them?
I think many companies feel constrained by how the current system is set up and trapped by their existing revenue models, partnerships and contractual obligations. It takes courage and effort to strike out and do things differently, and it’s not without risk, so it’s often easier to just mimic what everyone else is doing.
However, there is growing regulatory and societal pressure for companies to adopt a more digitally sustainable approach and the technology and infrastructures are increasingly available to facilitate this. Companies can start by taking small steps, such as looking at how they ask for consent. Are users manipulated into clicking on the big, bright green ‘Accept all’ button instead of the more subtle (or non-existent) ‘Change/Reject’ button? If so, change it!
And companies should be honest with themselves and re-evaluate precisely what data they gather and process and why – do they really need to profile people so deeply in so many segments? And should they be sharing that data with third parties? My own personal recommendation is to immediately stop tracking live location data if you don’t actually need it.
By the way, many companies justify what they do by saying that they are in compliance with GDPR. Firstly, they often aren’t – because GDPR stipulates that consent must be freely given, specific, informed and unambiguous, in other words genuinely voluntary. As I’ve already mentioned, there are many cases in which privacy statements fail on some if not all of these points. And then there’s a difference between what is ‘legal’ and what is ‘ethical’. Even if companies are compliant with GDPR on paper, they should ask themselves how they could go a step further to truly embody its values in practice.
Which companies or organisations have caught your eye by putting digital sustainability into practice successfully?
I’m impressed by the Dutch start-up Pacmed, which, against the backdrop of staff shortages and rising demand for healthcare, is using machine learning to support decision-making by medical professionals. In anticipation of GDPR, the system was designed from the ground up with a sustainable approach to data assets in mind, so the solution revolves around providing access to the data rather than ownership of it. A growing number of systems in other industries work in a similar way, so that the data owner retains control.
Meanwhile, in the media sector, I’m encouraged to see a number of outlets – including Ster (which sells advertising on behalf of the Dutch public broadcaster NPO) and the New York Times – replacing personally targeted advertising with contextual advertising. It seems that their revenues have actually increased, which shows that it can be worthwhile for businesses to explore alternatives.
Lastly, there is still a lot of work to be done to make technology and algorithms less discriminatory by design. In this context, I’m very excited to be working on a new creative project commissioned by the Dutch Ministry of the Interior. Together with a team of lawyers, my task is to visualise how technological discrimination influences people, and I hope that the result will play a part in raising awareness and inspiring action to overcome these challenges in the future.