5 Questions about Digital Sustainability to… Prof Dr José van Dijck


José van Dijck has been a Distinguished University Professor of Media and Digital Society at Utrecht University since 2017. She is a leading researcher in media, social media and media technologies. In this interview, she discusses digital sustainability in the context of the ‘platform society’ and emphasises the importance of focusing on public values rather than commercial or corporate ones.

“Businesses and organisations can shoulder their responsibility for digital sustainability by changing the way they think. Consider the bigger picture!”

What does the term ‘digital sustainability’ mean to you?

What I like about the concept of ‘digital sustainability’ is that it is broad enough to cover many different aspects of life in the digital world. Of course, as with ‘traditional’ sustainability, the environmental aspect is very important. After all, the internet already accounts for around 4% of all global greenhouse gas emissions. Due to the cloud computing power and data centres needed to support emerging AI tools, this is expected to rise sharply – to as much as 20%, according to some estimates. Therefore, to support a digitally sustainable world, tech-related investments should go hand in hand with energy reduction as a strict goal.

From a broader economic perspective, digital sustainability refers to the responsible implementation of digital tools in all kinds of sectors – from healthcare and media to mobility and finance – to serve public values rather than only commercial or corporate ones. In healthcare, for example, a focus on efficiency goals could push the use of digital apps and robots. While these can be supportive tools, we should not lose sight of the fact that such aids can never replace the need for human care.

In addition, digital sustainability has relevance in terms of social and ethical responsibility, because it can help us to not only preserve but also actively add public and human value in the digital world we’re shaping. This can be done by ensuring that algorithms and data are always used in compliance with public value requirements. I see a strong overlap between all these aspects and the UN’s Sustainable Development Goals (SDGs). Therefore, I think they can serve as a good framework for harnessing the tools of digital transformation in a digitally sustainable manner.

How does digital sustainability tie in with the book that you co-wrote, The Platform Society: Public Values in a Connective World?

In our book, we very much focused on public values (privacy, security, transparency, accountability, fairness/non-discrimination, democratic control, autonomy, sustainability) as a basic point of departure for designing and developing digital platforms, infrastructures and policies. We organised the book around the ‘platform society’ as a contested concept. In other words, public values are not a given – you cannot buy them off the shelf; instead, they are contested between actors with varying – and sometimes competing – interests. For example, corporations that are interested in creating economic value by gathering data to improve efficiency are increasingly clashing with governments that want to protect the public values of privacy and security.

In the book, we looked at both the public and private sectors by considering four different areas: education and healthcare (public), and mobility and news (private). As the global ‘Big Tech’ players gained more power, social traffic increasingly ran via their online platforms. Their interfaces, reputation systems and algorithms linking supply and demand were steering the design of society. We examined the potential consequences for public interests in terms of the accessibility, safety and affordability of public transport, pluralism in journalism, and autonomy in the organisation of education.

What have been the most significant changes since then?

Our book came out in 2018 – that’s six years ago, which seems like eons in the fast-paced digital world. In fact, we wrote the original book in Dutch in 2015. When we decided to publish an English-language version for a global audience, we realised so much had changed in the subsequent three years that it wouldn’t be enough to simply translate it. We ended up pretty much rewriting the whole thing! So while our book was quite innovative at the time, obviously there have been several big changes since then.

First and foremost, there have been major legal and regulatory advances, especially in the EU. The first one was introduced in 2018, when the GDPR forced the Big Tech platforms to comply with privacy regulations. Since then, other EU frameworks have been or are being implemented, such as the Digital Services Act (DSA), the Digital Markets Act (DMA), the EU Data Act, the EU Artificial Intelligence Act and the Digital Operational Resilience Act (DORA). All of these legal frameworks set clear boundaries and will therefore be very important in terms of not just protecting but also redesigning and reshaping the public platform society.

Secondly, we have seen huge geopolitical changes. In the final chapter of the book, we predicted the emergence of a strong Chinese bloc of platform ecosystems to rival the American ‘big five’ (Google, Amazon, Facebook, Apple and Microsoft). This bloc of several big players – Baidu, Alibaba, Tencent and ByteDance (TikTok) – not only emerged, but it did so almost immediately, much faster than we expected. This, together with the further concentration of power among the big five, has intensified the geopolitical contest between the USA and China.

This has been coupled with huge technical changes. The emergence of AI and ChatGPT last year signalled a new technological breakthrough in a field where Europe has lagged behind, both technically and from a regulatory perspective. Now, with AI on the rise, we need to re-evaluate what is at stake in the platform society. After all, AI is already woven into all kinds of digital tools and across sectors. 

I find it helps to view the platform society as an ecosystem with three layers of power, and I often use the metaphor of a tree. The roots are the ‘wiring’ layer of the internet – the hardware and infrastructural services supporting digital communications. The trunk is the powerful intermediate layer encompassing ‘general purpose’ services, such as search engines, app stores, social networks, payment systems, identification services, navigation and advertising. These are overwhelmingly operated by the big five platforms. The branches form the final layer; I call these ‘sectoral’ platforms – software services designed for a particular industry or sector. The big five have increasingly infiltrated almost all sectors of society, both public and private, and these branches are all becoming dependent on them.

In this metaphor, we could also see the data flows as ‘oxygen’ for AI, and algorithms as the ‘water’ that is absorbed, feeding the tree from bottom to top. And because the big five are now involved in all three layers of the tree, they can start vertically integrating data flows, which gives them unprecedented power in the platform ecosystem. But how can we make our world digitally sustainable when just a handful of corporations are in charge rather than democratic state systems? That’s why this ecosystem-based thinking is so important, so that we understand the complexity of the platform society and how it is affecting all our lives – and how to fix it.

Importantly, from an EU perspective, while we have been strong in regulation, we have been weak in developing our own alternatives in the ‘trunk’ layer of the tree – the all-important layer connecting the ‘wiring’ with the ‘branches’. The new acts may fill some of the gaps for the sectoral companies. However, in view of the contested infrastructure in today’s global digital ecosystem, all of Europe’s political, economic and social concerns should be anchored in concern about digital sustainability.

How can businesses and organisations play a role in improving digital sustainability?

In the context of environmental sustainability, it is becoming increasingly normal for individuals to take responsibility for the planet’s future through their own actions: eating less meat, changing the way they travel, and so on. We now need to create the same kind of awareness and sense of responsibility for shaping a digitally sustainable future. Individuals can contribute by using digital tools such as AI more mindfully, so that they are not unnecessarily consuming energy and natural resources that could be better used for other purposes. 

Corporations and businesses can shoulder this responsibility by changing the way they think. When developing a digital product, don’t just treat it as a tool to be marketed, but consider the bigger picture. From an environmental perspective, this means taking steps to reduce how much energy and/or water is consumed by digital infrastructures. Since AI is set to dramatically increase the use of natural resources, as I mentioned earlier, developers of AI and other data-intensive apps should make particular efforts to develop solutions that are ‘green by design’.

Privacy is another important public value in the context of digital sustainability. The GDPR has helped to improve awareness of privacy, but ‘privacy by design’ has still not become commonplace. And privacy doesn’t go far enough, in my opinion; we should be focusing on ‘democratic control by design’. For example, my insurance company recently offered me a personalised premium, but when I asked about the variables they’d used, they wouldn’t (or couldn’t) tell me. “It was simply the algorithm,” they said. So how can they be held accountable? You should be able to explain algorithms to citizens and consumers; only then can data truly be used with fairness and transparency. I truly believe that companies that adjust to this new way of thinking stand to benefit as society’s need for trust and expectations of openness continue to evolve.

The responsibility for shaping a digitally sustainable future doesn’t only lie with individuals and corporations. If companies aren’t willing to open up their systems so that people can ask “Why?”, then governments should put frameworks in place and force companies to adhere to those standards. In fact, by enforcing these legislative frameworks, the EU is really making a difference in how these systems are implemented, which is empowering organisations that want to adhere to digital sustainability principles.

Moreover, especially in this era of ‘smart cities’, civil society actors such as municipalities need to become more mindful of their goals when implementing digital tools and technologies. Instead of thinking about what is actually needed to add public value, they have tended to take whatever the market is offering. They should bear in mind that many tools are primarily offered in order to generate data flows that sustain companies’ commercial values, rather than having citizens’ interests and welfare at heart.

Can you share some positive examples of digital sustainability in practice? 

Public values are increasingly part of the digital repertoire of public bodies when deciding on the acquisition and deployment of platforms, apps or tools. Universities, schools and public-sector organisations have become more aware of the need to protect privacy, security, autonomy and so on, as part of their societal identity and social task. For example, I was recently pleased to see that a university procurement procedure not only included public values as one of the conditions for suppliers, but also made them a precondition for submitting a proposal. It’s an encouraging sign that such conditions are becoming a priority for organisations rather than an afterthought.

Meanwhile, it’s no secret that the Big Tech platforms are increasingly nestling themselves in school classrooms as part of the trend towards the personalisation of educational tools. I am not a great fan of this, to be honest. Although I understand the desire to enhance education at an individual level, I believe school should be a place where children learn to act as part of a community and develop social skills. There is an element of risk attached to only empowering children to do better as individuals; we also need to invest the same amount of effort into group dynamics and a sense of belonging. Therefore, I welcome the news that governments in various countries are banning the use of mobile phones in classrooms, and even sometimes at break times, in order to stimulate social interaction. 

Additionally, I was heartened to hear that Dutch schools recently joined forces to take a stand against the Big Tech firms. Following a Data Protection Impact Assessment (DPIA) under the GDPR, the Dutch government imposed new conditions on the use of Microsoft and Google in classrooms, and within a month their offering had changed for the better. This is a great example of how we don’t have to be ‘rolled over’ by the big platforms. Shouldering your responsibility starts with critical thinking about how digital tools are really adding value for the public good. And if they’re not, it can help to build coalitions and work together to leverage your negotiating power. There’s still time for us to effect real change.


Photo: Julie Blik
