Geoff Mulgan

The Decline of Transparency. Is the World Becoming More Opaque?

Have we passed peak transparency? Are we moving into an era when more of the things that shape our lives will be opaque? And if so, how should we cope? In this GB Geo-Blog post I argue that something fundamental is, indeed, changing in the world – from technology to business and geopolitics – and that ideas that served us well in the past need to be updated.

Transparency as an Unalloyed Good

Most of us were brought up to believe that if only we could make things more transparent, the world would become a better place. Freedom of information laws; open data; free media – all were part of progress. They helped to ensure that governments and businesses served us better.

These ideas built on strong foundations. In the 19th century, Jeremy Bentham had argued that “without publicity, no good is permanent; under the auspices of publicity, no evil can continue”. John Stuart Mill advocated “unbounded publicity” and believed that a “marketplace of ideas”, with free and open competition, would bring the world closer to truth. The science system grew up around a parallel principle: that any new idea had to be published and opened up to commentary and criticism by peers.

And in governance, many took to heart Louis Brandeis’ famous comment that “sunlight is … the best of disinfectants”, such that it has become an article of faith that more transparency reduces corruption and abuses of power. Similar principles influenced global affairs. Back in the 1990s, the diplomat and theorist Robert Cooper argued that the European Union was pioneering what he called a “postmodern world order” in which security (and prosperity) flowed from mutual openness, whether in arms, trade or money. Mutual visibility was the key to security. If you could study your potential enemy’s troop movements, or nuclear weapons, you were at less risk of attack.

Cooper acknowledged that much of the world was in a different mental space. The big ‘modern’ states like Russia, China, India and the US still believed that security came from power and secrecy, and in the ‘pre-modern’ areas, functioning governments had collapsed. But the hope was that, over time, the post-modern ideal would spread through magnetic attraction.

Many fields of life seemed to confirm this worldview and the idea that if you design things rationally, with transparent and unambiguous instructions and input from a wider community, good things will result. That applied to engineering and software, and also to constitutions. Secrecy, murk and mist were the enemies of progress.

The Trends to Opacity

Much of this remains true, and I spend much of my life in the world of rational design, evidence, mapping and analysis. The methods of collective intelligence for which I advocate mobilize millions to improve science, policy-making or product design.

However, I fear that these assumptions now risk being fundamentally misleading. They no longer explain where the world is going and how the 21st century is likely to unfold.

The first field where this is evident is technology. The technologies on which we depend every day are ever-less comprehensible and ever-more like black boxes. Artificial intelligence is becoming harder to understand, even for the programmers. What happens in a Google search or translation algorithm is now far beyond the capacity of any one brain to understand. Algorithms now learn as much as they are programmed, and they interact with each other in ways that are hard to map.

The European Union is trying to legislate that any AI algorithms should be both transparent and explainable. But it has become clear that this may be very hard to achieve. Blockchain and crypto are also striking examples – understood only dimly even by some of the most successful entrepreneurs and investors (who will only admit this at 2am after a few drinks).

Many past technologies were quite complex: few can explain how a nuclear power station or even a modern car works. And yet the new opacity is of a different order: technology is becoming ever-more like magic, and ever-less like a neat engineering diagram.

A second example is government and public policy. The turn of the century brought big strides for transparency around the world: Freedom of Information laws spread to dozens of countries, and were then followed up with moves to open up data in machine-readable form. The Open Government Partnership, at its peak, had nearly 80 governments signed up. Meanwhile, collective intelligence methods have helped citizens to investigate abuses, whether using satellite images to monitor human rights abuses in Xinjiang or Myanmar or mobile phone data to track down the failed assassins of Alexey Navalny.

However, over the last decade, things have also gone into reverse. FoI laws are now viewed much more sceptically than in the past, including by academics. My former boss, Tony Blair, concluded in his memoirs that he had been a “naïve, foolish, irresponsible nincompoop” to have introduced FoI. Leaders like Trump, Putin, Modi and Xi are instinctively suspicious of openness. The net result is that, in many countries, the daily workings of government are less rather than more visible than a generation ago.

This is particularly true in relation to R&D, as governments become more paranoid in response to rising geopolitical tension. A good case in point is the UK’s new technology agency ARIA, which goes live this year and aims to be an equivalent to the American DARPA. To the surprise of many, the UK government has just announced that ARIA, like the Royal Family, will not be required to respond to FoI requests.

A third field where things are becoming opaque is business. Yes, there are continuing and welcome moves to open up accountability, and to make the environmental and social impacts of business more visible. Projects like Open Corporates or the Panama Papers shine a light on activities and abuses. But technology development in business, which drives the modern economy, is more opaque than in the past. In the US, the R&D investment of the top five tech firms is well over US$100 billion – more than that of all the EU governments combined.

Alongside their Chinese equivalents – Alibaba, Tencent and Huawei – these companies now dominate global tech development. And yet much of what they do is hidden. We can try to piece together what, for example, Meta is doing to shape the metaverse through published patents. But business tech is kept under wraps, without any of the visibility, mutual awareness and critique that have underpinned the modern science system.

A fourth example of creeping opacity is the networks on which we depend – from food supply chains to cybersecurity. These are harder to grasp, more brittle and more surprising than their equivalents in the past. As a small experiment, just ask a well-educated person to explain how the Internet works. Very few can. Or ask them to describe the supply chain for something like a car or computer. Or try to find out for yourself how the vital networks around you function day to day – from food supplies to money. You soon discover that these essential infrastructures are far less transparent than you would expect, built not just around proprietary data but also around organizing principles that are extremely hard to grasp. Again, these are ever-more like magic than like a wiring diagram.

So how do we cope, and how should we respond? One response is simply to reassert the traditional liberal position that transparency on its own leads to good results. Many influential figures – like Steven Pinker or Bill Gates – do this, supported by leading media outlets like The New York Times, The Economist or The Financial Times, and we should surely want to continue advocating for openness in science, government and business. But the risk is that such arguments are simply bypassed by history, rather than responding in new ways to a new situation. We can try to open up all the black boxes. But this may be a losing game.

Another response is to strengthen the role of collective intelligence in countering moves to secrecy – with more citizen science and open-source intelligence, and by beefing up institutions that can explain and interpret. This seems to me a valid position, and it justifies ensuring that organizations like the BBC or PBS are well funded, and that the communication of science and technology is seen as a vital public service – that is, not just left to the vagaries of the market.

There is also an important new response for governments and regulators. To cope with opaque technologies, you need strong public institutions to observe, interpret and watch for risks and unintended consequences, rather than just relying on laws that quickly get superseded. This is the direction of travel for the regulation of AI in China and the EU, and for UK institutions like the government’s Centre for Data Ethics and Innovation.

Finally, there is a challenge for education. I work in a university, teaching hundreds of engineering and policy students. Later this year, we will be launching an undergraduate degree fusing science, tech and public policy. Our ethos is highly rationalist, hopeful that with the right knowledge any problem – from climate change to inequality – can be solved.

Still, how should we feed these new patterns of opacity and magic into what we teach? I wish I were certain of the answer. Part of it may be a mindset switch that looks at outcomes as much as inputs, and accepts that we may often find an AI or complex system that broadly works even if we do not know why. We can try to nudge or guide it, but we do not need to pretend that we can turn it into a diagram, or that we live in a world where inputs lead to outputs and outcomes in a neatly causal way. My guess is that this is already just as true of global governance as it is of search engines.

So we need, more than ever, to bolster the forces of light against darkness, of transparency against secrecy. But we also need to be more humble and realistic about our ignorance than the champions of rationalist enlightenment imply. After all, wisdom is not just about what you know, but also about what you know you do not know.
