Cloud Illusions: the AI Production Chain

Introduction

We do love a gentle metaphor, don't we?

When it came time to choose vocabulary to describe AI, we stuck with an approach already firmly in place: nebulous, up there in the air, unearthly, light. It's comforting to think of all things internet-related as being soft and silent. Perhaps it feels safer that way, too - both because the concept becomes less real (and light on details) and because it creates a space for mystery, even godliness. It feels knowledgeable and benevolent.

While this metaphor is comforting, it's also wildly inaccurate and entirely misleading. The very existence of AI relies on heavy-duty, dirty, demanding, earth-bound things. It needs clean water, minerals, infrastructure, data centres, and so many people - far more people than the vocabulary would suggest. While the language of clouds and air evokes a positive response, the reality is purposefully kept obscured, lest it ruin the illusion and compel us to act.

I call this the Cloud Illusion because it reminds me of the opening verse of Joni Mitchell's "Both Sides Now". (I think she was right. We really don't know clouds at all.)

All that is old is new. Again. 

One amusing thing about the AI industry is how old-fashioned it is in some respects. Like many industrial 'revolutions' before it, this one maintains the same old structures to exert power and authority over exploited and alienated workers. In her book Empire of AI, Karen Hao compares the former colonial approach to labour exploitation with the current Big Tech approach. She highlights how similar both are in turning to the global south for labour that can remain hidden from end users. Beyond cheap labour, it's also much easier to collect data where privacy protections are weaker. Even when found to be engaged in harmful practices, a tech company can more easily buy its way out of trouble in poorer countries. The final outcomes mimic the colonial days of yesteryear, too: those who profit live a comfortable distance away in the global north, in a soft, fluffy life of privilege, while those who do the dirty work see very little in return.

In their book Ghost Work, authors Mary L. Gray and Siddharth Suri use the phrase 'ghost work' to explain how AI is not happening up in the clouds somewhere. Rather, it's labour done with and between bodies. This labour, which appears magical and cloud-like to end users, requires real people doing unending amounts of piecework for low pay. It is normally done through a digital labour platform and is referred to as 'crowdwork'. The workers labour under constant surveillance and normally have few or no workers' rights. The tasks done by real humans to keep the Cloud Illusion alive include flagging inappropriate content, proofreading and transcribing audio, chatting with users as a romantic chatbot, labelling and categorising products on Amazon to improve its supposedly AI-powered search - the list goes on. These workers usually earn less than legal minimums for traditional work, receive no health benefits and, unsurprisingly, have zero job security. Often, an additional level of obfuscation is added as Big Tech companies outsource these low-paying jobs to an intermediary company. When it comes to knowing who they work for, labourers are in the dark.

Gray also expands upon how much we get wrong about work when we reduce tasks to a set of actions that a new technology can easily replace. You can hear her talk more about this in a podcast here.

Down in the Dirt 

Another amusing aspect of the Cloud Illusion is its complete reliance on the earth to support the AI production chain. Where data worker exploitation is concerned, the discourse has shifted in the past couple of years, both in scholarly journals and in writing for a broader public. The same cannot (yet) be said for the whole expanse of the production chain, as outlined in a recently published paper by Waelen and Deranty (2026) called "Making AI Work: A Critical Theory of AI Production". In it, the authors point out a research gap vis-à-vis the range and interconnectedness of human activity required to make AI possible. Indeed, they chose the phrase 'AI production chain' to draw attention to human activity (i.e. production) and how it's connected (i.e. like a chain). That chain begins with extracting and refining the minerals needed to make all manner of hardware. What is often described in language conjuring air and light actually begins deep down in the dirt. And it is conveniently kept out of sight and out of mind, as this part of the production chain occurs in places most of us will never visit.

To illustrate which minerals are required to build the circuitry and heat sinks that protect delicate microchips and enable them to function, this infographic by the US Geological Survey is handy. The list of minerals gives you a sense of how deep in the dirt AI really is. Similarly, it conveys a sense of terra firma interconnectedness, as many of the minerals are imported. Once again, the countries and the people involved in making AI are kept in the shadows. For example, a large percentage of these minerals is imported, such as tin (75%) and tantalum (100%). These essential bits of earth are abundant in Central Africa and are classified as conflict minerals. In a country like the DRC where, according to recent reporting by the ILO, "nearly 86% of jobs are in the informal economy, whilst only 5% of the population benefits from a social protection scheme (and are) mainly workers in the formal sector", it seems unlikely this work could be considered decent.

Life's Illusions

We actively use a lot of illusions. Maybe we seek them out to keep from coming undone while navigating a life that can be, according to Hobbes, solitary, poor, nasty, brutish, and short. Invariably, we engage in illusions unknowingly, too - particularly when those illusions are maintained through serious effort to keep unpalatable realities in the dark, hidden from view because they threaten to hurt the bottom line.

Those illusions cannot be our life's illusions. We mustn't participate in their upkeep, because they are the ones that require so many workers along the AI production chain to remain in a life that is brutish. To proactively shape future work that is decent and recognised, we can start by using accurate language to describe the interconnected realities of producing AI today. And from that accurate language, we can go on to identify what needs changing and how to get there - not up in the air, but very much down here on the ground.

We have to look at life's illusions from every side and admit that AI was never really in the clouds at all.

Works Cited

Decent work and the 2030 Agenda for sustainable development. (2026, March 19). International Labour Organization. https://www.ilo.org/topics-and-sectors/decent-work-and-2030-agenda-sustainable-development

Gonzalez-Cabello, M., Siddiq, A., Corbett, C. J., & Hu, C. (2024). Fairness in crowdwork: Making the human AI supply chain more humane. Business Horizons, 68(5), 645–657. https://doi.org/10.1016/j.bushor.2024.09.003

Gray, M. L., & Suri, S. (2019). Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Harper Business.

Griffiths, M., & El-Shewy, M. (2025). How ‘conflict-free’ minerals are used in the waging of modern wars. The Conversation. https://doi.org/10.64628/ab.9uekfmagc

Hao, K. (2025). Empire of AI: Dreams and Nightmares in Sam Altman’s OpenAI. Penguin.

Waelen, R., & Deranty, J.-P. (2026). Making AI work: A critical theory of AI production. Constellations. https://doi.org/10.1111/1467-8675.70053
