The Cost of AI Mega-Factories: Energy Guzzlers, Water Wasters, and Taxpayer Burdens

The silent strain behind artificial intelligence
Beneath the dazzling surface of artificial intelligence (the chatbot interfaces, real-time translations, algorithmic medical breakthroughs, and predictive analytics powering everything from logistics to law enforcement) lurks a seldom-discussed yet profoundly important reality. Artificial intelligence, especially at the scale now pursued by global tech giants, is not intangible, clean, or ethereal. It does not live “in the cloud,” as the marketing language suggests. It lives in enormous, energy-hungry, water-guzzling data centers, increasingly referred to as AI mega-factories because of their industrial scale and massive consumption of physical resources.
These facilities form the beating heart of AI development. They train, run, and update the large language models, neural networks, and deep learning algorithms that now permeate modern life. But to do so, they require a constant stream of electricity and vast quantities of water to keep the machines from overheating. This demand is not only unsustainable in ecological terms; it is often socially unjust. Why? Because the costs (financial, environmental, and infrastructural) are increasingly passed on to ordinary taxpayers, while the profits accrue to a handful of hyper-concentrated tech corporations that enjoy government subsidies, favorable regulation, and minimal accountability.
AI’s insatiable hunger for electricity
The training of AI models, particularly so-called large language models (LLMs) such as OpenAI’s GPT-4, Google’s Gemini, Meta’s LLaMA, or Anthropic’s Claude, requires vast computational power. These computations are performed by specialized processors, most notably GPUs (graphics processing units) and TPUs (tensor processing units), which are energy-intensive and operate in parallel across vast server arrays.
While exact figures are often guarded by corporate secrecy, studies and investigative reports have offered chilling approximations. According to The New York Times and MIT Technology Review, a single data center supporting a major AI operation can consume between 20 and 50 megawatts of electricity, which is equivalent to the power required for 25,000 to 50,000 homes. This is not a one-time cost: these servers run constantly, every second of every day, drawing energy whether the data center is in Arizona, the Netherlands, Ireland, or Singapore. With dozens, soon hundreds, of such facilities popping up globally, AI is beginning to resemble a digital industrial revolution powered by fossil fuels, strained grids, and hidden subsidies.
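The households-equivalence claim above can be sanity-checked with simple arithmetic. The sketch below is a rough illustration, not the reports' own methodology: it assumes the commonly cited average of about 10,700 kWh per year for a US household, which yields a somewhat lower equivalence than the article's figures (comparisons using smaller per-home consumption, as in many European countries, land closer to the higher numbers).

```python
# Back-of-envelope check: how many average homes does a data center's
# continuous draw correspond to? The ~10,700 kWh/year figure for an
# average US household is an assumption for illustration.
HOURS_PER_YEAR = 8760
AVG_HOUSEHOLD_KW = 10_700 / HOURS_PER_YEAR  # ~1.22 kW continuous draw

def homes_equivalent(megawatts: float) -> int:
    """Number of average households whose combined continuous draw
    matches a facility drawing `megawatts` around the clock."""
    return round(megawatts * 1000 / AVG_HOUSEHOLD_KW)

print(homes_equivalent(20))  # roughly 16,000 US homes
print(homes_equivalent(50))  # roughly 41,000 US homes
```

The key point survives either set of assumptions: a single facility in the 20 to 50 MW range draws as much power, continuously, as a mid-sized town.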
This explosion in energy demand is already triggering ripple effects across national and regional power infrastructures. In the United States, utility companies are warning of a rising risk of blackouts driven by the overwhelming electricity needs of AI data centers (A.I. Is Coming for Your Electricity, The New York Times). In Northern Virginia, for example, home to the largest concentration of data centers in the world, utilities have been forced to delay residential developments for lack of grid capacity. Citizens are quite literally being deprioritized in favor of machine learning farms. And the cost of grid upgrades (new substations, high-voltage power lines, additional power plants)? It is overwhelmingly socialized through higher taxes and increased utility rates: households and small businesses pay more so that billion-dollar tech companies can operate seamlessly and profitably.
The fresh water crisis no one is talking about
While AI’s energy consumption is beginning to attract mainstream attention, its voracious appetite for clean water remains far less discussed, despite being equally alarming. Cooling the banks of servers required for AI is a critical operational necessity. When processors run, they generate heat, lots of it. If not cooled continuously, they overheat and degrade in performance or shut down entirely. To address this, most data centers rely on evaporative cooling systems, which pull in clean, potable water, cycle it through the system, and allow it to evaporate to carry heat away.
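The physics of evaporative cooling described above also explains the scale of the water use. The sketch below is a simplified estimate under loud assumptions: it treats the entire heat load as removed by evaporation (real facilities also reject some heat to the air and recirculate part of the water), and it uses water's latent heat of vaporization as the conversion factor.

```python
# Rough physics sketch: liters of water evaporated per day to carry
# away a data center's heat load. Simplifying assumption: all heat
# leaves via evaporation (an upper-bound idealization).
LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water
SECONDS_PER_DAY = 86_400

def liters_evaporated_per_day(heat_load_mw: float) -> float:
    """Water evaporated per day (1 liter ~ 1 kg) for a given heat load in MW."""
    kg_per_second = heat_load_mw * 1e6 / LATENT_HEAT_J_PER_KG
    return kg_per_second * SECONDS_PER_DAY

print(round(liters_evaporated_per_day(30)))  # on the order of a million liters/day
```

Even with generous allowances for recirculation, a facility in the tens of megawatts can plausibly evaporate water on the order of a million liters per day, which is consistent with the reported figures that follow.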
This process, repeated endlessly, consumes astronomical volumes of water. According to a 2023 Bloomberg report, the training of GPT-4 alone reportedly consumed over 700,000 liters of clean drinking water, and that is just one model, trained once. Across the globe, there are now hundreds of models being trained repeatedly, across multiple time zones and languages, with billions more queries being processed each day. In locations where water scarcity is already a serious issue, such as Iowa (where Microsoft's data centers operate), Spain, India, and California, the diversion of municipal drinking water to AI factories is nothing short of dystopian. Communities are urged to conserve water (limit showers, reduce irrigation, stop washing cars) while corporations are allowed to siphon off millions of liters to cool machines that generate profit for shareholders thousands of kilometers away.
Government handouts, corporate secrecy
What makes this situation worse is that the companies operating these AI mega-factories are not paying the full cost of their environmental impact. In many jurisdictions, they receive generous government incentives: tax breaks, land grants, subsidies for electricity, and even priority access to clean water. Local and national governments often justify these deals by promoting “job creation,” “technological advancement,” or “digital leadership.” But the jobs created are minimal (data centers are largely automated), and the benefits are disproportionately captured by foreign-owned tech giants that route their profits through tax havens.
Take the Netherlands, for example. Microsoft’s data center in Hollands Kroon was granted access to vast tracts of clean water and electrical infrastructure at minimal cost, despite widespread public opposition and environmental concerns (Microsoft en het datacenter-debat, NOS). Similarly, in Denmark and Ireland, local farmers and residents have raised alarm over the reallocation of resources away from food production and community services toward mega data hubs operated by Amazon, Meta, or Google.
In essence, we are witnessing the quiet privatization of public resources (electricity, water, land) on behalf of powerful corporations that are neither transparent nor accountable. These corporations increasingly act as digital landlords, extracting profit from infrastructure built and maintained by public funds, while dodging scrutiny under the guise of innovation.
Solutions ignored: nuclear and seawater cooling
If we are to continue expanding artificial intelligence infrastructure (and given the political and economic momentum, that seems inevitable), then it must be done with a radically different approach. The two most critical pillars of a sustainable path forward are:
1. Immediate and large-scale investment in nuclear energy
The world needs to stop pretending that intermittent renewables alone will power the AI age. Wind and solar, while crucial, cannot deliver 24/7 baseload power at the scale required by data centers without relying on backup fossil fuels. That is an inconvenient truth that climate activists, policymakers, and corporations must confront. Nuclear energy is the only scalable, zero-carbon power source that can meet AI’s nonstop demand without further damaging the environment.
Unfortunately, public policy in many Western countries remains stuck in a Cold War-era fear of nuclear power. Plants take too long to build, red tape is overwhelming, and political will is lacking. Meanwhile, China is forging ahead, building modern nuclear reactors at a breakneck pace to ensure its AI development remains unconstrained by energy limitations. The West must catch up, or remain dependent on unstable fossil fuel imports, unreliable wind patterns, and fragile grid systems.
2. Transition to seawater or alternative non-potable water cooling systems
Equally urgent is the shift away from using drinking water to cool machines. For data centers near oceans, rivers, or large bodies of water, there is no excuse not to use seawater cooling or desalinated water. Many industrial plants, such as coastal power stations and chemical factories, already use once-through cooling systems that draw in seawater, cycle it, and release it with minimal environmental impact when properly regulated.
Governments must mandate that new AI data centers be designed with non-potable cooling systems, or risk permanent damage to freshwater ecosystems. The additional cost of building seawater intake systems or desalination units is negligible when compared to the damage of depleting local water tables and depriving communities of vital drinking supplies.
Who pays? You do.
At the end of the pipeline, quite literally, stands the taxpayer. Through higher energy bills, increased water tariffs, underfunded infrastructure, and the social costs of environmental degradation, ordinary citizens are being forced to subsidize a private technological revolution. The risks are not theoretical: blackouts, droughts, food shortages, housing delays, and rising public debt are already evident in regions that have become AI infrastructure hubs.
This is a textbook case of privatized gains and socialized losses, a pattern we have seen before with the fossil fuel industry, the financial sector, and Big Pharma. But this time, the pace of expansion is faster, the opacity greater, and the public awareness disturbingly low.
The way forward
It is time for a new political and environmental compact around AI. One that recognizes not just the power of artificial intelligence but its real-world physical footprint, and demands accountability. AI companies must be required to:
- Pay the full cost of their water and electricity usage
- Build or invest in local nuclear generation capacity if they wish to expand
- Use non-potable water for all future cooling systems
- Be transparent about energy and water consumption metrics
- Contribute to community infrastructure in meaningful ways beyond token donations
Only with firm regulation, forward-looking energy policy, and public scrutiny can we ensure that artificial intelligence serves the common good without quietly exhausting the very resources we all depend on to survive.
The alternative is a future of digital abundance built atop ecological scarcity and economic inequality—a world in which machines are cooled while humans go thirsty.