OPINION: Massive AI Data Centers Everywhere—The Compute Power Race and Its Impact

Image generated with ChatGPT

This year, tech giants like Nvidia, OpenAI, Microsoft, Google, and Alibaba have rolled out multibillion-dollar plans to build the infrastructure needed for today’s and tomorrow’s AI tech. But why is this infrastructure necessary, and how are we managing its consequences?

We are entering an era defined by the construction of massive AI data centers—everywhere. This year alone, several multibillion-dollar deals have been announced to build the infrastructure that will power the advanced AI technology being developed by the world’s tech giants.

All the major players—from Nvidia to Alibaba, OpenAI to Google, and Microsoft—are moving quickly: signing new agreements, making enormous investments, and securing land in unlikely locations to begin construction as soon as possible.

In January, Digital Edge raised $1.6 billion to expand its data centers in Singapore to meet growing AI demand across Asia. In the United States, OpenAI, SoftBank, and Oracle, together with the White House, announced a $500 billion joint venture: the Stargate Project.

Only days later, France and the United Arab Emirates announced a new partnership—worth €30 billion to €50 billion—to build a 1-gigawatt AI-dedicated data center in France, while Alibaba announced a $52 billion investment in AI infrastructure.

France’s brief advantage in AI infrastructure lasted only a few months. In July, Aker and Nscale announced a partnership with OpenAI to build Europe’s largest AI data center in Norway. The project, Stargate Norway, stood out for being powered entirely by renewable energy—a crucial point, given that energy consumption is one of the most pressing concerns surrounding AI infrastructure.

The Stargate “franchise” continues to expand with bold new moves. Just weeks ago, it was reported that OpenAI plans to build a 1-gigawatt data center in India. Shortly after, the Stargate UK program was announced, aiming to accelerate AI development in the region, with Nvidia pledging up to $15 billion in chips and Microsoft committing $30 billion to supercomputer development.

And, as if these deals weren’t enough, Nvidia announced last week that it plans to invest up to $100 billion in OpenAI to build even more data centers.

Much like a game of Catan, the locations and strategies are now visible on the board. The scramble for resources to build AI empires has begun.

But… What Is an AI Data Center?

Despite the warnings of a potential AI bubble, companies either dismiss these concerns or consider the risk of investing billions worth taking. We’ve all heard of data centers before—we know that Amazon, Google, and other cloud providers rely on these vast infrastructures to keep our digital lives running. But what changes with the rise of generative AI?

AI data centers are large-scale facilities built to house the IT infrastructure needed to support advanced technologies. As IBM explains, beyond the traditional servers, networking equipment, and storage units, AI data centers require specialized systems capable of handling far more demanding workloads.

“Typical data centers contain infrastructure that would quickly be overwhelmed by AI workloads,” states IBM on its website. “AI-ready infrastructure is specially designed for the cloud, AI, and machine learning tasks.”

One of the biggest differences lies in the hardware. Instead of relying primarily on central processing units (CPUs), AI data centers are built around graphics processing units (GPUs), which excel at the massively parallel computations that training and running AI models require—the same chips that have fueled Nvidia’s soaring valuation.
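
For readers who want a concrete feel for why GPUs matter here, the sketch below times a single large matrix multiplication—the core operation behind neural networks—on a CPU and, if one is present, on a GPU. It assumes PyTorch and a CUDA-capable card, neither of which is mentioned in the article, and is a rough illustration rather than a proper benchmark.

```python
# Rough comparison of a large matrix multiply on CPU vs. GPU.
# Assumes PyTorch is installed; the GPU path runs only if CUDA is available.
import time
import torch

size = 4096
a = torch.randn(size, size)
b = torch.randn(size, size)

# CPU timing
start = time.perf_counter()
_ = torch.matmul(a, b)
cpu_seconds = time.perf_counter() - start
print(f"CPU matmul: {cpu_seconds:.3f} s")

# GPU timing (skipped if no CUDA device is present)
if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()          # make sure transfers finish before timing
    start = time.perf_counter()
    _ = torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()          # wait for the kernel to complete
    gpu_seconds = time.perf_counter() - start
    print(f"GPU matmul: {gpu_seconds:.3f} s")
```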

And the build-out is only expected to accelerate. According to McKinsey, demand for AI data centers in the United States will triple by 2030, requiring roughly $7 trillion in new investment. But are these predictions realistic? Do we truly need all the data centers now being planned?

The Side Effects

While these powerful facilities can fuel the AI magic we’ve all come to admire, they also demand enormous amounts of energy. That translates into more pollution, higher energy costs—including for local communities—and deeper concerns about the future of the environment.

AI data centers require vast amounts of electricity, complex cooling systems, and cutting-edge hardware running around the clock under optimal conditions. Renewable energy, however, does not yet have a reputation for delivering power with the consistency those demands require.

The government of the United States has already taken steps to permit more fossil fuel use to power these centers. And while projects like Stargate Norway have prioritized renewables, building truly “green” ecosystems at this scale is far from simple.

“To continuously produce just a single gigawatt, a renewable-energy plant would need around 12.5 million solar panels — enough to cover nearly 5,000 football fields,” states a recent report from the New York Times. In their latest partnership, Nvidia and OpenAI announced they expect “at least 10 gigawatts of AI data centers.”
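
The arithmetic behind that panel count is easy to sanity-check. The quick sketch below assumes a roughly 400-watt panel and a 20 percent capacity factor (figures supplied here for illustration, not numbers from the Times report).

```python
# Rough sanity check of the "12.5 million panels per continuous gigawatt" figure.
# Assumed values (not from the article): a ~400 W panel and a ~20% capacity factor,
# i.e. each panel delivers about 80 W on average around the clock.
panel_watts = 400          # nameplate rating of one panel, in watts (assumption)
capacity_factor = 0.20     # average fraction of nameplate output (assumption)

average_watts_per_panel = panel_watts * capacity_factor    # ~80 W per panel
panels_per_gigawatt = 1e9 / average_watts_per_panel

print(f"Panels for 1 GW continuous: {panels_per_gigawatt:,.0f}")   # ~12,500,000
# At the 10 GW mentioned in the Nvidia-OpenAI announcement, that would be
# on the order of 125 million panels.
```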

The Neighbors Are Already Complaining

These massive facilities also consume significant amounts of water for their cooling systems. While Sam Altman—OpenAI’s CEO—insists that a ChatGPT query consumes only “roughly one fifteenth of a teaspoon” of water, many residents living next to these data centers complain not only about rising energy bills, but also about water scarcity.

“Large data centers can consume up to 5 million gallons per day, equivalent to the water use of a town populated by 10,000 to 50,000 people,” states a report on water consumption shared by the Environmental and Energy Study Institute in June. And people are already worried.
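
That town-sized comparison lines up with typical per-person water use. The quick sketch below assumes 100 to 500 gallons per person per day (an illustrative range, not a figure from the report).

```python
# Rough check of "5 million gallons per day roughly equals a town of 10,000-50,000 people".
# Assumed per-capita use (not from the report): 100-500 gallons per person per day.
data_center_gallons_per_day = 5_000_000

for gallons_per_person in (100, 500):
    people = data_center_gallons_per_day / gallons_per_person
    print(f"At {gallons_per_person} gal/person/day: ~{people:,.0f} people")
# Prints ~50,000 people at 100 gal/day and ~10,000 at 500 gal/day,
# matching the report's 10,000-50,000 range.
```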

In Georgia, residents have already complained about groundwater conditions. A woman living near Meta’s data center reported that her water had become hazy and said she fears it could run out within a few years. Others living near similar facilities say their electricity bills have risen significantly.

A Game of Titans

The most important pieces of the game are already on the board. The largest AI data centers are currently concentrated in the United States, China, and Europe, with more regions joining in. The race to secure the energy sources that will power these new technologies is reaching a point of high tension.

The consequences of building these data centers remain to be seen, and many factors are at play: environmental and human ones, given how the facilities are already affecting people who live nearby, as well as financial ones. Some experts also question whether newer AI models even need that much energy, as the developers of DeepSeek have suggested with their more efficient approach.

The stakes are very high, and the determination to make these projects succeed is even greater. While some people benefit from the jobs the data centers are generating (AI is not yet doing the manual labor needed to build them), others worry about basic needs such as drinking water. Ambition seems to be the most powerful piece on the board.
