Tech giants scramble to meet AI's looming energy crisis

Photo: © AFP/File

The artificial intelligence industry is scrambling to reduce its massive energy consumption through better cooling systems, more efficient computer chips, and smarter programming -- all while AI usage explodes worldwide.

AI depends entirely on data centers, which could consume three percent of the world's electricity by 2030, according to the International Energy Agency. That's double what they use today.

Experts at McKinsey, a US consulting firm, describe a race to build enough data centers to keep up with AI's rapid growth, while warning that the world is heading toward an electricity shortage.

"There are several ways of solving the problem," explained Mosharaf Chowdhury, a University of Michigan professor of computer science.

Companies can either build more energy supply -- which takes time, and which the AI giants are already scouring the globe to do -- or figure out how to consume less energy for the same computing power.

Chowdhury believes the challenge can be met with "clever" solutions at every level, from the physical hardware to the AI software itself.

For example, his lab has developed algorithms that calculate exactly how much electricity each AI chip needs, reducing energy use by 20-30 percent.
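The Michigan lab's code is not reproduced here, but the general idea of giving each chip only the power it can actually use can be sketched with NVIDIA's management library. The snippet below is a rough illustration, not Chowdhury's algorithm: it assumes the pynvml package is installed, administrator rights to change power limits, and a simple utilization threshold chosen only for the example.

```python
# Rough illustration of per-GPU power capping (not the Michigan lab's
# actual algorithm). Assumes pynvml and admin rights, since changing a
# power limit requires elevated privileges.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Hardware bounds for the power limit, in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

for _ in range(60):                                            # one-minute loop
    util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu    # percent busy
    draw_mw = pynvml.nvmlDeviceGetPowerUsage(handle)           # milliwatts

    if util < 70:
        # Chip has headroom: cap power 20% below current draw,
        # but never below the hardware minimum.
        target = max(min_mw, int(draw_mw * 0.8))
    else:
        # Fully busy: restore the full power budget.
        target = max_mw

    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target)
    time.sleep(1.0)

pynvml.nvmlShutdown()
```

Real schedulers decide more carefully whether a job is compute- or memory-bound before throttling, but the principle is the one the article describes: match each chip's power budget to the work it is actually doing.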

- 'Clever' solutions -

Twenty years ago, operating a data center -- encompassing cooling systems and other infrastructure -- required as much energy as running the servers themselves.

Today, operations use just 10 percent of what the servers consume, says Gareth Williams from consulting firm Arup.

This is largely the result of the industry's focus on energy efficiency.
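In industry terms, this overhead ratio is tracked as power usage effectiveness (PUE): total facility energy divided by the energy the IT equipment itself uses. The short calculation below simply restates the article's two figures with that metric.

```python
# Power usage effectiveness: total facility energy / IT equipment energy.
def pue(server_energy: float, overhead_energy: float) -> float:
    return (server_energy + overhead_energy) / server_energy

print(pue(1.0, 1.0))   # 2.0: twenty years ago, overhead matched the servers
print(pue(1.0, 0.1))   # 1.1: today, overhead is about 10% of server energy
```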

Many data centers now use AI-powered sensors to control temperature in specific zones rather than cooling entire buildings uniformly.

This allows them to optimize water and electricity use in real-time, according to McKinsey's Pankaj Sachdeva.
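As a rough sketch of what zone-level control means in practice, the example below adjusts each zone's cooling from its own temperature reading instead of driving the whole building from one thermostat. The sensor and actuator functions are hypothetical placeholders, not any vendor's building-management API, and the setpoints are illustrative.

```python
# Hypothetical sketch of zone-by-zone cooling control; the sensor and
# actuator functions are stand-ins, not a real facility API.
import random

ZONES = ["row-A", "row-B", "row-C", "row-D"]
TARGET_C = 27.0        # illustrative inlet-temperature target
DEADBAND_C = 1.5       # tolerance before the controller reacts

def read_zone_temperature(zone: str) -> float:
    """Placeholder sensor read; a real system would query its sensors."""
    return random.uniform(24.0, 32.0)

def set_zone_cooling(zone: str, level: float) -> None:
    """Placeholder actuator; 0.0 = idle, 1.0 = full cooling for the zone."""
    print(f"{zone}: cooling level {level:.2f}")

def control_step(current_levels: dict) -> dict:
    """One pass of a simple proportional controller, one zone at a time."""
    new_levels = {}
    for zone in ZONES:
        error = read_zone_temperature(zone) - TARGET_C
        if abs(error) <= DEADBAND_C:
            level = current_levels.get(zone, 0.5)           # hold steady
        else:
            level = min(1.0, max(0.0, 0.5 + 0.2 * error))   # proportional, clamped
        set_zone_cooling(zone, level)
        new_levels[zone] = level
    return new_levels

if __name__ == "__main__":
    levels = {}
    for _ in range(3):   # a few control iterations
        levels = control_step(levels)
```

Only the zones running hot get more cooling; the rest idle, which is where the water and electricity savings come from.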

For many, the game-changer will be liquid cooling, which replaces the roar of energy-hungry air conditioners with a coolant that circulates directly through the servers.

"All the big players are looking at it," Williams said.

This matters because modern AI chips from companies like Nvidia consume 100 times more power than servers did two decades ago.

Amazon's world-leading cloud computing business, AWS, said last week it had developed its own liquid cooling method for the Nvidia GPUs in its servers -- avoiding having to rebuild existing data centers.

"There simply wouldn't be enough liquid-cooling capacity to support our scale," Dave Brown, vice president of compute and machine learning services at AWS, said in a YouTube video.

- US vs China -

For McKinsey's Sachdeva, a reassuring factor is that each new generation of computer chips is more energy-efficient than the last.

Research by Purdue University's Yi Ding has shown that AI chips can be made to last longer without losing performance.

"But it's hard to convince semiconductor companies to make less money" by encouraging customers to keep using the same equipment longer, Ding added.

Yet while more efficient chips and lower energy use per task are likely to make AI cheaper, they won't reduce total energy consumption.

"Energy consumption will keep rising," Ding predicted, despite all efforts to limit it. "But maybe not as quickly."

In the United States, energy is now seen as key to keeping the country's competitive edge over China in AI.

In January, Chinese startup DeepSeek unveiled an AI model that performed as well as top US systems despite using less powerful chips -- and by extension, less energy.

DeepSeek's engineers achieved this by programming their GPUs more precisely and skipping an energy-intensive training step that was previously considered essential.
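DeepSeek's training code is not shown here, but one widely used way of "programming GPUs more precisely" is running much of the arithmetic at lower numerical precision, which cuts memory traffic and energy per operation. The PyTorch sketch below is a generic illustration of that technique, not DeepSeek's implementation, and uses a dummy model and random data purely for the example.

```python
# Generic mixed-precision training loop in PyTorch -- an illustration of
# lower-precision GPU arithmetic in general, not DeepSeek's code.
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 10)).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    x = torch.randn(64, 512, device=device)          # dummy batch
    y = torch.randint(0, 10, (64,), device=device)   # dummy labels

    optimizer.zero_grad(set_to_none=True)
    # Matrix multiplications run in half precision inside this block,
    # reducing memory traffic and energy per operation on modern GPUs.
    with torch.autocast(device_type=device, dtype=torch.float16,
                        enabled=(device == "cuda")):
        loss = loss_fn(model(x), y)

    scaler.scale(loss).backward()   # loss scaling avoids fp16 underflow
    scaler.step(optimizer)
    scaler.update()
```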

China is also feared to be far ahead of the US in available energy sources, including renewables and nuclear power.

K.Wolf--BP