A datacenter dedicated to AI training and runtime processing (inference). The top companies developing AI (OpenAI, Google, Microsoft, Meta, xAI) have deployed server clusters with tens of thousands of GPUs for training large language models (LLMs). See
GPU.
Training requires many passes through a deep learning model, and generating answers for users (inference) requires still more, all of which consumes enormous amounts of computing power and electricity. Although training and fine-tuning a model may take weeks or months, once that is accomplished, the model is executed by the inference engine for millions of users around the clock. See
inference engine and
large language model.
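A back-of-the-envelope sketch shows why inference, not training, can dominate energy use over a model's lifetime. All numbers below are hypothetical, chosen only to illustrate the train-once, infer-many pattern described above:

```python
# Hypothetical figures -- for illustration only, not measurements.
TRAINING_ENERGY_MWH = 1_000        # one-time cost of training the model
ENERGY_PER_QUERY_WH = 3            # energy for a single inference request
QUERIES_PER_DAY = 10_000_000       # millions of users, all day long

# Daily inference energy, converted from watt-hours to megawatt-hours
daily_inference_mwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1_000_000

# Days of inference needed to match the one-time training cost
breakeven_days = TRAINING_ENERGY_MWH / daily_inference_mwh

print(f"Inference uses {daily_inference_mwh:.0f} MWh per day")
print(f"Matches training energy after about {breakeven_days:.0f} days")
```

With these assumed figures, a month of serving users consumes as much energy as training did, and every month after that only adds to the inference side of the ledger.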
Fifty Times More Energy by 2030
A GPU server rack in a regular datacenter uses 5 to 10 kilowatts of power. However, AI datacenters generally require 60 kilowatts or more per rack, and there can be several thousand racks in each facility. When all phases are in operation, Elon Musk's Memphis datacenter will use 200 megawatts (see
Project Colossus). See
rack mounted.
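The rack figures above scale up to facility-level demand by simple multiplication. A sketch, using a hypothetical count of 3,000 racks:

```python
# Power per rack in kilowatts, per the figures above
CONVENTIONAL_RACK_KW = 10    # high end of a regular datacenter rack
AI_RACK_KW = 60              # typical minimum for an AI rack

RACKS = 3_000                # hypothetical facility: several thousand racks

def facility_megawatts(racks: int, kw_per_rack: float) -> float:
    """Total IT power in megawatts (1 MW = 1,000 kW)."""
    return racks * kw_per_rack / 1_000

print(facility_megawatts(RACKS, CONVENTIONAL_RACK_KW))  # 30.0 (MW)
print(facility_megawatts(RACKS, AI_RACK_KW))            # 180.0 (MW)
```

The same floor space draws six times the power once the racks are filled with GPUs, which is why AI buildouts strain local grids.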
It is estimated that by 2030, the energy required will be fifty times higher than in 2025, because hundreds of datacenters are being built around the world for AI. In the U.S., areas in Wyoming, Indiana, Iowa, Texas, Oregon and Washington State are new locations. Because of its reliable power supply, Northern Virginia has experienced enormous datacenter growth.
Cooling Takes a Big Chunk
Depending on the type and age of the equipment, cooling can take 20% to 40% of a facility's total electrical requirement. More modern facilities may require only 10% or less, but the local climate is a major factor, and cooler locations are better. See
datacenter cooling.
The Network Is a Huge Factor
In AI processing, all the GPUs work together, and the network that connects them is as important as the GPUs themselves.
Optical fibers have become the network medium of choice, and the development of light sources has become a critical factor. Gallium arsenide (GaAs) is increasingly packaged together with silicon: the GaAs handles the light generation, while the silicon handles the control logic.
Nuclear Reactors
A potential source of electricity to meet this massive requirement is nuclear energy in the form of traditional reactors as well as their small modular counterparts, a fast-growing industry (see
small modular reactor). Pundits still have faith that, in time, nuclear fusion will save the day, not only for AI, but for the entire world (see
nuclear fusion).
Using Less Energy
While massive AI datacenters and their energy usage are the talk of the town, several AI companies are attempting to lower energy requirements with more efficient hardware and software. There are startups devoted entirely to saving energy; for example, Normal Computing claims its chip will make AI image generation 1,000% more energy efficient (see
Normal Computing).
Forget Earth!
To solve the energy requirement, Google plans to build AI datacenters in orbit. See
Project Suncatcher.