A datacenter dedicated to AI training and processing (inference). The training phase of deep learning consumes enormous amounts of compute and electricity, and datacenters may be built expressly for that purpose. A server rack in a traditional datacenter draws 5 to 10 kilowatts of power, whereas racks in AI datacenters generally require 60 kilowatts or more. The top companies developing AI (xAI, Microsoft, etc.) are expected to deploy server clusters with more than a hundred thousand GPUs for training large language models (LLMs). See
GPU and
Blackwell.
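
The scale of these figures can be illustrated with a back-of-envelope calculation. The sketch below estimates total power draw for a hundred-thousand-GPU cluster; the per-GPU wattage and overhead multiplier are illustrative assumptions, not vendor specifications.

```python
# Back-of-envelope power estimate for a large AI training cluster.
# All figures are illustrative assumptions, not measured values.

GPUS = 100_000        # cluster size cited for frontier LLM training
WATTS_PER_GPU = 700   # assumed draw per accelerator, in watts
OVERHEAD = 1.5        # assumed multiplier for CPUs, networking, cooling

total_megawatts = GPUS * WATTS_PER_GPU * OVERHEAD / 1_000_000
print(f"Estimated cluster draw: {total_megawatts:.0f} MW")  # roughly 105 MW
```

Even under these rough assumptions, such a cluster draws on the order of 100 megawatts, far beyond what a traditional datacenter is provisioned to deliver.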