AI datacenters consume an enormous amount of electricity when training models with billions or trillions of parameters. According to estimates from the International Energy Agency (IEA), training a single large AI model uses more electricity than 100 households consume in a year.
The inference stage (generating responses for users) also demands substantial computing time. Although answering a single chatbot query consumes nowhere near the energy required to train the model, millions of people use these systems worldwide, with thousands asking questions simultaneously.
AI vs. Crypto
AI power consumption has yet to rival Bitcoin mining, but it appears to be getting close. However, because a large percentage of Bitcoin mining runs on green energy, as of 2024 the energy drawn by AI datacenters may already be close to Bitcoin's fossil fuel usage. Together, the two industries pose a huge challenge for the U.S. electrical grid. See
AI datacenter.