Electricity use by US data centers could triple over the next three years. By 2028, data centers could account for up to 12% of the country's total electricity consumption, Reuters reports, citing research from Lawrence Berkeley National Laboratory.
By 2028, data centers are projected to require between 74 and 132 gigawatts of power, corresponding to between 6.7% and 12% of total US electricity consumption.
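To see how a gigawatt-scale load translates into a share of national consumption, the back-of-the-envelope sketch below converts the reported 74-132 GW range into annual terawatt-hours. The ~50% average load factor and the roughly 4,800 TWh total US consumption used here are illustrative assumptions, not figures from the report; with those assumptions the result lands close to the cited 6.7%-12% range.

```python
# Back-of-the-envelope sketch: convert the projected data-center capacity
# range (74-132 GW) into annual energy and a share of total US electricity.
# The ~50% load factor and the ~4,800 TWh/year US total are illustrative
# assumptions, not figures taken from the LBNL report.

HOURS_PER_YEAR = 8760
LOAD_FACTOR = 0.5      # assumed average utilization of installed capacity
US_TOTAL_TWH = 4800    # assumed total US electricity consumption in 2028

for capacity_gw in (74, 132):
    annual_twh = capacity_gw * HOURS_PER_YEAR * LOAD_FACTOR / 1000  # GWh -> TWh
    share = annual_twh / US_TOTAL_TWH
    print(f"{capacity_gw} GW -> ~{annual_twh:.0f} TWh/year (~{share:.1%} of US total)")
```

Under these assumptions, 74 GW works out to roughly 324 TWh per year (about 6.8% of the assumed total) and 132 GW to roughly 578 TWh (about 12%), which lines up with the percentage range cited above.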
The study accounted for various factors, including the availability of AI chips like NVIDIA's H100, which are currently in high demand among companies developing large language models.
Overall, US electricity demand reached a record high in 2024 and is expected to set a new record in 2025. Currently, data centers account for slightly more than 4% of the country's total electricity consumption.
The primary driver of the increase is data centers dedicated to artificial intelligence, which as of 2024 account for about 2% of total US electricity use. Such facilities require substantial power to run high-performance chips and the cooling systems that support them.
It was previously reported that Google and Microsoft consumed 24 TWh of electricity in 2023, surpassing the electricity consumption of more than 100 countries worldwide.