Study: by the end of 2025, AI will consume more energy than bitcoin mining
By the end of the year, artificial intelligence will account for nearly half of the electricity consumed by data centers worldwide, surpassing bitcoin mining in energy consumption. The forecast comes from researcher Alex de Vries-Gao of Vrije Universiteit Amsterdam, as reported by The Verge.
According to de Vries-Gao, AI currently accounts for about 20% of the electricity consumed by data centers. He concedes that the exact figure is hard to pin down: large IT companies are reluctant to disclose precise data on the power consumption of their AI models. De Vries-Gao therefore based his analysis on AI chip shipments, noting that Taiwan Semiconductor Manufacturing Company (TSMC), the largest manufacturer of AI chips, alone more than doubled its production capacity between 2023 and 2024.
De Vries-Gao estimates that last year AI hardware consumed as much electricity as the entire Netherlands. By the end of 2025, he predicts, the figure will approach the UK’s consumption, with total AI electricity demand reaching 23 GW.
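The 23 GW figure can be put in annual terms with a simple conversion from sustained power draw to energy. The sketch below does this; the national consumption totals used for context are rough public estimates, not numbers from the study.

```python
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

def gw_to_twh_per_year(gw: float) -> float:
    """Convert a continuous power draw in GW to annual energy in TWh."""
    return gw * HOURS_PER_YEAR / 1000

ai_demand_twh = gw_to_twh_per_year(23)
print(f"23 GW sustained for a year is about {ai_demand_twh:.0f} TWh")

# For context (approximate annual electricity consumption, rough estimates):
netherlands_twh = 110
uk_twh = 265
print(f"Netherlands: ~{netherlands_twh} TWh, UK: ~{uk_twh} TWh per year")
```

Since real data centers do not run at peak draw around the clock, the annualized figure is an upper bound rather than a measured total.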
The researcher draws two parallels between AI and cryptocurrencies. First, both technologies follow a “bigger is better” logic: companies keep scaling up the parameter counts of their models in pursuit of the “best” one, which inevitably raises resource requirements. This arms race has triggered a boom in the construction of data centers built specifically for AI, most visibly in the US, which hosts more such facilities than any other country. Energy companies are planning new power plants and nuclear reactors to meet the growing demand for electricity.
The second parallel with mining is the difficulty of assessing actual energy consumption and environmental impact. Many IT companies claim to have reduced their greenhouse gas emissions, but they rarely break the figures down or specify what share of emissions is attributable specifically to AI.
There is also the risk of the so-called Jevons paradox: even if AI models become more efficient, total electricity consumption may still grow, because people will simply use the technology more often.
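The mechanism behind the paradox is easy to see with a toy calculation. All the numbers below are invented for illustration: a model becomes twice as energy-efficient per query, but cheaper queries invite heavier use and demand triples, so total consumption still rises.

```python
# Hypothetical baseline (all values assumed, not from the study)
energy_per_query_wh = 3.0    # watt-hours per query
queries_per_day = 1_000_000  # daily query volume

baseline_wh = energy_per_query_wh * queries_per_day

# After an efficiency improvement, usage grows faster than efficiency
new_energy_per_query_wh = energy_per_query_wh / 2  # 2x more efficient
new_queries_per_day = queries_per_day * 3          # demand triples

new_total_wh = new_energy_per_query_wh * new_queries_per_day

growth = new_total_wh / baseline_wh
print(f"Total consumption changed by a factor of {growth}")  # 1.5
```

Halving per-query energy while tripling usage yields a net 50% increase, which is exactly the pattern the paradox describes.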