In recent years, ChatGPT has exploded in popularity, attracting nearly 200 million users who send over 1 billion prompts each day. These prompts may seem to fulfill requests out of thin air.
But behind the scenes, artificial intelligence (AI) chatbots consume enormous amounts of energy. In 2023, data centers used for AI training and processing were responsible for 4.4% of electricity usage in the United States, and such centers account for around 1.5% of energy consumption worldwide. These figures are expected to climb steeply by 2030 as demand for AI grows.
“Just three years ago, we didn’t even have ChatGPT yet,” said Alex de Vries-Gao, a sustainability researcher on emerging technologies at Vrije Universiteit Amsterdam and founder of Digiconomist, a platform that uncovers the unintended consequences of digital trends. “And now we’re talking about a technology that is responsible for almost half the electricity consumption of data centers around the world.”
But why do AI chatbots use so much energy? The answer lies in their sheer scale. In particular, Mosharaf Chowdhury, a computer scientist at the University of Michigan, said there are two parts of an AI's life cycle that consume the most energy: training and inference.
Related: Why does electricity make a humming noise?
Training an AI chatbot involves feeding a huge dataset to a large language model (LLM) so the AI can learn, recognize patterns and make predictions. In general, there is a “bigger is better” belief in AI training, De Vries-Gao said.
“So what happens when you’re trying to train is that, these days, models are getting so big that they don’t fit into a single GPU [graphics processing unit]; they don’t fit on a single server,” Chowdhury told Live Science.
To give a sense of the scale, a 2023 study by De Vries-Gao estimated that a single Nvidia DGX A100 server demands up to 6.5 kilowatts of power. Training an LLM typically requires multiple servers, each with an average of eight GPUs, running for weeks or months. Altogether, this consumes a mountain of energy: OpenAI’s GPT-4 is estimated to have used 50 gigawatt-hours of energy to train, roughly equivalent to powering San Francisco for three days.
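To see why those figures land in the gigawatt-hour range, here is a minimal back-of-the-envelope sketch in Python. The per-server power figure comes from De Vries-Gao’s 2023 estimate cited above; the cluster size and training duration are purely hypothetical assumptions, not disclosed values for any real model.

```python
# Back-of-the-envelope training energy estimate (illustrative only).
# SERVER_POWER_KW is from De Vries-Gao's 2023 estimate for an Nvidia DGX A100;
# NUM_SERVERS and TRAINING_DAYS are hypothetical assumptions.

SERVER_POWER_KW = 6.5      # one DGX A100 server, up to ~6.5 kW
NUM_SERVERS = 1_000        # hypothetical cluster size
TRAINING_DAYS = 90         # hypothetical: roughly three months of training

hours = TRAINING_DAYS * 24
energy_kwh = SERVER_POWER_KW * NUM_SERVERS * hours
energy_gwh = energy_kwh / 1_000_000

print(f"Estimated training energy: {energy_gwh:.1f} GWh")
# 6.5 kW * 1,000 servers * 2,160 h ≈ 14 GWh under these assumptions —
# the same order of magnitude as the ~50 GWh estimate cited for GPT-4.
```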
Inference, in which an AI chatbot draws conclusions from what it has learned and generates an output in response to a request, also consumes a lot of energy. Although running an LLM after it has been trained takes far fewer computational resources, inference is energy intensive because of the sheer number of requests made to AI chatbots.
As of July 2025, OpenAI said ChatGPT users send over 2.5 billion prompts every day, meaning multiple servers are needed to generate instant responses to those requests. And that doesn't even account for other widely used chatbots, including Google's Gemini.
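The arithmetic behind that concern is simple: even a tiny amount of energy per response adds up at that volume. The sketch below uses the 2.5-billion-prompt figure cited above; the per-prompt energy value is a hypothetical assumption, since companies do not disclose real per-response figures.

```python
# Illustrative aggregate inference energy estimate.
# PROMPTS_PER_DAY is OpenAI's July 2025 figure cited above;
# ENERGY_PER_PROMPT_WH is a hypothetical assumption, not a disclosed value.

PROMPTS_PER_DAY = 2.5e9        # ~2.5 billion ChatGPT prompts per day
ENERGY_PER_PROMPT_WH = 0.3     # assumed watt-hours per response (hypothetical)

daily_energy_mwh = PROMPTS_PER_DAY * ENERGY_PER_PROMPT_WH / 1e6
print(f"Assumed daily inference energy: {daily_energy_mwh:,.0f} MWh per day")
# 2.5e9 prompts * 0.3 Wh = 750 MWh per day: negligible per prompt,
# substantial in aggregate.
```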
“So, even in inference, you can’t really save energy,” Chowdhury said. “It’s not really about the data. The model is already large, but there’s also a huge number of people using it.”
Researchers like Chowdhury and De Vries-Gao are now working to better quantify these energy demands and to find ways to reduce them. For example, Chowdhury maintains the ML Energy Leaderboard, which tracks the inference energy consumption of open-source models.
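As a rough illustration of how such measurements can be taken, the sketch below reads an NVIDIA GPU's energy counter around an inference call using the NVML Python bindings. This is not the leaderboard's actual methodology, and `run_inference` is a hypothetical placeholder for whatever model call is being measured.

```python
# Minimal sketch: measure GPU energy around an inference call via NVML
# (pip install nvidia-ml-py). Illustrative only; not the ML Energy
# Leaderboard's methodology. run_inference is a hypothetical callable.
import pynvml

def measure_inference_energy(run_inference, gpu_index=0):
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    # Cumulative energy counter in millijoules since the driver was loaded
    start_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    result = run_inference()
    end_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(handle)
    pynvml.nvmlShutdown()
    joules = (end_mj - start_mj) / 1000.0
    return result, joules

# Example usage (hypothetical model object):
# output, joules = measure_inference_energy(lambda: model.generate(prompt))
```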
However, the specific energy demands of other generative AI platforms remain largely unknown. Large companies like Google, Microsoft and Meta either keep these numbers private or provide statistics that give little insight into the applications' actual environmental impact, De Vries-Gao said. This makes it difficult to determine how much energy AI really uses, what its energy needs will be in the coming years, and whether the world can keep up.
Still, the people who use these chatbots can push for better transparency. That would not only help users make more energy-aware choices about their own AI use, but also support more robust policies that hold companies accountable.
“One of the very fundamental problems with digital applications is that the impact is never transparent,” De Vries-Gao said. “The ball is in the court of policymakers to encourage disclosure so that users can start doing something.”