Fyself News
Science

Why do AI chatbots use so much energy?

September 14, 2025 · 4 min read

In recent years, ChatGPT has exploded in popularity, with nearly 200 million users submitting over 1 billion prompts each day. Those prompts may seem to be fulfilled out of thin air.

But behind the scenes, artificial intelligence (AI) chatbots use enormous amounts of energy. In 2023, data centers used for AI training and processing were responsible for 4.4% of US electricity usage. Around the world, such centers account for about 1.5% of global energy consumption. These figures are expected to climb steeply by 2030 as demand for AI grows.
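The data-center share can be turned into absolute numbers with simple arithmetic. A minimal sketch, in which the ~4,000 TWh figure for annual US electricity consumption is an assumed round number for illustration, not a figure from the article:

```python
# Back-of-envelope check of the data-center share quoted above.
US_ANNUAL_TWH = 4_000      # assumed total US electricity consumption, TWh/year
DATA_CENTER_SHARE = 0.044  # 4.4% share attributed to data centers (2023)

data_center_twh = US_ANNUAL_TWH * DATA_CENTER_SHARE
print(f"US data centers: ~{data_center_twh:.0f} TWh/year")  # ~176 TWh/year
```

Under that assumption, 4.4% works out to on the order of 176 TWh per year, more electricity than many entire countries consume.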

“Just three years ago, we didn’t even have ChatGPT yet,” said Alex de Vries-Gao, a sustainability researcher on emerging technologies at Vrije Universiteit Amsterdam and founder of Digiconomist, a platform that uncovers the unintended consequences of digital trends. “And now we’re talking about a technology that is responsible for almost half the electricity consumption of data centers around the world.”


But why are AI chatbots so energy-intensive? The answer lies in their enormous scale. In particular, two parts of AI use the most energy, according to Mosharaf Chowdhury, a computer scientist at the University of Michigan: training and inference.

Related: Why does electricity make a humming noise?

Training an AI chatbot means feeding a huge dataset to a large language model (LLM) so the AI can learn, recognize patterns and make predictions. In general, AI training follows the belief that bigger is better, de Vries-Gao said.

“What happens when you’re trying to train is that these days the models are so big they don’t fit into a single GPU [graphics processing unit]; they don’t fit on a single server,” Chowdhury told Live Science.

To give a sense of scale, a 2023 study by de Vries-Gao estimated that a single NVIDIA DGX A100 server demands up to 6.5 kilowatts of power. Training an LLM typically requires multiple servers, each with an average of 8 GPUs, running for weeks or months. Altogether, this consumes a mountain of energy: OpenAI’s training of GPT-4 is estimated to have used 50 gigawatt-hours of energy, equivalent to powering San Francisco for three days.
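The arithmetic behind such estimates is straightforward. A minimal sketch in which only the 6.5 kW per DGX A100 server comes from the article; the cluster size and training duration are illustrative assumptions:

```python
# Rough reconstruction of the training-energy arithmetic above.
SERVER_POWER_KW = 6.5  # peak draw per NVIDIA DGX A100 server (from the article)
NUM_SERVERS = 1_000    # assumed cluster size (illustrative)
TRAINING_DAYS = 90     # assumed training duration (weeks to months)

# kW * hours = kWh; divide by 1e6 to convert kWh to GWh
energy_gwh = SERVER_POWER_KW * NUM_SERVERS * TRAINING_DAYS * 24 / 1e6
print(f"Estimated training energy: ~{energy_gwh:.1f} GWh")  # ~14.0 GWh
```

Even these modest assumptions land within an order of magnitude of the 50 GWh estimate for GPT-4; larger clusters or longer runs close the gap quickly.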

Inference also consumes a lot of energy. This is the stage where an AI chatbot draws conclusions from what it has learned and generates an output for a request. Although running an LLM after it has been trained takes far fewer computational resources, inference is energy-intensive because of the sheer number of requests made to AI chatbots.

As of July 2025, OpenAI said ChatGPT users were sending over 2.5 billion prompts every day, meaning multiple servers are needed to generate instant responses to these requests. That doesn’t even count other widely used chatbots, such as Google’s Gemini.
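That prompt volume makes inference a large aggregate load even if each individual request is cheap. A minimal sketch in which the per-prompt energy figure is an assumption (roughly the ~0.3 Wh sometimes cited for a single ChatGPT query), not a number from the article; only the 2.5 billion prompts per day is:

```python
# Illustrative daily inference-energy estimate.
PROMPTS_PER_DAY = 2.5e9  # from the article (OpenAI, July 2025)
WH_PER_PROMPT = 0.3      # assumed energy per prompt, watt-hours (illustrative)

# Wh / 1e6 = MWh
daily_mwh = PROMPTS_PER_DAY * WH_PER_PROMPT / 1e6
print(f"Inference load: ~{daily_mwh:.0f} MWh per day")  # ~750 MWh per day
```

Under that assumption, a single chatbot's inference alone would draw hundreds of megawatt-hours daily, which is why researchers focus on per-request efficiency.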


“So even in inference, you can’t really save energy,” Chowdhury said. “It’s not really huge data. The model is already huge, and there is a huge number of people using it.”

Researchers like Chowdhury and de Vries-Gao are now working out how to better quantify these energy demands, and how to reduce them. For example, Chowdhury maintains the ML Energy Leaderboard, which tracks the inference energy consumption of open-source models.

However, the specific energy demands of other generative AI platforms remain largely unknown. Large companies like Google, Microsoft and Meta either keep these numbers secret or provide statistics that give little insight into the actual environmental impact of their applications, de Vries-Gao said. This makes it difficult to determine how much energy AI really uses, what its energy needs will be in the coming years, and whether the world can keep up.

However, the people who use these chatbots can push for better transparency. This would not only help users make more energy-conscious choices about their own AI use, but also help drive stronger policies that hold companies accountable.

“One of the very fundamental problems with digital applications is that the impacts are never transparent,” de Vries-Gao said. “The ball is in the court of policymakers to encourage disclosure so that users can start doing something.”


© 2025 news.fyself. Designed by fyself.