Fyself News
Startups

ChatGPT may not be as power-hungry as once assumed

By user · February 11, 2025 · 4 Mins Read

OpenAI's chatbot platform, ChatGPT, may not be as power-hungry as once assumed. However, its appetite depends heavily on how ChatGPT is used and which AI model answers the queries, according to a new analysis.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires about 3 watt-hours of energy to answer a single question, roughly 10 times as much as a Google search.

Epoch believes that figure is an overestimate.

Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes approximately 0.3 watt-hours.
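To see how much difference the revised estimate makes for an individual, here is a rough back-of-the-envelope comparison. The usage rate (15 queries per day) is our own illustrative assumption, not a figure from Epoch's analysis:

```python
# Compare the widely cited 3 Wh/query figure with Epoch's 0.3 Wh/query
# estimate over a year of hypothetical usage.
# Assumption (ours, for illustration): 15 queries per day, every day.
OLD_ESTIMATE_WH = 3.0    # commonly cited watt-hours per query
EPOCH_ESTIMATE_WH = 0.3  # Epoch AI's GPT-4o-based estimate

queries_per_year = 15 * 365

old_kwh = OLD_ESTIMATE_WH * queries_per_year / 1000    # Wh -> kWh
epoch_kwh = EPOCH_ESTIMATE_WH * queries_per_year / 1000

print(f"Old estimate:   {old_kwh} kWh/year")
print(f"Epoch estimate: {epoch_kwh} kWh/year")
```

Under these assumed numbers, a year of moderate use comes to under 2 kWh at Epoch's figure, versus roughly 16 kWh under the older estimate.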

“The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car,” Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.

AI's energy usage, and its environmental impact more broadly, is the subject of heated debate as AI companies race to expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers do not drain natural resources or force utilities to rely on non-renewable energy sources.

You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. For example, he pointed out that the author of the report that arrived at the 3-watt-hour estimate assumed OpenAI used older, less efficient chips to run its models.

Epoch AI's estimate of ChatGPT energy consumption.
Image credit: Epoch AI

“I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and by some napkin math seemed to be too high.”

To be sure, Epoch's 0.3 watt-hour figure is an approximation as well; OpenAI hasn't published the details needed for a precise calculation.

The analysis also doesn't account for the additional energy costs of ChatGPT features such as image generation and input processing. You acknowledged that “long input” ChatGPT queries, such as queries with long files attached, likely consume more power upfront than a typical question.

Still, You said he expects baseline ChatGPT power consumption to rise.

“[The] AI will get more advanced, and training this AI will probably require much more energy,” You said. “This future AI may also be used far more intensely, handling many more, and more complex, tasks than people use ChatGPT for today.”

While there have been significant breakthroughs in AI efficiency in recent months, the scale of AI deployment is expected to drive a vast expansion of power-hungry infrastructure. Over the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a RAND report. By 2030, the report predicted, training a frontier model could require power equivalent to the output of eight nuclear reactors (8 GW).
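For a sense of the scale of those capacity figures, here is a quick conversion from capacity (GW) to annual energy (TWh). The assumption that a typical nuclear reactor outputs about 1 GW is ours, matching the report's eight-reactors-for-8-GW framing:

```python
# Convert the RAND report's capacity figures into annual energy terms.
CALIFORNIA_2022_GW = 68   # capacity AI data centers may need within two years
FRONTIER_TRAINING_GW = 8  # projected need for frontier training by 2030
REACTOR_GW = 1            # assumed output of a typical reactor (our assumption)

HOURS_PER_YEAR = 24 * 365

# Energy if that capacity ran continuously: GW * h = GWh; / 1000 = TWh.
annual_twh = CALIFORNIA_2022_GW * HOURS_PER_YEAR / 1000

print(f"68 GW running continuously for a year: {annual_twh:.0f} TWh")
print(f"Frontier training capacity: about {FRONTIER_TRAINING_GW // REACTOR_GW} reactors")
```

Run continuously, 68 GW of capacity would deliver nearly 600 TWh over a year, on the order of a mid-sized country's annual electricity consumption.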

ChatGPT alone reaches an enormous and growing number of people, creating correspondingly massive server demand. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.

OpenAI's attention, along with the rest of the AI industry's, has also shifted to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. Unlike models such as GPT-4o, which respond to queries almost instantly, reasoning models “think” for seconds to minutes before answering, a process that consumes more computing, and thus more power.

“Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers,” You said.

OpenAI has begun releasing more power-efficient reasoning models such as o3-mini. But at least for now, it seems unlikely that efficiency gains will offset the increased power demands of reasoning models' “thinking” process and of growing AI use around the world.

You suggested that anyone concerned about their AI energy footprint use apps such as ChatGPT sparingly, or select models that minimize the computing required.

“You can try using smaller AI models like [OpenAI's] GPT-4o mini,” You said, “and sparingly use them in ways that require processing or generating a large amount of data.”

