OpenAI’s chatbot platform, ChatGPT, may not be as power-hungry as once assumed. But its appetite depends largely on how ChatGPT is used, and on the AI model answering the queries, according to a new analysis.
A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.
Epoch believes that’s an overestimate.
Using OpenAI’s latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes approximately 0.3 watt-hours.
“The energy use is really not a big deal compared to using regular appliances, heating or cooling your home, or driving a car,” Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.
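To put the per-query figure in everyday terms, here is a minimal back-of-the-envelope sketch. The 0.3 Wh and 3 Wh values come from the article; the query volume and the space-heater comparison are assumptions for illustration, not part of Epoch’s methodology.

```python
# Back-of-the-envelope scale check (illustrative only).
EPOCH_ESTIMATE_WH = 0.3   # Epoch AI's per-query estimate (watt-hours)
OLD_ESTIMATE_WH = 3.0     # widely cited older estimate (watt-hours)
QUERIES_PER_DAY = 15      # assumed usage for a fairly heavy chatbot user

daily_wh = EPOCH_ESTIMATE_WH * QUERIES_PER_DAY   # 4.5 Wh/day
yearly_kwh = daily_wh * 365 / 1000               # ~1.6 kWh/year

# For scale: an assumed 1,500 W space heater running for one hour uses
# 1.5 kWh, roughly a year's worth of queries at the assumed rate.
heater_hour_kwh = 1.5

print(f"Daily query energy:  {daily_wh:.1f} Wh")
print(f"Yearly query energy: {yearly_kwh:.2f} kWh")
print(f"Old estimate implies {OLD_ESTIMATE_WH / EPOCH_ESTIMATE_WH:.0f}x more")
print(f"One hour of a 1,500 W space heater: {heater_hour_kwh} kWh")
```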
AI’s energy usage, and its environmental impact more broadly, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don’t deplete natural resources and force utilities to rely on nonrenewable energy sources.
You told TechCrunch that his analysis was spurred by what he characterized as outdated previous research. He pointed out, for example, that the author of the report that arrived at the 3-watt-hour estimate assumed OpenAI used older, less efficient chips to run its models.
“I’ve seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn’t really accurately describe the energy going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”
Granted, Epoch’s 0.3 watt-hour figure is an approximation as well; OpenAI hasn’t published the details needed to make a precise calculation.
The analysis also doesn’t account for the additional energy costs incurred by ChatGPT features like image generation and input processing. You acknowledged that “long input” ChatGPT queries (queries with long files attached, for instance) likely consume more electricity upfront than a typical question.
However, You said he does expect baseline ChatGPT power consumption to rise.
“[The] AI will get more advanced, and training this AI will probably require much more energy. This future AI may also be used far more intensively, handling many more, and more complex, tasks than how people use ChatGPT today,” You said.
While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous, power-hungry infrastructure expansion. Over the next two years, AI data centers may need close to all of California’s 2022 power capacity (68 GW), according to a RAND report. By 2030, training a frontier model could require power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
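A rough sketch shows why today’s per-query energy is small next to those projections, and why scale, not individual queries, dominates. The 0.3 Wh figure and the RAND numbers come from the article; the global query volume is an assumed round number, not a reported statistic.

```python
# How per-query energy scales with deployment (inputs assumed for illustration).
WH_PER_QUERY = 0.3                 # Epoch's per-query estimate
QUERIES_PER_DAY = 1_000_000_000    # assumed global query volume

daily_energy_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # 300 MWh/day
avg_power_mw = daily_energy_mwh / 24                           # ~12.5 MW continuous

# RAND projections cited above, for comparison:
california_2022_gw = 68   # projected AI data center need within two years
frontier_training_gw = 8  # projected 2030 frontier-model training draw

print(f"Daily serving energy:  {daily_energy_mwh:.0f} MWh")
print(f"Average serving power: {avg_power_mw:.1f} MW")
print(f"Share of the 68 GW projection: {avg_power_mw / (california_2022_gw * 1000):.4%}")
print(f"Frontier training projection:  {frontier_training_gw * 1000} MW")
```

Even at a billion queries a day, serving works out to roughly 12.5 MW of continuous power under these assumptions, a tiny fraction of the projected buildout, which is consistent with the article’s point that training and intensifying usage, rather than today’s queries, drive the demand.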
ChatGPT alone reaches an enormous, and growing, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.
OpenAI’s attention, along with the rest of the AI industry’s, has also shifted to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing to run. In contrast to models like GPT-4o, which respond to queries almost instantly, reasoning models “think” for seconds to minutes before answering, a process that consumes more computing, and thus more power.
“Reasoning models will increasingly take on tasks that older models can’t, and generate more [data] to do so, and both require more data centers,” You said.
OpenAI has begun releasing more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands from reasoning models’ “thinking” process and growing AI usage around the world.
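The energy cost of “thinking” follows directly from compute time: a model that computes longer per answer draws proportionally more energy, other things (hardware, batching) held equal. In the sketch below, the baseline is Epoch’s figure, while the multiplier is purely an assumed example, not a measured value for any OpenAI model.

```python
# Illustrative only: per-answer energy scales roughly with compute time.
BASELINE_WH = 0.3        # Epoch's GPT-4o-style per-query estimate
THINKING_MULTIPLIER = 20 # assumed: reasoning model computes ~20x longer

reasoning_wh = BASELINE_WH * THINKING_MULTIPLIER  # ~6 Wh per answer
print(f"Assumed reasoning query: ~{reasoning_wh:.0f} Wh "
      f"vs {BASELINE_WH} Wh baseline")
```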
You suggested that anyone concerned about their AI energy footprint use apps such as ChatGPT sparingly, or select models that minimize the computing required, to the extent that’s realistic.
“You could try using smaller AI models like [OpenAI’s] GPT-4o mini,” You said, “and sparingly use them in a way that requires processing or generating a ton of data.”