
New TUM training model reduces AI energy consumption

March 7, 2025

The energy consumption of AI systems, particularly large language models (LLMs), has raised concerns about sustainability.

These systems rely on data centers that require huge amounts of power for computing, storage and data transmission. In Germany alone, data centers consumed approximately 16 billion kWh in 2020. This accounts for about 1% of the country’s total energy usage.

By 2025, this figure is projected to rise to 22 billion kWh, reflecting an increase in demand for AI-powered services.

To address this issue, experts at the Technical University of Munich (TUM) have developed a new training method that significantly reduces AI energy consumption.

What drives AI energy consumption?

It is becoming increasingly clear that AI energy consumption poses significant environmental challenges.

The core of the problem lies in the computational power required to train and operate advanced AI models. These models must process vast datasets, leading to prolonged, intensive use of powerful hardware such as GPUs and TPUs, which consume large amounts of power.

This high energy demand is further amplified by AI operations' reliance on data centers, which require considerable power for both computation and cooling.

Research from sources such as Built In shows that the energy used to generate a single image with an AI image generator equals the energy needed to fully charge a smartphone. This gives a concrete sense of AI's power consumption.

Furthermore, the International Energy Agency (IEA) emphasizes that interactions with AI systems like ChatGPT can consume significantly more power than standard search engine queries.

The IEA also estimates that the growth in power consumption by data centers, cryptocurrencies, and AI from 2022 to 2026 could be on par with the total electricity consumption of Sweden or even Germany. This highlights the scale of AI energy consumption.

Additionally, the report projects a significant increase in data center energy consumption over the coming years, driven primarily by the proliferation of AI.

For example, McKinsey & Company expects electricity demand from US data centers to reach 606 terawatt-hours (TWh) by 2030, up from 147 TWh in 2023.

To address this challenge, TUM researchers have developed an innovative training method that is 100 times faster while achieving accuracy comparable to existing approaches.

This breakthrough could significantly reduce AI energy consumption, making adoption of large-scale AI more sustainable.

Understanding neural networks

AI systems rely on artificial neural networks inspired by the human brain. These networks are made up of interconnected nodes (artificial neurons) that process input signals.

Each connection is weighted with a specific parameter, and a signal is passed forward when the input exceeds a threshold.

Training a neural network involves repeatedly adjusting these weights to improve its predictions. This process is computationally expensive and contributes to high power usage.
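The mechanics described above can be sketched in a few lines. The toy example below is an illustration only, not TUM's code: a single artificial neuron with weighted inputs, using a smooth sigmoid in place of a hard threshold so gradients exist, trained by conventional iterative adjustment. It shows why this style of training is expensive: every one of the hundreds of epochs touches the entire dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic task: output 1 when the inputs sum past a threshold of 1.
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)

w = np.zeros(2)   # connection weights
b = 0.0           # bias (negative threshold)

def forward(X, w, b):
    # Smooth stand-in for the hard threshold described in the article
    return 1.0 / (1.0 + np.exp(-(X @ w + b)))

# Iterative training: many full passes over the data, each nudging
# the weights slightly. This repetition is the dominant energy cost.
lr = 0.5
for epoch in range(500):
    p = forward(X, w, b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

accuracy = np.mean((forward(X, w, b) > 0.5) == (y > 0.5))
```

Scaled up to billions of parameters and terabytes of data, this same loop is what drives the data-center power figures cited earlier.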

More efficient training methods

Felix Dietrich, a professor specializing in physics-enhanced machine learning, and his research team have introduced an innovative approach to neural network training.

Instead of relying on traditional iterative optimization, their approach selects parameters probabilistically.

This method identifies key points in the training data, the locations where values change rapidly and significantly, and assigns parameter values there by drawing them from probability distributions.

By targeting the main locations within the dataset, this approach dramatically reduces the number of required iterations, leading to significant energy savings.
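The description above is consistent with the "sampled networks" idea from Dietrich's group, in which hidden-layer weights are drawn from pairs of data points (favoring pairs where the target changes sharply) and only the final linear layer is fitted, in a single least-squares solve instead of many gradient iterations. The sketch below illustrates that idea on a toy regression problem; the exact pair-sampling heuristic and all names here are assumptions, not the TUM implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression target: f(x) = sin(2*pi*x) on [0, 1]
X = rng.random((400, 1))
y = np.sin(2 * np.pi * X[:, 0])

n_hidden = 200
idx = rng.integers(0, len(X), size=(n_hidden, 2))
x1, x2 = X[idx[:, 0]], X[idx[:, 1]]

# Weight each candidate pair by how fast the target changes between
# its two points, so "key locations" are sampled more often.
diff = np.linalg.norm(x2 - x1, axis=1) + 1e-12
grad = np.abs(y[idx[:, 1]] - y[idx[:, 0]]) / diff
probs = grad / grad.sum()
chosen = rng.choice(n_hidden, size=n_hidden, p=probs)
x1, x2, d = x1[chosen], x2[chosen], diff[chosen]

# Place each neuron so its activation transitions between its pair.
W = (x2 - x1) / (d**2)[:, None]           # sampled hidden weights
b = -np.sum(W * (x1 + x2) / 2, axis=1)    # sampled hidden biases

H = np.tanh(X @ W.T + b)                  # hidden activations
# One least-squares solve replaces hundreds of gradient epochs.
coef, *_ = np.linalg.lstsq(H, y, rcond=None)

mse = np.mean((H @ coef - y) ** 2)
```

Because no weight is ever revisited, the cost is one pass to sample the hidden layer plus one linear solve, which is where the claimed energy savings come from.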

Real-world applications

This new training technique has great potential for a wide range of applications. Energy-efficient AI models can be used in climate modeling, financial market analysis, and other dynamic systems that require rapid data processing.

By reducing the energy footprint of AI training, this approach not only reduces operational costs, but also aligns AI development with global sustainability goals.

The future of environmentally friendly AI

The rapid expansion of AI applications requires a sustainable approach to energy consumption.

With data center demand expected to keep growing, adopting more energy-efficient training methods is essential. The TUM team's breakthrough illustrates a key step toward making AI greener without compromising performance.

As technology evolves, such innovations will play a pivotal role in shaping a more sustainable digital future.

