However, the energy consumption of AI systems, particularly large language models (LLMs), has raised concerns about sustainability.
These systems rely on data centers that require vast amounts of power for computing, storage, and data transmission. In Germany alone, data centers consumed approximately 16 billion kWh in 2020, about 1% of the country's total energy usage.
By 2025, this figure is projected to rise to 22 billion kWh, reflecting growing demand for AI-powered services.
To address this issue, researchers at the Technical University of Munich (TUM) have developed a new training method that significantly reduces AI energy consumption.
What drives AI energy consumption?
It is becoming increasingly clear that AI energy consumption poses significant environmental challenges.
The core of the problem is the computational power required to train and operate advanced AI models. These models process vast datasets, which leads to long-running, intensive use of power-hungry hardware such as GPUs and TPUs.
This high energy demand is further amplified by AI's reliance on data centers, which require considerable power for both computation and cooling.
Research from sources like Built In shows that the energy used to generate a single image with an AI image generator roughly equals the energy needed to fully charge a smartphone, a concrete illustration of AI's power consumption.
Furthermore, the International Energy Agency (IEA) emphasizes that interactions with AI systems like ChatGPT can consume significantly more power than standard search engine queries.
The IEA also noted that the additional electricity consumed by data centers, cryptocurrencies, and AI between 2022 and 2026 could be on par with the total electricity consumption of Sweden or Germany, underscoring the scale of AI energy consumption.
Additionally, the report projects a significant increase in data center energy consumption over the coming years, driven primarily by the proliferation of AI.
For example, McKinsey & Company expects electricity demand from US data centers to reach 606 terawatt-hours (TWh) by 2030, up from 147 TWh in 2023, roughly a fourfold increase.
To address this challenge, TUM researchers have developed an innovative training method that is 100 times faster while maintaining accuracy comparable to existing methods.
This breakthrough could significantly reduce AI energy consumption, making adoption of large-scale AI more sustainable.
Understanding neural networks
AI systems rely on artificial neural networks inspired by the human brain. These networks are made up of interconnected nodes (artificial neurons) that process incoming signals.
Each connection is weighted with a specific parameter, and a signal is passed forward when the input exceeds a threshold.
Training a neural network involves repeatedly adjusting these weighted connections to improve its predictions. This process is computationally expensive and contributes heavily to high power usage, as sketched below.
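To make this concrete, here is a minimal sketch of a single artificial neuron and the conventional iterative training loop the article describes. The toy data, learning rate, and ReLU-style threshold are illustrative assumptions, not details from the TUM work:

```python
import numpy as np

# A minimal artificial neuron: weighted inputs plus a bias, with a
# threshold-style activation that only passes the signal forward when
# the weighted input exceeds zero.
def neuron(x, weights, bias):
    signal = np.dot(weights, x) + bias
    return max(signal, 0.0)

# Classic iterative training: repeatedly nudge the weights to shrink the
# prediction error. Every pass over the data costs compute, which is
# where the energy goes.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 3))            # toy inputs
y = np.maximum(X @ np.array([0.5, -1.0, 2.0]), 0.0)  # toy targets

w = rng.normal(size=3)
b = 0.0
lr = 0.01
for epoch in range(1000):          # many iterations -> high energy cost
    for x_i, y_i in zip(X, y):
        pred = neuron(x_i, w, b)
        active = 1.0 if (w @ x_i + b) > 0.0 else 0.0  # threshold "gate"
        grad = (pred - y_i) * active                  # error signal
        w -= lr * grad * x_i       # small adjustment to each weight
        b -= lr * grad
```

The nested loop is the point: the model only improves through many thousands of small weight adjustments, each of which costs computation.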
More efficient training methods
Felix Dietrich, a professor specializing in physics-enhanced machine learning, and his research team have introduced an innovative approach to neural network training.
Instead of relying on traditional iterative optimization, the new approach selects network parameters probabilistically.
It identifies key points in the training data, places where the values change rapidly and significantly, and assigns parameters by sampling from probability distributions targeted at those locations.
By concentrating on these critical locations in the dataset, the approach dramatically reduces the number of required training iterations, leading to significant energy savings; a sketch of the idea follows.
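The article does not spell out the TUM algorithm, so the following is only a minimal sketch of a sampling-based scheme in that spirit, not the team's exact method: hidden-layer weights are constructed from pairs of data points sampled preferentially where the target changes fastest, and only the linear output layer is fitted, with a single least-squares solve instead of a gradient-descent loop. The toy data, the pair-sampling rule, and all names here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data with a region of rapid change around x = 0.
X = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)
y = np.tanh(5.0 * X).ravel()

n_hidden = 50

# 1. Draw candidate pairs of data points and weight them by how sharply
#    the target changes between them -- the "key locations" in the data.
i = rng.integers(0, len(X), size=4 * n_hidden)
j = rng.integers(0, len(X), size=4 * n_hidden)
keep = i != j
i, j = i[keep], j[keep]
change = np.abs(y[i] - y[j]) / (np.linalg.norm(X[i] - X[j], axis=1) + 1e-9)
probs = change / change.sum()
pick = rng.choice(len(i), size=n_hidden, replace=False, p=probs)
i, j = i[pick], j[pick]

# 2. Build hidden weights and biases directly from the sampled pairs, so
#    each neuron's transition sits near its two anchor points. No
#    iterative tuning of these parameters takes place.
diff = X[j] - X[i]
W = diff / (np.linalg.norm(diff, axis=1, keepdims=True) ** 2 + 1e-9)
b = -np.sum(W * X[i], axis=1)

# 3. Fit only the linear output layer with one closed-form solve.
H = np.tanh(X @ W.T + b)
coef, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = H @ coef
print("train RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

The design point is that the expensive part of conventional training, the long loop of weight adjustments, is replaced by sampling plus a single linear solve, which is where a large reduction in iterations, and hence energy, would come from.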
Real-world applications
This new training technique has great potential for a wide range of applications. Energy-efficient AI models can be used in climate modeling, financial market analysis, and other dynamic systems that require rapid data processing.
By reducing the energy footprint of AI training, this approach not only reduces operational costs, but also aligns AI development with global sustainability goals.
The future of environmentally friendly AI
The rapid expansion of AI applications requires a sustainable approach to energy consumption.
As data center energy demand continues to grow, adopting more efficient training methods becomes essential. The breakthrough by the TUM team illustrates a key step toward making AI greener without compromising performance.
As technology evolves, such innovations will play a pivotal role in shaping a more sustainable digital future.