2024-08-06 06:35:00

AI NEEDS GIANT ELECTRICITY


By Aurel Stratan, editor, Rudeana SRL-D

ENERGY CENTRAL, Aug 1, 2024 - Annual electricity consumption by artificial intelligence (AI) is surging at an astonishing rate; by 2027 it could rival that of Argentina, the Netherlands, or Sweden.

The lead author, data scientist Alex de Vries at Vrije Universiteit Amsterdam in the Netherlands, predicts that within four years the AI server farms of giants like OpenAI, Google, or Meta could consume anywhere from 85 to 134 terawatt-hours (TWh) of electricity per year.

This would account for roughly 0.5% of global electricity demand and would leave a deep carbon footprint on the planet's environment.

This situation is reminiscent of the much-criticized power consumption of the crypto industry in recent years.

How consumption was measured

AI chatbots such as ChatGPT and Bard are known as voracious consumers of electricity and water. More precisely, it is the colossal data centers powering them that are to blame, and there are no signs of consumption slowing down.

Determining the exact energy consumption of AI companies like OpenAI is challenging due to their secrecy about these figures. De Vries estimated their energy usage by examining the sales of Nvidia A100 servers, which make up an estimated 95 percent of the infrastructure that supports the AI industry.

AI encompasses a variety of technologies and methods that enable machines to exhibit intelligent behavior. Within this realm, generative AI is used to create new content such as text and images. These tools utilize natural language processing and share a common process: initial training followed by inference.

The training phase of AI models, often considered the most energy-intensive, has been a focal point in sustainability research. During this phase, AI models are fed extensive datasets, adjusting their parameters to align predicted output with target output.

For instance, Hugging Face's BLOOM model consumed 433 MWh of electricity during training. Other large language models like GPT-3, Gopher, and OPT reportedly used 1,287, 1,066, and 324 MWh for training, respectively, due to their large datasets and numerous parameters.

But after training, these models enter the less-studied inference phase, in which they generate outputs based on new data. For a tool like ChatGPT, this phase involves creating live responses to user queries.

According to Google, 60% of AI-related energy consumption between 2019 and 2021 stemmed from inference, a figure that drew the Dutch researcher's attention to inference costs relative to training.

The inference phase is relatively less explored in the context of AI's environmental sustainability. However, indications suggest that its energy demand can be significantly higher than the training phase, as OpenAI required an estimated 564 MWh per day for ChatGPT's operation.
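A quick back-of-the-envelope check, using only the figures cited in this article (the reported 1,287 MWh for GPT-3's training and the estimated 564 MWh per day for ChatGPT's inference), illustrates why inference deserves this attention:

```python
# Rough comparison of one-off training energy vs. ongoing inference energy,
# using the estimates cited in this article (not measured values).
gpt3_training_mwh = 1287             # reported GPT-3 training energy
chatgpt_inference_mwh_per_day = 564  # De Vries's estimate for ChatGPT

days_to_match = gpt3_training_mwh / chatgpt_inference_mwh_per_day
print(f"Inference equals total training energy after ~{days_to_match:.1f} days")
```

On these numbers, ChatGPT's ongoing inference would consume as much energy as GPT-3's entire training run in a little over two days of operation.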

The energy consumption ratio between these phases remains an open question, requiring further examination.

Future energy footprint

The AI boom in 2023 has led to an increased demand for AI chips. NVIDIA, a chip manufacturer, reported a record AI-driven revenue of 13.5 billion dollars in the second quarter of 2023. The 141% increase in the company's data center segment underscores the growing demand for AI products, potentially leading to a significant increase in AI's energy footprint.

If generative AI like ChatGPT were integrated into every Google search, power demand could rise substantially. The study's author suggests that such a scenario would require more than half a million of NVIDIA's A100 HGX servers (4.1 million GPUs), resulting in massive electricity consumption.

"At a power demand of 6.5 kW per server, this would translate into a daily electricity consumption of 80 GWh and an annual consumption of 29.2 TWh. New Street Research independently arrived at similar estimates, suggesting that Google would need approximately 400,000 servers, which would lead to a daily consumption of 62.4 GWh and an annual consumption of 22.8 TWh. With Google currently processing up to 9 billion searches daily, these scenarios would average to an energy consumption of 6.9-8.9 Wh per request," the Dutch scientist calculated.
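The per-request averages in the quote can be reproduced directly from the daily consumption figures; a minimal sketch, assuming the 9 billion daily searches stated above:

```python
# Reproduce the per-request and annual figures quoted above from the
# daily consumption estimates for the two scenarios.
SEARCHES_PER_DAY = 9e9  # Google searches per day, as stated in the article

scenarios = {
    "De Vries": 80.0,             # GWh per day
    "New Street Research": 62.4,  # GWh per day
}

for name, daily_gwh in scenarios.items():
    wh_per_request = daily_gwh * 1e9 / SEARCHES_PER_DAY  # GWh -> Wh
    annual_twh = daily_gwh * 365 / 1000                  # GWh/day -> TWh/year
    print(f"{name}: {wh_per_request:.1f} Wh/request, {annual_twh:.1f} TWh/year")
```

Both scenarios land on the quoted figures: 8.9 Wh per request and 29.2 TWh per year for the first, 6.9 Wh per request and 22.8 TWh per year for the second.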

Obligating AI companies to report their electricity and water consumption might be a first step to address the situation. California governor Gavin Newsom, for example, signed two major climate disclosure laws in October 2023, forcing around 10,000 large companies doing business in the state – including giants like OpenAI and Alphabet – to disclose how much carbon they produce starting in 2026.

-----

This thought leadership article was originally shared with Energy Central's Load Management Community Group.

-----

