
How AI generates both climate pollution and solutions

AI tools help climate scientists do better work, but critics say their massive energy demands are a serious climate threat

20 March 2024

With billions of users flooding AI services, some energy and tech researchers say the energy demand from AI computing is quickly becoming a major sustainability challenge for the tech industry and the communities that host its vital infrastructure.

When the International Energy Agency issued its latest report on global electricity use in January, the agency flagged the tech industry’s growing energy demand as an area of concern and warned that power consumption by the world’s data centres could double by 2026. The report highlighted some eye-popping figures about data centres’ energy consumption.

In Ireland, the IEA reported, data centres already consume about 17% of all the country’s electricity and are on track to consume as much as a third of the country’s power by 2026. In the US, data centre energy needs – driven by AI, cryptocurrency and other demands on computing power – could rise to nearly 6% of total electricity consumption within a few years, the IEA said, and similar growth in data centre power demand is projected in many parts of the world.

“Global data centre consumption could reach 1,000 terawatt-hours in 2026,” Eren Çam, IEA energy analyst for electricity, said at a briefing on the report’s release. A terawatt-hour, roughly speaking, is enough electricity to power 200,000 average homes for a year. “That would mean that this [data] sector alone would be then consuming in a year as much electricity as is consumed in Japan currently,” Çam said.
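Çam’s comparison can be sanity-checked with back-of-envelope arithmetic. The sketch below uses only the figures quoted above (1,000 terawatt-hours, and one terawatt-hour for roughly 200,000 homes for a year); the per-home consumption it prints is simply what those two figures imply, not an independent estimate:

```python
# Back-of-envelope check of the IEA figures quoted above.
# Assumption: 1 TWh powers ~200,000 average homes for a year,
# which implies ~5,000 kWh per home per year.
KWH_PER_TWH = 1_000_000_000   # kWh in one terawatt-hour
homes_per_twh = 200_000
kwh_per_home = KWH_PER_TWH / homes_per_twh   # implied annual use per home

projected_twh_2026 = 1_000    # IEA projection for data centres
home_years = projected_twh_2026 * homes_per_twh

print(f"Implied household use: {kwh_per_home:,.0f} kWh/year")
print(f"1,000 TWh is about {home_years:,} home-years of electricity")
```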

Much of that increased energy demand is driven by the industry’s adoption of AI, which requires greater computing power from beefier microchips provided by chipmakers such as Nvidia.

“We certainly need to be thinking about this through the lens of sustainability, no question about that,” Equinix Vice President of Global Sustainability Christopher Wellise told Newsweek. Equinix is one of the world’s biggest data infrastructure providers and works with about half of the Fortune 500 companies.

Wellise said AI’s energy use is just one of many growing strains on electric grids. “It’s kind of the perfect storm of demand for electrons,” he said.

There’s a mantra among clean energy and climate activists: “Electrify everything.” Electricity from renewable energy can drive fossil fuel consumption out of the daily use of transportation, home heating and cooling and appliances. But that approach only works if the grid is truly greening, and exploding energy demand could undermine efforts to decarbonize our electricity sources.

Max Schulze directs the nonprofit Sustainable Digital Infrastructure Alliance. He fears the move to electrify everything is on a collision course with the energy demand from AI.

“The question is, how on Earth are we going to mobilise this much energy while at the same time we’re electrifying our cars, we’re electrifying heating, we’re electrifying everything,” Schulze told Newsweek. “Everybody wants energy, especially green electricity.”

AI holds the promise to help climate scientists and energy engineers figure out ways to meet the climate challenge. But it could also hinder the clean energy transition by adding to the demand for energy at a time when scientists say emissions from our energy consumption must come down fast.

Why AI is so power hungry

AI’s power surge begins with the processing units in the servers that make up AI systems.

Wellise at Equinix said that while standard CPUs use from 65 to 85 watts of power, the graphics processing units (GPUs) used in AI can consume 200 to 500 watts, and newer models need even more. And then there is the extra energy required to cool the units.
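The wattage ranges Wellise cites can be turned into rough annual energy figures. The sketch below assumes, purely for illustration, that each chip runs at the top of its quoted range around the clock, which is an upper bound rather than a measured workload, and it ignores the cooling overhead mentioned above:

```python
# Rough comparison of continuous power draw, using the wattage
# ranges quoted above (65-85 W for a CPU, 200-500 W for a GPU).
# Annual kWh figures assume the chip runs flat out 24/7 -- an
# illustrative upper bound, not a measured workload.
HOURS_PER_YEAR = 24 * 365

def annual_kwh(watts: float) -> float:
    """Energy in kWh if a component draws `watts` continuously for a year."""
    return watts * HOURS_PER_YEAR / 1000

cpu_high = annual_kwh(85)    # top of the quoted CPU range
gpu_high = annual_kwh(500)   # top of the quoted GPU range

print(f"CPU at 85 W:  {cpu_high:,.0f} kWh/year")
print(f"GPU at 500 W: {gpu_high:,.0f} kWh/year")
print(f"Ratio: {gpu_high / cpu_high:.1f}x")
```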

AI systems have two distinct phases of development and use, researchers say, each with its own energy requirements. The first is the training of the AI model, and the energy required varies greatly depending on the model’s construction.

The second phase is inference, or the model’s operation as it responds to queries. Schulze at the Sustainable Digital Infrastructure Alliance explained that the inference stage of an AI model might need even more energy than the training, simply due to the massive volume of requests.

“The billions of questions being asked to ChatGPT every second, probably at this point, is what’s really driving power consumption at scale,” he said.
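Schulze’s point can be illustrated with a toy break-even calculation: training is a one-off cost, while inference energy grows with every query, so at sufficient volume inference dominates. Every number below is a hypothetical placeholder chosen for round arithmetic, not a measured figure:

```python
# Toy illustration of why inference can overtake training energy.
# ALL numbers here are hypothetical placeholders, not measurements.
training_energy_kwh = 1_000_000   # hypothetical one-off training cost
energy_per_query_wh = 1.0         # hypothetical per-query inference cost

# Number of queries after which cumulative inference energy
# exceeds the entire training run:
breakeven_queries = training_energy_kwh * 1000 / energy_per_query_wh
print(f"Inference overtakes training after {breakeven_queries:,.0f} queries")
```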

Microsoft’s nuclear option

Microsoft is of particular importance to AI’s direction due to its partnership with OpenAI, which has developed popular AI systems such as ChatGPT and the image generator DALL-E. Last year, Microsoft announced a “multiyear, multibillion dollar investment to accelerate AI breakthroughs” with OpenAI.

Bobby Hollis is vice president of energy at Microsoft. A veteran of the intersection of tech and energy, Hollis previously led energy work for Meta and Apple and worked on renewable energy for the Nevada electric utility NV Energy.

Hollis told Newsweek that Microsoft is working to better understand the energy demands and dynamics of the still developing technology and how to make AI operate more efficiently.

“Our goal is to minimise the load and make the energy that we use do as much as possible,” Hollis said.

He is also looking beyond the grid and said Microsoft is interested in ways to provide its own power for data centres, including small modular reactors, or SMRs, for nuclear power.

“We’d love to find opportunities where SMRs could be deployed,” Hollis said. He’s also bullish about what he called the “tried and true renewables” and the long-term chances for nuclear fusion.

Hollis said that when those green power purchases are combined with investments to develop energy storage, such as large-scale batteries, data centres could become strong allies for grid managers looking for ways to add more clean energy to the fuel mix.

He said it is also likely that AI will help to solve some of its own energy problems.

For example, renewable energy can present a problem for grid managers because it is intermittent (solar energy only happens when the sun is shining) and often distributed across many rooftops and solar farms. Managing those variable power inputs and matching them to power demand is a complex, data-dense task – in other words, exactly the sort of thing AI can help with.
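The matching task described above can be reduced to its simplest form: net load is demand minus variable generation, and it is this residual that grid managers (or an AI forecaster) must cover from other sources. A minimal sketch, using made-up hourly figures purely for illustration:

```python
# Minimal sketch of the "matching" problem: subtracting variable
# solar output from demand gives the net load that must be met
# from other sources. Hourly values are made-up numbers in MW.
demand = [900, 950, 1000, 1100, 1050, 980]   # hypothetical demand, MW
solar  = [0,   150, 400,  500,  300,  50]    # hypothetical solar, MW

net_load = [d - s for d, s in zip(demand, solar)]
print("Net load (MW):", net_load)
```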

Wellise said Equinix has applied AI to create a digital twin of one of its facilities. That allowed the company to better understand the nuances of the energy required to cool servers, and it improved efficiency by about 9%.

Skeptical climate

Merve Hickok said she doesn’t doubt that AI will have some positive climate impacts. As president and senior research director of the Center for AI and Digital Policy, a nonprofit think tank, she said she’s already seen many examples of AI assisting climate science and clean energy development.

Hickok’s concern, she told Newsweek, is that those benefits could be outweighed by the climate harm from AI’s massive energy demand, and she expressed skepticism about how tech companies will square their AI ambitions with their sustainable energy targets.

“Even though they might be investing in renewable energy, they are still putting extra burden on the existing infrastructure, and until we close that gap, that also means more carbon emissions,” she said.

A report released this month by a coalition of environmental groups including Friends of the Earth and Greenpeace carried a similar message and warned that, without proper regulation, “the great promise of AI technology could result in far greater catastrophe.”

The companies Newsweek spoke with said they remain committed to meeting their ambitious climate targets even as AI adds to their energy consumption.

Hollis at Microsoft said he expects advancements in efficiency to help offset some of AI’s energy appetite, as happened when earlier advancements such as cloud computing triggered a burst in data centre use.

“We always knew it would be hard, so I don’t think this necessarily is different,” he said.

Hollis said the AI field is rapidly evolving in real time, so a lot is still unknown, something that Wellise at Equinix also stressed.

Wellise said a lot of AI’s environmental impact will depend on just how widely used AI tools become throughout society.

“That’s one of the things that remains to be seen,” he said. “Will we overuse AI, or will it become sort of highly specialised for the things that it does best?”

News Wires
