The Signal | May 28, 2024

Decarbonizing the AI Revolution 

The vast and increasing amount of electricity required to power the AI revolution is directly at odds with the need to decarbonize. Solving this dichotomy requires supply chain organizations, as major purchasers of both power-generation infrastructure and cloud computing services, to play their part.

Geraint John
Sustainability

It’s not so much the elephant in the room as the monster in the server farm. 

The vast and exponentially increasing amount of electricity required to power AI applications is directly at odds with humanity’s struggle to decarbonize at the pace required to keep global temperatures from rising more than 1.5°C above pre-industrial levels.

Solving this apparent dichotomy is key to the AI revolution’s sustainability and goes right to the heart of the Zero100 Community’s mission of 0% Carbon, 100% Digital. The good news is that it can be done, but it’s going to take a ton of innovation and supply chains playing their part.

Big Tech, Big Energy Growth 

Big Tech is currently investing tens of billions of dollars a year in new data centers (server farms) to run generative and predictive AI applications. These facilities, along with the rest of the world’s roughly 8,000 data centers, consume huge amounts of energy: the International Energy Agency (IEA) forecasts that global demand for data center electricity could more than double, from around 460 terawatt hours (TWh) in 2022 to over 1,000 TWh in 2026 – roughly equivalent to Japan’s annual consumption.

And in the US, home to one-third of all data centers, the sector’s share of electricity usage could hit 6% by 2026, higher than in the European Union or China.  

On top of this, demand from more energy-intensive AI tools, such as ChatGPT, Gemini, and Alexa, is set to rise tenfold over the same period.

Bar chart showing the increase in data center electricity consumption by region/country (US, EU, and China) in 2022 and 2026 (estimate).
Source: International Energy Agency

Carbon Emissions Are Set to Double 

As much as Alphabet, Amazon Web Services (AWS), Microsoft, and other providers want to run AI data centers on renewable energy, the reality is that there simply isn’t enough solar, wind, or nuclear capacity currently available in the US or Europe to supply them. In the US, around 60% of increased data center power needs will have to be met by burning fossil fuels, primarily natural gas. 

Building these new “digital factories” also comes with a hefty carbon footprint. Earlier this month, Microsoft reported a 31% jump in its Scope 3 carbon emissions between 2020 and 2023, largely due to the energy involved in manufacturing data center equipment and constructing the massive facilities to house it. 

The net result? CO2 emissions from data centers – estimated at 2.5% or more of the global total today (a bigger share than commercial aviation or shipping) – could more than double by 2030.

Bar chart showing data center emissions by sector. 
Source: Climatiq analysis

A Revolution in Sustainable Power 

Fully decarbonizing global electricity generation would require annual grid investment to also double, from about $300 billion in 2022 to $600 billion by 2030. That’s more CapEx spending than traditional energy firms can afford. 

So, how can the AI revolution be more sustainably powered? The answer lies in a mix of cross-industry innovation, behavioral changes, and supply chain pressure. In practice, this looks like: 

  • Direct Investment: Big Tech is increasingly funding clean energy itself. Microsoft recently announced a deal to build 10.5 GW of renewable capacity by 2030; earlier this year, AWS bought a nuclear-powered data center campus in Pennsylvania; and Google and OpenAI are working with startups to tap geothermal, nuclear-fusion, and solar power sources.
  • Hardware Innovation: Replacing power cables with more efficient materials, adding advanced battery storage systems, and deploying sensors that help optimize power distribution are among the improvements designed to make existing transmission networks operate smarter.
  • Software Innovation: Back in 2020, Google CEO Sundar Pichai touted AI itself as a way to reduce the amount of energy needed to cool its data centers and combat climate change. Examples include analyzing historical grid data and using digital twins to identify energy bottlenecks.  
  • Behavioral Changes: Persuading companies to cut the amount of unused “dark data” they store in the cloud is one way to reduce carbon emissions. Improving software development processes – so-called “green coding” – is another (see the short sketch after this list).
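
To make the “green coding” idea concrete, here is a minimal Python sketch of one common tactic: caching the result of an expensive computation so the same work, and the same electricity, isn’t spent twice. The function name and workload are purely illustrative.

```python
from functools import lru_cache
import math


@lru_cache(maxsize=None)
def route_emissions_score(origin: str, destination: str) -> float:
    """Stand-in for an expensive calculation that would otherwise be
    re-run every time the same origin/destination pair is requested."""
    # Simulated heavy number-crunching: repeated identical calls burn
    # CPU time, and CPU time ultimately burns electricity.
    return sum(math.sqrt(i) for i in range(1_000_000)) % 100


route_emissions_score("Shanghai", "Rotterdam")   # does the work once
route_emissions_score("Shanghai", "Rotterdam")   # answered from the cache
```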

As major purchasers of both power-generation infrastructure and cloud computing services, supply chain organizations are especially well-positioned to help accelerate digital decarbonization efforts. A good place to start is gathering data on Scope 3 emissions, using tools such as AWS’s carbon footprint calculator, to understand their environmental impact.
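
For a sense of what such an estimate involves, the sketch below applies the widely used energy × PUE × grid-carbon-intensity formula to a hypothetical cloud workload. The figures and the helper function are illustrative assumptions, not outputs of AWS’s calculator.

```python
def cloud_workload_emissions_kg(
    instance_hours: float,       # total compute hours consumed
    watts_per_instance: float,   # average power draw of one instance, in watts
    pue: float,                  # power usage effectiveness of the data center
    grid_kgco2e_per_kwh: float,  # carbon intensity of the local grid
) -> float:
    """Rough kgCO2e estimate for a cloud workload: energy x PUE x grid intensity."""
    energy_kwh = instance_hours * watts_per_instance / 1000.0  # watt-hours -> kWh
    return energy_kwh * pue * grid_kgco2e_per_kwh


# Hypothetical monthly workload: 2,000 instance-hours at ~200 W each,
# a PUE of 1.2, and a gas-heavy grid at ~0.4 kgCO2e per kWh -> ~192 kgCO2e.
print(cloud_workload_emissions_kg(2_000, 200, 1.2, 0.4))
```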

Back in the nineteenth century, as the Industrial Revolution took hold, coal became the primary source of power for newly built factories. It has taken the world decades to wean itself off this highly polluting fuel – and the job is by no means finished. Action to make the AI revolution more sustainable needs to happen much faster.