Alan Greenshields, Director of Europe at ESS Inc., explains why long-duration energy storage (LDES) could be key in enabling the AI revolution.
2023 was a monumental year for artificial intelligence (AI). However, less remarked upon is the significant energy that will be required to enable this new technology. Deloitte recently found that over a quarter of UK adults had already used generative AI technologies.
AI technologies are demonstrating astonishing performance, but their equally astonishing energy needs are often overlooked. While the human brain is remarkably energy efficient, consuming around 20 watts – about the same as a light bulb – AI chips are extremely power hungry by comparison.
Gartner estimates that over 80% of enterprises will be using AI consistently by 2026, and AI's demand for energy could lead to a world where data processing consumes over 20% of global energy supply. For context, the most widely used AI chip today, Nvidia’s A100, draws around 400 W per instance, and as the technology advances, so do the power requirements. The top-performing Nvidia H100 AI chip consumes 700 W – equivalent to a standard microwave oven – and tens of thousands of these will be in use in the UK within the next few years.
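To put those chip-level figures in perspective, a rough back-of-envelope calculation illustrates what a large accelerator fleet implies for continuous power draw and annual energy. The per-chip wattage comes from the article; the fleet size of 50,000 is an illustrative assumption standing in for "tens of thousands", not a sourced number.

```python
# Back-of-envelope: power and annual energy for a fleet of AI accelerators.
# H100 power draw (700 W) is the article's figure; the fleet size is a
# hypothetical assumption for illustration only.

H100_WATTS = 700          # per-chip power draw (article figure)
FLEET_SIZE = 50_000       # illustrative "tens of thousands" of chips
HOURS_PER_YEAR = 8760     # hours in a non-leap year

fleet_mw = H100_WATTS * FLEET_SIZE / 1e6        # continuous draw in MW
annual_gwh = fleet_mw * HOURS_PER_YEAR / 1e3    # energy if run 24/7, in GWh

print(f"Continuous draw: {fleet_mw:.0f} MW")    # 35 MW
print(f"Annual energy: {annual_gwh:.0f} GWh")   # 307 GWh
```

Even this modest, hypothetical fleet draws tens of megawatts around the clock – before counting cooling and other data centre overheads.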
Amidst the climate crisis, the crucial debate is how to sustainably meet the energy needs of rapidly growing AI data centres with carbon-free power.
Addressing energy needs
AI’s forward march is unlikely to slow. For context on coming demand, global data centres currently consume approximately 1–2% of the world’s electricity, excluding Bitcoin. According to the International Energy Agency (IEA), data centres alone account for over 1% of global electricity usage, with a further 1.14% used in data transmission.
The data centre industry is expected to grow at a compound annual growth rate of 4.95%, and some estimates suggest that annual electricity demand for information and communication technology could reach 8,000 TWh by 2030, equating to 20.9% of projected global electricity demand.
In addition to direct energy consumption, the cooling systems required to keep servers running account for approximately 35% of a data centre’s carbon emissions. The power consumed by these cooling systems, alongside the day-to-day running of data centres, means the sector now contributes more carbon dioxide than the aviation industry.
Effects of growing data centre deployment
The increased deployment of data centres to meet AI demand has major potential implications for the climate.
Fortunately, major data centre providers are already committing to carbon neutrality and clean energy. For example, Google Cloud has committed to carbon-free operations by 2030, while Digital Realty has achieved 62% renewable energy coverage across its global data centres.
Others will need to follow suit if an AI-driven climate crisis is to be avoided. New clean energy technologies are now available that allow AI data centres to be powered by clean wind and solar energy 24/7, eliminating the potential carbon impacts of this sector while providing resilient, reliable power.
Energy storage as the stabiliser
Solar and wind are not only the cleanest but also the cheapest forms of new generation capacity. However, their inherent intermittency poses challenges for facilities such as data centres, which require 24/7 operation.
New long-duration energy storage (LDES) technologies can store up to 12 hours of electricity and dispatch it when needed, providing consistent, reliable power to data centres even when the sun isn’t shining or the wind isn’t blowing.
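A simple sizing sketch shows what a 12-hour LDES system would need to look like to carry a data centre through a renewable lull. The 12-hour duration is the article's figure; the data centre load and the round-trip efficiency are hypothetical assumptions for illustration.

```python
# Sizing sketch: storage capacity to ride through a 12-hour renewable lull.
# DURATION_H is the article's figure; the load and round-trip efficiency
# are assumed values for illustration, not sourced numbers.

LOAD_MW = 50              # hypothetical data centre load
DURATION_H = 12           # LDES discharge duration (article figure)
ROUND_TRIP_EFF = 0.75     # assumed round-trip efficiency

usable_mwh = LOAD_MW * DURATION_H            # energy delivered to the load
charge_mwh = usable_mwh / ROUND_TRIP_EFF     # energy needed to charge it

print(f"Usable storage needed: {usable_mwh} MWh")   # 600 MWh
print(f"Energy to charge it: {charge_mwh} MWh")     # 800.0 MWh
```

The gap between usable and charging energy is why storage efficiency matters: the lower the round-trip efficiency, the more generation capacity must be built to fill the same store.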
Mass deployment of LDES could ensure a consistent supply of green energy to power data centres and reduce the impact that rising electricity consumption will have on the grid.
The good news is that new technologies exist to solve the problem. However, existing regulations and bureaucracy are impeding the rapid deployment of the clean energy technologies needed to ensure that clean power meets the AI demands of tomorrow. In the UK, for example, new projects face waits of up to 15 years for a grid connection, presenting significant headwinds to new projects and technologies.
Need for new grid architectures
While we wait for new generation to come online, data centre energy demand is already having an impact on legacy energy systems, with global implications. Last year, the boss of a Norwegian arms company blamed the storage of cat videos for hampering his organisation’s production of munitions for Ukraine: a TikTok data centre located near the munitions factory was drawing vast quantities of electricity from the grid, directly impacting the factory’s output.
While grid-scale projects may take years to come online, an alternative is taking shape. The electricity grid, which historically has relied on large central generation sources and the constant balancing of supply and demand, has changed little since its inception over 100 years ago. New energy storage technologies open up options for new architectures – such as microgrids – that can balance the unpredictability of renewables with the stability needs of AI.
Microgrids powering data centres would buffer local grids from increased demand while ensuring reliable, resilient power regardless of surrounding grid conditions.
The silver lining to growing energy demand from AI data centres is its potential to drive rapid innovation and scale for new clean energy technologies. By pairing microgrids with LDES, data centres could quickly change from a climate liability into a climate opportunity, accelerating not only the advancement of computing technology but also the deployment of clean energy.