TED Talks Daily - How AI can solve its own energy crisis | Varun Sivaram

Episode Date: November 3, 2025

The AI revolution and our aging power grid are on a historic collision course, threatening to stall innovation and raise energy costs for everyone. Physicist and AI grid futurist Varun Sivaram reveals how we might turn this looming crisis into a once-in-a-generation opportunity — unlocking massive power capacity, lowering costs and accelerating the energy future we’ve been waiting for.

Interested in learning more about upcoming TED events? Follow these links: TEDNext: ted.com/futureyou

Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:00 You're listening to TED Talks Daily, where we bring you new ideas to spark your curiosity every day. I'm your host, Elise Hu. The need for AI computing power is growing at an exponential rate, and while many are excited about the AI revolution, we're also facing a terrible truth. The world's resources are being taxed at an unprecedented rate by the soaring energy needs of AI technologies. In this talk, grid futurist Varun Sivaram shares his work developing flexible AI data centers, how they could actually help power our energy grid, and how to support the AI boom responsibly. On a blistering hot day in Phoenix, Arizona, as a million air conditioners drove up demand on the power grid, a cluster of energy-hungry artificial intelligence servers bucked the trend.
Starting point is 00:01:03 They actually helped. For three hours, these AI computers at an Oracle data center dropped their power consumption by 25% to provide perfectly timed relief during that day's peak demand. And, critically, the advanced Nvidia chips continued to meet the stringent performance requirements of their tasks: training, fine-tuning, and using AI large language models. Our team at Emerald AI orchestrated this first-of-a-kind demonstration of flexible AI computing.
Starting point is 00:01:40 And we're not alone. Google's also made impressive strides. Scaling up our technologies across the country and around the world could help solve one of the biggest challenges of our time: powering the AI revolution while also advancing a more reliable, affordable, and clean power grid. Far from undermining it, AI could actually help save the grid. To understand why, we need to reimagine the challenge of powering AI.
Starting point is 00:02:15 And so I've reinvented my own career. For 15 years as an energy executive and as America's lead clean energy diplomat, I focused on building more clean energy. But energy supply is just half the equation. And so I founded Emerald AI to focus on the other half, demand, helping AI intelligently use energy, support grids, and unlock massive, stranded power capacity that already exists. Without this capability, we face an impending crisis,
Starting point is 00:02:50 a historic collision between two multi-trillion-dollar networks. The network of AI data centers, it's rapidly growing and an aging electricity grid utterly unprepared for all this new demand. That's bad news, folks, for multiple reasons. First, America risks falling behind in AI. In Virginia, the data center capital of the world, it takes up to seven years to connect new data centers to the grid. Second, power prices are soaring for communities.
Starting point is 00:03:24 Just in 2025, as we built new grids and new power plants, data center demand drove up the average annual household power price in Columbus, Ohio, by $240. This is just a beginning. As data center surge from 4% of U.S. power demand today to 12% by 2030, it's like adding another Germany to the U.S. power grid. And third, fossil fuels are set to power the boom in AI data centers,
Starting point is 00:03:55 which require reliable power supplies today. In the United States, natural gas is powering most AI growth, and countries like India will see rising coal use increasing global carbon emissions. But it doesn't have to be this way. The biggest new user of electricity could actually be our grid's greatest ally. The key lies in something deceptively simple, flexibility. That's distinct from efficiency or using less energy overall.
Starting point is 00:04:29 Rather, if AI were just a little more flexible in when it uses energy, it could consume vast amounts of otherwise stranded power on today's grids. Think of our electric power system as a superhighway that faces peak rush hour just a few hours per month. that hottest day of the summer in Phoenix, Arizona, when air conditioning demand peaks. On those days, grid's risk being overwhelmed by these massive new data centers
Starting point is 00:04:59 that may soon consume more than a gigawatt or more energy than the state of Vermont consumes. But most of the time, power plants are running well below their full capacity and transmission lines are carrying less power than they could, just like that highway. On average, throughout the year, half of the power system's capacity goes unused.
Starting point is 00:05:21 What if during those peak rush hour periods, when the grid is truly stressed, AI data centers could dynamically reduce their power consumption and take advantage otherwise of all that spare capacity throughout the year? It would be like briefly taking 18-wheeler's off of that road to let the remaining traffic flow smoothly. Well, it turns out that if AI data centers were just modest, flexible, just less than 2% of the year, trimming demand by a quarter, just a couple hours
Starting point is 00:05:52 at a time, America could fit up to 100 gigawatts of new data centers on existing power grids across the country. That's $4 trillion of AI investment unlocked today without waiting years for new infrastructure. Now, to be sure, America will need even more energy to power. our growing economy, as data centers, factories, and other users of electricity join. But by making AI data centers flexible, we can prudently expand our grid and buy ourselves time to build clean nuclear or geothermal power plants. And what's more? With flexible AI data centers acting as giant shock absorbers on the grid, we can integrate intermittent but cheap
Starting point is 00:06:45 solar and wind power, driving down the cost of energy for AI. So, that's what I do. My team and I are building the software brain to give AI data centers this crucial flexibility. It's an AI for AI. We call it the Emerald Conductor. It works by harnessing something we call spatiotemporal flexibility.
Starting point is 00:07:12 That's a fancy term for a simple idea. Let's break it down. First, temporal flexibility. Not all AI jobs are created equal. Some workloads, like training or fine-tuning an AI model, conducting deep research or running a massive scientific simulation are what we call batchable. They're incredibly important,
Starting point is 00:07:36 but they don't have to be completed right the second. Software can intelligently pause or slow these workloads briefly, when the grid is stressed, and then speed them back up when there's plenty of power available. Then there's spatial flexibility. Think of your query to a generative AI chatbot. You can't pause the job of responding to that query,
Starting point is 00:08:01 but you can move it across the country at the speed of light. So even as we struggle to build electric power transmission, we can take advantage of virtual transmission or the network of fiber optic cables that criss-crosses the country and the planet to move AI workloads from a data center in a city where the grid's currently strained, let's say Phoenix on a hot day,
Starting point is 00:08:24 to a data center in a region where there's presently abundant power, say the wind-swept great planes. The AI workloads get done, but the grid gets a break right when it needs it most. And the user never even notices because behind the scenes, there's an AI orchestrating AI. Data centers become smart, cooperative partners to the power grid. And we know it works.
Starting point is 00:08:48 Remember that demo I told you about? It happened. In May 2025 in Phoenix, Arizona, we took a cluster of 256 GPU servers, and we ran a mix of AI workloads. Some highly flexible, others entirely inflexible, and many in between. One hot afternoon, our software received a signal that the local utility was going to reach its peak demand. And so Emerald Conductor gracefully reduced the AI computational power load by 25% for the exact three hours requested by the grid.
Starting point is 00:09:22 We proved that AI data centers can flex when the grid is tight and sprint when users need them to. But proving the technology was just the first step. The hardest part will be to convince the enormous energy and AI industries to cooperate and to change the way that they operate. For over a century, electric power utilities have assumed that their users can't simply reduce their power consumption when the grid faces peak rush hour.
Starting point is 00:09:55 Sure, in limited situations, a utility may request homes to adjust their thermostats or large industrial loads to dial down consumption. But these interventions are typically tiny and marginal. But AI data centers are fundamental. fundamentally different, with a transformative potential to be flexible. They're massive energy users compared with tiny household loads that need to be aggregated. They respond faster and more gracefully than large manufacturing facilities,
Starting point is 00:10:26 and they can move their workloads around the country at the speed of light, which no other energy user can do. That's why I'm so excited about initiatives that bring together the energy and technology industries, like EPRI's DC Flex. In upcoming demonstrations in the United States and with National Grid in the United Kingdom, Emerald will showcase how AI workloads can flex and move across regions
Starting point is 00:10:56 and will prove that software-like conductor can orchestrate a symphony of AI workloads in concert with on-site energy equipment like batteries to deliver even more flexibility to power grids. And with our partner NVIDIA, we're building a reference design for next-generation data centers or AI factories to be power flexible so that utilities that see the certification
Starting point is 00:11:23 can more swiftly connect a grid-friendly AI factory. So where does this all leave us? Well, it means rather than wait years for grid upgrades, we can build all the AI infrastructure we need right now to sharpen our competitive edge. And far from crashing the grid, flexible AI data centers can provide relief before the grid hits a breaking point,
Starting point is 00:11:48 avoiding rolling blackouts. Rather than increasing power prices, they could actually go down. As flexible AI data centers more effectively utilize the existing energy infrastructure, deferring expensive upgrades. And rather, rather than goose demand only for fossil fuels,
Starting point is 00:12:11 AI's soaring energy needs could encourage more clean energy onto the grid at home and abroad. Solar today is the cheapest, fastest-growing power source on the planet. Imagine flexible AI data centers capable of ramping their energy consumption to match daytime solar peaks, or shifting their loads so that they better integrate clean energy onto the grid. The AI revolution is here.
Starting point is 00:12:43 And I believe we can have it all. Break-neck innovation. Massive investments in AI. An abundant, affordable, reliable, and clean energy for all. An AI for flexible AI infrastructure could be a linchpin for our future energy system. Thank you. That was Varroon Sivaram at a TED countdown event in New York
Starting point is 00:13:17 in partnership with the Bezos Earth Fund in 2025. If you're curious about Ted's curation, find out more at TED.com slash curation guidelines. And that's it for today. Ted Talks Daily is part of the TED Audio Collective. This talk was fact-checked by the TED Research Team and produced and edited by our team, Martha Estefanos,
Starting point is 00:13:38 Oliver Friedman, Brian Green, Lucy Little, and Tonica, Song Marnivong. This episode was mixed by Christopher Faisi Bogan. Additional support from Emma Tobner and Daniela Balerozzo. I'm Elise Hugh. I'll be back tomorrow with a fresh idea for your feed. Thanks for listening. Thank you.
