TED Talks Daily - The story you're not hearing about AI data centers | Ayşe Coskun

Episode Date: February 25, 2026

The race to build smarter AI is crashing into a physical limitation: the power grid simply can't keep up with the energy demands of data centers. Computer scientist Ayşe Coskun shows how we could turn this problem on its head, transforming AI facilities into virtual batteries that help stabilize the grid and accelerate clean energy. Learn why the technology causing this crisis might be the only thing smart enough to fix it.

Learn more about our flagship conference happening this April at attend.ted.com/podcast

Transcript
Starting point is 00:00:06 You're listening to TED Talks Daily, where we bring you new ideas and conversations to spark your curiosity every day. I'm your host, Elise Hu. AI data centers are often blamed for straining power grids, using large amounts of water, and driving up electricity demand. This may be one of the biggest issues we face when it comes to AI. But computer scientist Ayşe Coskun says that instead of competing with AI data centers over valuable natural resources, we should be asking a different question: how can we leverage AI to actually help stabilize the grid? In this talk, Ayşe shares how to turn the facilities seen as energy hogs into flexible, grid-supporting assets,
Starting point is 00:00:49 a move that could prevent blackouts, lower costs, and even accelerate the adoption of clean energy. She shares why a sustainable future for AI exists, if we're bold enough to try. Right now, the world is in an AI race. Companies, governments, universities are all racing to build bigger models, smarter systems, and behind the scenes, they are racing to build more data centers to power AI.
Starting point is 00:01:24 But there's a problem. We are running headfirst into the limits of our infrastructure. The power grid includes all the infrastructure, power plants, transmission lines and all, to generate and deliver power for our homes, our businesses, and now AI data centers. In the United States, grid operators are reporting that new AI data center projects are requesting power loads
Starting point is 00:01:50 equal to entire cities. In some regions, utilities simply can't keep up. So when you hear AI data center, what comes to mind? For many, it's one thing: energy hogs. And they are not wrong. AI is dramatically accelerating the electricity demand of data centers. Just training GPT-4 is estimated to have consumed around the annual electricity use of thousands
Starting point is 00:02:22 of US homes. In another striking example, in Ireland, nearly 20% of the nation's electricity is drawn by data centers today. And these are not just statistics. There are also community stories. In Data Center Alley in Virginia, residents recently saw electricity bills 20% higher than just a few years ago, as utilities scramble to serve massive new AI facilities.
Starting point is 00:02:57 So the energy hog label seems well deserved, but that's only half the story. Here is the new view. These facilities are not just energy-hungry brains; they can also be the muscles of the grid, flexing on demand. Unlike our homes or hospitals, AI data centers run jobs that are predictable, controllable, and often delayable. That makes them ideal to help balance supply and demand on the grid. By making AI data centers power flexible, we can connect
Starting point is 00:03:37 them much more rapidly to the grid, while at the same time making electricity more affordable and resilient. What's more, the AI boom is arriving just as the renewable boom is taking off. Wind and solar don't follow our schedules, but data centers can, which means we can align the rise of AI with the rise of clean energy, if we are bold enough to rethink their role.
Starting point is 00:04:11 All this transformation to power flexibility didn't come out of thin air. It builds on decades of research on energy-efficient computing, scheduling, optimization, and more. I have lived this journey myself. Early in my career, I asked a question that many found unrealistic. Could computer systems adapt their behavior depending on power grid needs,
Starting point is 00:04:46 but without breaking their performance promise to their users? At the time, this sounded radical, because why would we ever design a system that would slow itself down on purpose? But then came the breakthroughs. First, we discovered not all computing tasks are urgent. Some can wait for minutes or hours, and some can be slowed down without anyone really noticing it.
Starting point is 00:05:18 For example, a researcher analyzing hundreds of medical images with AI may be okay with waiting just a little longer. Or if you are fine-tuning your AI model over the course of the next few days, you may be okay with slowing it down for just a few hours. This inherent slack in computing gives us the flexibility we need to manage power. Second, we reframed the problem. Instead of asking, how do we compute as fast as possible, we asked, how do we make computer systems meet the constraints of the power grid, while at the same time still delivering
Starting point is 00:06:03 on user performance agreements. This shift led to new strategies: capping power, shifting workloads, and provisioning the data center as a flexible reserve to the grid. A key aspect here is that
Starting point is 00:06:19 we do keep the performance promise to users, so it's not arbitrary. User experience remains a key target, and better yet, it becomes more predictable. So we built prototypes on real data center servers, and they worked: systems that could follow a power target while still delivering results.
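The prototype idea described here, following a power target while keeping performance promises, can be sketched in a few lines. This is a minimal illustrative toy, not the speaker's actual system; the job names, power numbers, and the proportional-shedding policy are all invented for the example.

```python
# Toy power-capped scheduler: delayable jobs are throttled first, so the
# facility tracks a grid-imposed power target while latency-sensitive jobs
# keep their full allocation. Purely illustrative numbers and names.

def allocate_power(jobs, power_cap_kw):
    """Split a power budget across jobs, shedding from delayable jobs first.

    jobs: list of dicts with 'name', 'demand_kw', and 'delayable' (bool).
    Returns {name: allocated_kw}.
    """
    alloc = {j["name"]: j["demand_kw"] for j in jobs}
    overshoot = sum(alloc.values()) - power_cap_kw
    if overshoot <= 0:
        return alloc  # already under the cap; run everything at full demand

    # Throttle only the delayable jobs, proportionally to their demand.
    delayable = [j for j in jobs if j["delayable"]]
    flexible_kw = sum(j["demand_kw"] for j in delayable)
    shed = min(overshoot, flexible_kw)
    for j in delayable:
        alloc[j["name"]] -= shed * j["demand_kw"] / flexible_kw
    return alloc

jobs = [
    {"name": "medical-inference", "demand_kw": 40, "delayable": True},
    {"name": "model-finetune",    "demand_kw": 60, "delayable": True},
    {"name": "live-serving",      "demand_kw": 50, "delayable": False},
]
alloc = allocate_power(jobs, power_cap_kw=120)
# live-serving keeps its full 50 kW; the two delayable jobs absorb the
# 30 kW overshoot in proportion to demand (28 kW and 42 kW).
```

The point of the sketch is the asymmetry: the latency-sensitive job is never throttled, so the performance promise to those users holds, while the delayable jobs supply all of the flexibility the grid sees.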
Starting point is 00:06:43 But this journey wasn't smooth. There were paper rejections, funding rejections, colleagues telling me this would never work. Well, since I was a kid, I was told I'm a persistent person, perhaps stubborn at times. And bold ideas require persistence, because change almost always looks impossible before it looks obvious. So you take that feedback, you reframe it again and again, and you keep building, you keep proving. So what began as scribbles on a whiteboard 12 years ago is now running on real AI data centers.
Starting point is 00:07:27 Why does this matter now? Because the power grid's challenge isn't just to generate more power. It's about timing. Solar gives us a glut of electricity at noon, but demand might peak in the evening. Wind might be abundant one day and scarce the next. Nuclear takes decades and billions of dollars to build and is often hard to locate in urban areas.
Starting point is 00:07:57 Batteries are critical, but scaling them is costly, slow, and often not environmentally clean. Meanwhile, AI data centers themselves face five- to seven-year wait times just to connect to the grid in places like Virginia. In AI time, where technologies shift in a major way every six months, five to seven years is an eternity. So here's the opportunity. With the right orchestration, AI data centers can be flexible today. No waiting, no massive new power infrastructure construction. They can soak up excess solar in the afternoon, scale down at peak times,
Starting point is 00:08:43 and act as virtual batteries today. And the stakes are real. Take Texas in August 2023. During a brutal heat wave, rising electricity demand pushed the grid to its limits. Wholesale electricity prices spiked over 800% in a single afternoon. So flexible loads, if they were widely available, could have reduced the costs
Starting point is 00:09:11 and could have prevented the emergency alerts that went out to consumers. So we have two opportunities here. One, we can make current data centers flexible and help prevent blackouts and reduce electricity costs. Two, and perhaps more significant, by making future data centers power flexible, we can connect them much earlier, without waiting for major power grid upgrades. If we ignore this opportunity, we are not just wasting renewable energy or raising our electricity bills. We are also slowing AI adoption, making it delayed, more expensive,
Starting point is 00:09:55 and less accessible to society. But there's a catch. Orchestrating this flexibility is not easy. Prices change hourly. Workloads may arrive unpredictably. Grid rules change across states, across countries, so no human operator and no single fixed data center management policy can keep up. This is where AI itself comes back into the story.
Starting point is 00:10:24 The very technology driving this unforeseen demand is also probably the only thing smart enough to tame it. AI can learn patterns, anticipate grid needs, and coordinate across data centers, across utilities, even nations, in real time. Imagine a data center, or a whole network of them, as an orchestra, with hundreds of instruments all playing at once. Left on their own, they can sound like chaos. But bring in a conductor, and suddenly all that noise turns into music. The conductor, in this case, is AI. AI can direct data center operations so that the data center can precisely match power constraints, depending on what the grid needs, what power is available, and what users demand. The result is harmony: reliable electricity,
Starting point is 00:11:29 efficient computing, and a system that works beautifully together. And that's exactly what we've built. We built software that slows down, speeds up, or pauses workloads in a data center, or shifts workloads among data centers. Our conductor platform tunes performance and power in real time, all the while respecting user and cloud provider performance needs. In this way, by flexing when needed, we can connect AI data centers much faster to the grid, make better use of the available power in the grid, and enable faster AI adoption. I've been inside this story, from an idea that once seemed impossible, to prototypes, to systems now running in the field.
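The "soak up excess solar, scale down at peak" behavior the talk describes can be illustrated with a toy scheduler that shifts a delayable batch of compute into the cheapest forecast hours. This is a hedged sketch with made-up prices and capacities, not the conductor platform itself.

```python
# Illustrative "virtual battery" scheduling: place a delayable batch of
# compute into the cheapest (e.g. solar-rich) hours before a deadline.
# Prices and capacities below are invented, not real market data.

def schedule_flexible_load(prices, capacity_per_hour, energy_needed):
    """Greedily place energy_needed units into the cheapest hours.

    prices: list of $/unit for each upcoming hour.
    capacity_per_hour: max units the data center can absorb per hour.
    Returns a list of units scheduled in each hour.
    """
    plan = [0.0] * len(prices)
    remaining = energy_needed
    # Visit hours from cheapest to most expensive and fill each to capacity.
    for hour in sorted(range(len(prices)), key=lambda h: prices[h]):
        take = min(capacity_per_hour, remaining)
        plan[hour] = take
        remaining -= take
        if remaining == 0:
            break
    return plan

# The noon solar glut makes hours 2-3 cheap; the evening peak (hour 5)
# is avoided entirely.
prices = [50, 45, 10, 12, 60, 110]   # $/MWh for the next six hours
plan = schedule_flexible_load(prices, capacity_per_hour=30, energy_needed=60)
# All 60 MWh land in the two cheapest hours: [0, 0, 30, 30, 0, 0]
```

A real orchestrator would also respect job deadlines, uncertain price forecasts, and per-region grid rules, which is why the talk argues that AI, not a fixed policy like this greedy one, has to do the conducting.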
Starting point is 00:12:22 And I believe this is just the beginning. AI is already reshaping how we compute, but it could also reshape how we power the world. So the question isn't how much energy AI consumes. The real question is how much flexibility, resilience, and clean power can AI unlock? If we are bold enough to rethink AI data centers, the very machines that now seem like a
Starting point is 00:12:49 burden could be our greatest assets in building a sustainable AI future. Thanks. That was Ayşe Coskun, speaking at TED AI in San Francisco, California, in 2025. If you're curious about TED's curation, find out more at TED.com slash curation guidelines. And that's it for today. TED Talks Daily is part of the TED Audio Collective. This talk was fact-checked by the TED Research Team and produced and edited by our team: Martha Estefanos, Oliver Friedman, Brian Green, Lucy Little,
Starting point is 00:13:31 and Tansica Sunkamaneevongse. This episode was mixed by Christopher Faizi-Bogan. Additional support from Emma Taubner and Daniela Balarezo. I'm Elise Hu. I'll be back tomorrow with a fresh idea for your feed. Thanks for listening.
