The Current - AI needs a lot of power. So Google’s going nuclear

Episode Date: October 22, 2024

Google has signed a first-of-its-kind deal to fuel its AI data centres with nuclear power, from next-generation small modular reactors that are yet to be built. We look at AI’s enormous appetite for... power — and whether it could drive growth in the alternative energy sector.

Transcript
Starting point is 00:00:00 In 2017, it felt like drugs were everywhere in the news, so I started a podcast called On Drugs. We covered a lot of ground over two seasons, but there are still so many more stories to tell. I'm Jeff Turner, and I'm back with Season 3 of On Drugs. And this time, it's going to get personal. I don't know who Sober Jeff is. I don't even know if I like that guy.
Starting point is 00:00:25 On Drugs is available now wherever you get your podcasts. This is a CBC Podcast. Hello, I'm Matt Galloway, and this is The Current Podcast. Big tech companies are in a race to secure the insatiable power needs of AI without compromising their net-zero goals, and now they're investing billions in nuclear power. Microsoft was the first one out of the gate. It's looking to reopen one of the units at Three Mile Island. That is, of course, the infamous site of the worst nuclear accident in U.S. history. Then last week, Google signed a first-of-its-kind deal with a
Starting point is 00:01:02 nuclear power startup to buy electricity from seven small modular reactors that have yet to be built. Two days later, Amazon signed a similar nuclear deal. Jacopo Bongiorno is a nuclear science and engineering professor and director of MIT's Center for Advanced Nuclear Energy Systems. He joins us from Cambridge, Massachusetts. Jacopo, good morning. Good morning. So let's, good morning. Good morning. So let's start with really the basics here. Why are these tech giants turning to nuclear power? Well, let me first say that I don't have a particularly strong or informed opinion about artificial intelligence.
Starting point is 00:01:38 I think like everybody else, I'm reading that it could be useful or dangerous to the extent that we agree that it's useful. My understanding is that it's going to require a lot of electricity to run the data centers that perform the AI algorithms. And in that context, the use of nuclear reactors is almost like an ideal energy source to power these data centers. And that's because nuclear reactors produce electricity 24-7 with high reliability, no carbon emissions, which of course is desirable if we wish to perform these operations without incurring additional greenhouse gas emissions into the atmosphere and aggravate the climate change crisis. And then lastly, they utilize minimal land. And so basically they can be co-located, located near these data centers and reduce also the cost of the transmission and distribution infrastructure. So there are some features of nuclear energy in general and small nuclear reactors in particular that make them, as I said, almost an ideal energy source to power these data centers. source to power these data centers. You know, because the argument when we sort of started to get a sense of just how much power these AI data centers were going to need, everybody was writing how it's simply not realistic for them to keep themselves carbon neutral if they're going to do this. With that in mind, and looking at small modular reactors instead of the bigger, more
Starting point is 00:03:01 traditional nuclear sites, how realistic is this idea that they can get this much energy 24-7 and stay carbon-free? Oh, it's quite realistic. I mean, nuclear reactors of much larger size, to your point, what we use currently on the grid are on the order of a gigawatt, so that's 1,000 megawatts, operate 24-7. So you get that energy all the time, very, very reliable, very reliably. And the small modular reactors actually are not that small, honestly, because there are a few hundred megawatts. So they're smaller than what we're used to, but not that small. And again, it seems that these data centers that perform the AI operations, the AI algorithms, need this sort
Starting point is 00:03:44 of power level, hundreds of megawatts. And so I think, again, small modular erectus quite realistically could provide that. I wrote a piece, this is going back a number of years, but it was looking at like how quickly solar power was growing in terms of capacity. And I quoted a guy back then saying that, you know, if you were to start a new town or a city, there's no way you'd use the traditional grid. Everything would be solar power. And I look at this where they have this enormous appetite for new power and clean energy sources. Why is nuclear the better fit than, say, solar or even hydropower?
Starting point is 00:04:18 Well, my understanding is that they need the power now, right? So it's going to have to be deployed very soon within the next few years. So the three deals that I think you mentioned in the introduction, the one by Microsoft where they're going to bring back online the Three Mile Island reactor is probably the quickest
Starting point is 00:04:36 because yes, there's going to be a cost, there's going to be a time required to bring that reactor online, but you're looking at maybe a couple of years. And then you have, in that case, 800 plus megawatts available right there for your data centers. The other two deals, Google and Amazon, are going to require the deployment of whole new reactors.
Starting point is 00:04:56 So it takes a little bit longer time to license those technologies and then building them. So you're looking realistically at the early 2030s. Is that soon enough? Well, according to the tech companies, it's soon enough. Now, to your question, why this versus solar? I don't think it's honestly an either or. I mean, if you have enough solar capacity that is relatively close to the location of the data centers, you could use solar.
Starting point is 00:05:21 Of course, with solar and wind, the issue is intermittency. So you need to back it up with energy storage batteries. And that adds to the cost and adds, of course, to the land usage associated with that power generation. So again, nuclear is, in that sense, almost ideal because it's very compact, it's reliable, it doesn't have the intermittency issue. It is not cheap. So building a new nuclear plant is expensive. The good news is that once you have built it and amortized it,
Starting point is 00:05:51 then the marginal cost operation and maintenance plus fuel are actually pretty low and predictable over decades. So that's another feature that I think the data center owners will appreciate. I want to glom onto a date you mentioned there, that they want to have these up and running by the 2030s, which is really fast. And I know they're already making these in China. These ones that Google is looking at,
Starting point is 00:06:12 the small modular reactors, are slightly different. Is that timeline feasible? That seems really fast. I think it's feasible, and I got good news. You guys are in Canada. The first small modular reactor project in North America actually is taking place now at the Darlington site in Ontario by OPG, Entire Power Generation. So that's going to be the first online at the moment seems to be on schedule to become operational by 2029. So I think if, again, the investment is there and the need is there, the demand is there,
Starting point is 00:06:46 it's absolutely feasible. It might be worth just for a second here taking a step back and explaining, like, Google and Amazon are opting for that technology. You're just talking about small modular reactors. And when I think about that, I have an idea in my mind. But the more I read about it, it feels like it can mean a bunch of things. So can you just walk me through what we mean when we talk about small modular reactors? Yes, I can. Do you have an hour and a half? No, but you know, a couple of minutes. Yes, absolutely. I'm joking. I'm going to give you
Starting point is 00:07:15 the 30 second version of this. You're absolutely correct. Small modular reactors is sort of a catch-all phrase, which is very vague. It simply means that the power output, instead of being 1,000 plus megawatts, is of the order of 100 megawatts. Now, within that smaller range, there is a very, very big number of designs that are being developed by different companies, et cetera. The one that I just mentioned that is being deployed in Ontario actually is a fairly traditional technology in terms of the nuclear fuel that it uses, the structural alloys that it uses, the coolant, which is water, etc., etc. So it's a relatively low risk from a technology point of view. The two that have been chosen by Google and Amazon respectively are a little bit more advanced, a little bit more innovative.
Starting point is 00:08:03 With that comes the promise for perhaps higher efficiency, perhaps even lower cost, although the jury is still very much out on that. But certainly what comes with it is a little bit higher technology risk. So, you know, there are pros and cons. As usual, you decide to take a, you know, a longer step towards innovation and the potential for benefit there is higher, but the risk is also higher. So that's something that the companies got to sort of wait when they make these decisions. So those, the SMRs, the small modular reactors, that's where Amazon and Google are going. Microsoft has this plan to restart Three Mile Island. It is hard to think about that without
Starting point is 00:08:41 thinking about nuclear disasters. What safety concerns should we have about turning it back on? About Three Mile Island, the resurrection now? Virtually none, I would say. So let me give you, again, 30-second history here. So Three Mile Island nuclear power station was designed to have two reactors back in the 70s. One of those two reactors suffered a big accident, as you mentioned in the introduction, that was back in 1979, I believe. And that reactor has never operated again. The one that did not have the accident operated very successfully for 40 years, and then it was
Starting point is 00:09:17 shut down in 2019, mostly for economic reasons. You know, the market in that region of the United States is such that that particular plant was not economically competitive. know, the market in that region of the United States is such that that particular plant was not economically competitive. Now, the company that owns the plant has made a small investment in the past five years to maintain that plant in a good state. You know, the components are not rusty and things of that type. And so now they have to go through, of course, a relicensing effort by the Nuclear Regulatory Commission. And if that process is successful,
Starting point is 00:09:49 if the NRC determines that it's safe to restart it, then they'll be able to restart. But it's a technology that we are very, very familiar with. Like I said, it operated for 40 years safely and reliably before it was shut down for economic reasons. So it's going to take a little bit of time. As I said earlier, a few years, it's going to take a couple billion dollars. So nothing is cheap when you do these things.
Starting point is 00:10:09 But when it's all said and done, it'll come back and it'll be 800, 900 megawatt of clean electricity. So good news for Microsoft. As you say, though, it's not cheap. And it used to be that these huge tech advances that we'd made as a society were often made as a result of investments, usually through the military and through governments. What should we make of this investment coming from tech companies and what changes that may bring for more general consumers down the road? Well, I'm not going to be able to comment on sort of the impact or the implications on the general consumer, but I can tell you two things. First, clearly the tech companies are willing to pay a premium, let's call it above market price, for electricity that is carbon-free and reliable. And that's, I suppose, good news, right? It could go with just generic electricity from the grid, which at the moment is not particularly clean because it comes in part from coal and natural gas, as well as nuclear and solar and wind.
Starting point is 00:11:13 But you get what you get on the grid. Instead, they decided to make a bigger investment and buy directly electricity that is carbon free from nuclear reactors. I think all of that is great. It's great for the environment. And it shows that they're committed to reducing their carbon footprint. It's also honestly fantastic for the nuclear technology development companies because these first-of-a-kind reactors are not going to be cheap, and they need a customer that is willing to pay that premium.
Starting point is 00:11:43 And then as they build more and they learn about their technology, costs will come down and then they can start to penetrate markets where the expected prices are a little bit lower. So I think it's a win-win situation. All good news. Well, listen, really appreciate your insight on this. Thanks so much. My pleasure. Jacopo Bongiorno is a nuclear science and engineering professor. He's the director of MIT's Center for Advanced Nuclear Energy Systems.
Starting point is 00:12:08 In 2017, it felt like drugs were everywhere in the news. So I started a podcast called On Drugs. We covered a lot of ground over two seasons, but there are still so many more stories to tell. I'm Jeff Turner, and I'm back with season three of On Drugs. And this time, it's going to get personal. I don't know who Sober Jeff is. I don't even know if I like that guy. On Drugs is available now wherever you get your podcasts. The growing energy needs of AI data centers, they do present an opportunity to some. Alberta
Starting point is 00:12:44 wants to be a data center hub. It's trying to lure tech companies to build in that province. Jackie Forrest is the executive director of the Arc Energy Research Institute in Calgary. Actually, these AI data centers could advance this nuclear technology faster than it would have otherwise. And I think that can be a real positive in terms of moving to an emissions-free power grid. But on the other side, you know, we have a fairly small power market. That is a bit of a concern because of the power demand that will come with it. And although there'll be some jobs, you know, where their power bills go up, and will it cost Alberta more than the benefit? Now, some communities
Starting point is 00:13:19 in the United States have decided it's not worth it. Wendy Riegel has successfully fought against these kinds of data centers in Indiana. There had been a data center proposed right across the street, and it was going to be like eight buildings, 70 feet tall. They already have nine generators each. We don't want hyperscale data centers built in our residential neighborhoods because they're heavy industrial. And they demand all this power, our company would go broke and so would we. It's not worth it. What that would do to our community, I mean, it would just be horribly expensive for everybody. So we're not interested. Sasha Luccioni is watching this debate play out. She's an artificial intelligence researcher and climate-leaded hugging face, a company which
Starting point is 00:14:01 builds AI tools. And she joins us from Montreal. Sasha, good morning. Good morning. Can you just give us a sense, because I see these numbers and I frankly can't wrap my head around it, how much energy AI data centers actually consume around the globe? It's really hard to get that number because data centers tend to run everything in parallel, everything from email to Netflix streaming to AI. And we haven't managed to get exact numbers because essentially cloud providers, cloud compute providers are like,
Starting point is 00:14:28 well, we don't know what people are running on our servers. You know, it could be your Bridgerton addiction or it could be AI models. But we do know is that as we're switching out more and more tools and consumer-facing products with AI, we know that that's contributing to the growth. So we do have that relative portion. We just don't have the absolute numbers.
Starting point is 00:14:50 But we're talking vast sums of energy. So I read somewhere that it was as much energy as used by the entire country of the Netherlands. Yeah, there have been some estimates. Essentially, it's hard to say exactly. As a scientist, I tend to focus on the models themselves because the high-level numbers are hard to get. But we do know also that the computing chips,
Starting point is 00:15:12 like the GPUs that are specifically used for AI models, are more power-hungry than those used for normal streaming and web services. So that's also part of the problem, is that even if you have one chip for AI and one chip for Netflix, the AI chip will be consuming more energy overall. We just heard from Jacopo Bongiorno explaining why nuclear makes more sense than other kinds of power generation. What do you think of these tech giants turning to nuclear power? Well, actually, it's interesting because in the last couple of months, both Google and Microsoft have both publicly announced that they missed their net zero goals for the year. And actually, their emissions are rising and their energy use is rising, which is a first because companies set these goals themselves. So they tend to set goals that they can meet.
Starting point is 00:15:59 And so this year, they're like, oh, oops, like AI caught us by surprise. And so since I think May-ish, they've been scrambling to find new ways to meet those goals. I mean, the new goals that they've set. And I think that nuclear is the solution they went with. But as Jacopo said, it's going to take a couple of years for sure. I think even for Three Mile Island, I think the earliest was 2027. So there is going to be a couple of years when the demand is going to be outstripping any kind of renewable energy supply. But what's interesting is that actually tech companies for the last five years have been
Starting point is 00:16:30 the biggest purchasers of renewable energy credits, which is a way of offsetting energy. So say you use your energy in, I don't know, Singapore, where it's 100% coal-based energy, and then you can offset it with renewable energy credits that you buy in a grid that's not where you use your energy, but it's like a way of planting a tree, if you will. And so tech companies have been using these kinds of offsetting approaches for several years. And so maybe that they'll continue expanding that and essentially doing this creative accounting situation when it comes to emissions. You know, there was a time not all that long ago where we almost universally felt like tech companies were going to be the savior of humankind
Starting point is 00:17:08 and they were going to rethink how we did everything and it would take us to this much better place. And we've begun to rethink that in a lot of ways of what benefits and what harms tech companies and their products have brought and done to society. When we think about something as potentially volatile as nuclear energy being in the hands of tech companies, what concerns should we have?
Starting point is 00:17:29 Well, I think that tech companies have gotten away with the law just because it's a new area. And, you know, the approach tends to be a move fast and break things in startups and in Silicon Valley. And so what worries me is that approach transposed to nuclear energy, because nuclear energy is something that has to involve a lot of care, that you can't just like YOLO your way through a nuclear reactor. No, but honestly, like I do speak to a lot of people who are in the tech industry and even in AI, the approach is like, well, let's see what sticks, right? Well, like we know that Google does A-B testing to see how many ads people click if ads people click if they put them in this place or this place, right? There's a lot of these approaches that are very empirical, as it may. And this is a new, even machine learning, AI is a new field.
Starting point is 00:18:14 And so a lot of what we do as AI researchers is empirical. There's no literature. You can't go back 50 years. There hasn't been, the algorithms haven't existed for that long. And so what worries me is this approach and this, like, philosophy of, like, see what sticks and then transpose to a domain that has had disasters and has very rigorous safety protocols to deal with those risks. AI is still so new to all of us. We don't know what it's going to bring. We do know that it's going to just consume a banana's amount of energy and power from grids that are already tested. I wonder if you've thought much about how else could AI keep growing, but also keep reducing emissions if not using nuclear power? What I think that we should be focused on is really kind of the root cause, because nuclear energy is definitely a way to respond to the demand. But I think that we should be curbing the demand. And currently, we're in this AI boom where companies are trying to put generative AI, so really the AI that can
Starting point is 00:19:14 generate text or images, they're trying to stick that into anything that moves, anything that a customer can interact with. And I think we should rethink that. And we should also start mandating transparency. So for example, one question that profoundly irks me is how much energy does ChatGPT use or how much energy does a generative Google search use now that we have these little summaries that pop up? And at the end of the day, we don't know. We have no answers. And some of my work has looked into this. But what we should be mandating is transparency and also the ability to opt out, the ability for users to say, hey, no, I don't want WhatsApp, generative AI search, I don't want it forced upon me.
Starting point is 00:19:51 And then once we start getting more numbers and more ideas of how much relative energy this represents, I think governments can step in and say, okay, well, if you've got a model that is used by 10 million people a day, like ChatGPT, it can only use this much energy per query or per, you know, hour or whatever in order to be able to be used by like our citizens. Right. And I think that at that point, the governments can really start mandating more, more rigorous laws around this. You mentioned ChatGPT so that it's powered by OpenAI as a company and that OpenAI CEO, Sam Altman. It's powered by OpenAI as a company, and that OpenAI CEO, Sam Altman, he's argued that AI will be worth it because, at least in part, it's going to help us come up with the solutions to climate change. What do you make of that argument?
Starting point is 00:20:37 So I'm part of an organization called Climate Change AI. And essentially what we try to do is we try to bridge the gap between AI research and the climate change community and people doing, you know, biodiversity monitoring and et cetera. And I actually just came back from giving a talk at the Canadian Entomological Society about how AI can help insect monitoring. But at the end of the day, the approaches that are used for these kind of climate positive applications are not large language models. They're not chat GPT. They're really very, like, specific models, very efficient models that can run on your phone, on your laptop, that don't even need a data center.
Starting point is 00:21:06 So this argument of, well, we need to train ever bigger chat GPT type models in order to stop climate change doesn't make sense because these people are just using kind of like a random forest that they run on their phone. And oftentimes they don't even have internet access because they're in the middle of the woods, right? Meanwhile, as we introduced you, we heard from Wendy Riegel and from Jackie Forrest in Alberta, and they were both kind of questioning the cost of the load on power grids. What kind of a burden do these data centers put on local electricity grids, which, as I say, are being tested these days? I think that the load, the problem is that we have energy grid operators have the historical kind of data for normal loads like heating and cooling. And they know that when it's hot, the load tends to go up, etc. And people sleep at night, and so the load goes down.
Starting point is 00:21:57 And there's kind of trends like this. And a lot of the models that they use are actually either very light AI models or mathematical models, just kind of like the more old-fashioned ones. Whereas hyperscale data centers, though, the ones that run these AI models, they use a lot of energy, and it's 24-7. And it's essentially, you don't have these down periods, you don't have these cyclical processes. And I think that's really hard to deal with for grid operators. And this is why nuclear has emerged as a candidate, because, for example, solar energy, if you wanted to have a solar farm that's powering a hyperscale data center, that means you need to have a massive amount of batteries for the nighttime when you
Starting point is 00:22:35 don't have solar energy, right? So you can't have this 24-7 output from these kind of weather dependent renewable energy sources. And that was initially the pitch from jurisdictions like Ontario and Quebec that have a ton of hydroelectricity, that they've just got this steady renewable energy and that that'll be enough. Why then do you think, or what do you make of Alberta's pitch in trying to get some of these data centers to move there? So it's true that, I mean, Ontario has a fair amount of nuclear, but Quebec and Ontario both have hydro, but we're nearing capacity.
Starting point is 00:23:06 Like we're not at 100% capacity at all. But building one or several more hyperscale data centers can push us towards that capacity. And so either we need to make the existing hydroelectric dams more efficient, which is possible, definitely. But you need to put in a lot of work. Once again, it's not going to happen overnight. which is possible, definitely, but you need to, like, put in a lot of work. Once again, it's not going to happen overnight. Or we need to build new dams, which takes, you know, really can take a decade depending on getting the permits and all that. And so, I think that if you want to really have this flip the switch, an existing nuclear reactor like Three Mile Island might be
Starting point is 00:23:37 the shortest possible time frame for that. And for Alberta, if you have energy sources like natural gas, well, then exactly, you can just, You can just pump more natural gas into your power plant and get more output. So definitely it's the easiest in that sense. Before I let you go, AI is your field. And I wonder, when you have these kinds of conversations with more general audiences like us, how do you balance out the tradeoffs and the benefits of AI? So I struggled with this a lot. And actually, so I started out in the climate positive applications. And then someone came up to me and they're like, well, what if you're doing more harm
Starting point is 00:24:11 to the planet than good with your work? And that made me pause. And so for the last five years, I've been working on understanding AI's carbon footprint, AI's energy use. And I think that at the end of the day, it's like with anything, you know, when we have a choice, whether we take the bus, the train, the plane, the car, you know, we have some numbers. We have some, like, general idea of how much the carbon footprint varies or energy usage. And we start having habits.
Starting point is 00:24:35 And I think that with AI and new tech in general, it should be the same. Like, for example, someone recently told me, oh, I don't even use a calculator anymore. I use Chad GPT. And I was just like, you know, that's like 100 hundred thousand times more energy. It makes no sense. But on the other hand, you know, if you do want to, whatever, brainstorm some cool new titles for your research paper, which is, which I do a lot, um, because I'm bad at coming up with funny titles and you sure using Chad GPT once a month makes sense. Right. And so it's really about kind of thinking critically about the tech and in France, they talk about, um, digital sobriety, which I really like the term.
Starting point is 00:25:07 So adopting this stance of digital sobriety. All right. We're going to have to leave it there. Sasha, thank you for this. Really appreciate it. Yeah, thank you. Sasha Luciani is an artificial intelligence researcher and climate lead at Hugging Face in Montreal. For more CBC podcasts, go to cbc.ca slash podcasts.
