The Current - How much energy did your ChatGPT prompt use just now?

Episode Date: September 3, 2025

Google is one of the first big tech companies to go public with how much energy it takes to use its AI tools. AI's carbon footprint is big, so how does Canada balance the energy guzzler with AI sovereignty? Tech reporter Casey Crownhart digs into the massive electricity demands of an AI future, and what it means for the climate. Plus, we talk with Phil Harris, President and CEO of Cerio, and Kate Harland of the Canadian Climate Institute, about what Canada can do to make AI data centres here more sustainable.

Transcript
Starting point is 00:00:00 It's not just you. News in Canada and around the world is moving at an incredible pace, which is where we come in. I'm Jamie Poisson and I host Front Burner, Canada's most popular daily news podcast. And what we try to do is hit the brakes on a story that you actually want to know more about. So try us out. Follow Front Burner wherever you get your podcasts. Front Burner: stories you want to follow, five days a week. This is a CBC podcast. Hello, I'm Matt Galloway, and this is The Current podcast. You know, you can ask AI tools just about anything. What's the episode of Star Trek: The Next Generation where nobody on the ship can see them?
Starting point is 00:00:45 I want to expand my painting company service offerings. Act as my business advisor and list three complementary services we could provide. Can you update me on the most recent news on tennis that happened in Toronto, along with the weather I'm expecting today? Simple AI prompts like those ones produce instant results, but behind even basic answers is a lot of computing power, and that in turn is powered by electricity and processed in data centers. Up until now, the big tech companies weren't really saying publicly how much electricity is actually consumed when people use AI, and that has been a concern, because as the industry
Starting point is 00:01:26 grows, so do the demands on power grids around the world, and also potentially the carbon emissions associated with generating all that power. But now Google has publicly released estimates about the energy consumed when people use its Gemini AI. Casey Crownhart is the senior climate and energy reporter for the MIT Tech Review. Casey, good morning. Hi. Thanks so much for having me. Thanks for being here. What did we learn from the data that Google shared? We'll get into some of the specifics, but just broadly, what was your big takeaway? Yeah, I mean, I think this was the most transparent estimate yet that we've seen from one of these big AI players. So, I mean, just off the bat, it's really, I think, a good move.
Starting point is 00:02:11 Ultimately, the amount of energy that one query to Gemini uses, a typical one, you know, it falls within the range of some other previous estimates from researchers and some of our own reporting. But it's a pretty small amount, if you're just looking at one kind of typical query. It's, you know, about 0.24 watt-hours, which is about the same as using a microwave for about one second. A microwave for one second. That's just for a simple query. Exactly. Yeah. So that wouldn't include something like generating an image or a video or using these kind of more complicated models, but one of those kind of easier ones, like you said. And the point is that, I mean, people use AI for all sorts of things. It's not just asking for
Starting point is 00:02:51 jokes. It's, you know, creating a travel itinerary. It's making a video. It's making an image. And so in looking at what is being consumed, what did you learn through getting, I guess, a better handle on how much AI energy is being consumed? Yeah. So through our own reporting, you know, we worked on this big project at MIT Technology Review through the first half of the year. My colleague James O'Donnell and I found that there is this really big range. You know, we were looking at open source models, so things that are publicly available. And we found that, you know, these text models, it can be even smaller than this estimate from Google, down to, you know, almost a tenth of that energy consumption, up to, you know, many, many, many times more if you're generating a high quality video. So I think we all want kind of one easy number about, you know, how much energy is it when I ask AI for something?
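The comparisons in this stretch of the conversation come down to simple unit arithmetic, and a short sketch can reproduce them. The microwave wattage below (1,000 W) is an assumption for illustration, not a figure from the interview; the 0.24 watt-hour estimate and the "tenth of that" low end are the numbers quoted above.

```python
# Reproducing the per-query comparisons quoted above.
MICROWAVE_WATTS = 1000           # assumed typical microwave draw (not from the interview)
query_wh = 0.24                  # Google's estimate for a typical Gemini text prompt

query_joules = query_wh * 3600   # watt-hours -> joules (1 Wh = 3,600 J)
microwave_seconds = query_joules / MICROWAVE_WATTS

# The open-model low end mentioned: roughly a tenth of Google's figure.
low_end_wh = query_wh / 10

print(f"{query_joules:.0f} J per query, about {microwave_seconds:.1f} s of microwave time")
print(f"Low end of the open-model range: about {low_end_wh} Wh")
```

Under that assumed wattage, 0.24 Wh works out to roughly nine-tenths of a second of microwave time, consistent with the "about one second" comparison.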
Starting point is 00:03:47 But ultimately, what we found is that it matters what kind of the guts of the model, what those look like and what that something is that you're asking. AI to do. So there was a scenario, and we talked a little bit about kind of some of the examples and how people might use AI, but there was a scenario that you set out involving what, a hypothetical person doing a charity fundraiser and they were using AI to help with that. Walk us through that and talk about how much electricity would be used. Absolutely. So if you say you were running a marathon and you want some help with, you know, asking for funds for your social posts. So maybe you ask a text model 15 questions to kind of get some ideas about how to fundraise. You generate 10 images to get a couple to post on something like Instagram. And then you try to make a five second video. And you do
Starting point is 00:04:38 that a few times until you get one that you're happy with. Altogether, that would add up to 2.9 kilowatt hours of electricity based on our estimates. And that's enough to run a microwave for about three and a half hours. So, you know, like I said, it really depends; the video really drives a lot of that, based on our estimates. But these things can add up, for sure. What did you learn about the source of the power? I mean, that's important as well, right? Absolutely. So, I mean, this would be one conversation if, you know, all data centers were hooked up just to wind and solar and ran totally clean, but that's not the reality today. One pre-print study from Harvard's school of public health found that in the U.S., on average,
Starting point is 00:05:21 data centers use 48% dirtier electricity than the average. That's partly because in the U.S., they tend to be concentrated in areas that have a lot of coal power, a lot of natural gas. So the emissions here are definitely a cause for concern as we see these data centers pop up and use a lot of electricity that's coming with a lot of emissions. Also, it's not just electricity. Water is used to help cool the computers that are in these data centers. And in reading your report, I mean, each prompt would consume something like five drops of water. Is that right? Yep, five drops of water. You know, it's a fraction of a milliliter. So, you know, ultimately, again, this estimate shows that every one of these queries is maybe a small
Starting point is 00:06:08 amount of electricity, of emissions, of water. But that can be, you know, a really big, again, it can add up to a lot when you're in water-stressed areas in particular. We are in very early days, it feels like, of the AI eruption. People are talking about how much this is going to grow, how exponentially it will grow. Beyond the individual actions, what are the estimates about how much energy AI will need in the years ahead? Yeah, projections at this point definitely show use set to take off. The International Energy Agency this year released a report, and they found that electricity use for all data centers, so AI is only a subset of that, but electricity use of data centers is set to double between now and the end of the decade, so just a few years from now.
Starting point is 00:07:02 And they projected that it would reach 945 terawatt hours, which is the entire electricity consumption of Japan today. And AI is definitely a significant force there, with AI-specific demand set to quadruple. So, you know, these are big numbers. And it's also in places like, you know, the U.S. and these advanced economies, data centers are going to be a lot of the growth in electricity demand. You know, we're seeing everything needs more electricity, from EVs to air conditioners to factories. But data centers will be a really big chunk of that in certain parts of the world. As a climate reporter, what do you understand about what that means for carbon emissions?
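The projection quoted here implies a baseline and a growth rate that are easy to back out. A rough sketch, with one assumption flagged: treating "between now and the end of the decade" as a six-year, 2024-to-2030 window.

```python
# Backing out what "doubling to 945 TWh" implies. The six-year window is an
# assumption; the 945 TWh figure and the doubling are from the IEA report as quoted.
projected_2030_twh = 945
implied_baseline_twh = projected_2030_twh / 2      # "set to double between now and then"

YEARS = 6                                          # assumed 2024 -> 2030 window
annual_growth = 2 ** (1 / YEARS) - 1               # compound rate that doubles in six years

print(f"Implied baseline: about {implied_baseline_twh:.0f} TWh")
print(f"Implied compound growth: about {annual_growth:.1%} per year")
```

Doubling over six years corresponds to roughly 12% compound growth per year, which is why "just a few years from now" is doing so much work in the projection.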
Starting point is 00:07:44 I mean, it means that we have a big task ahead of us to really consider how this technology gets rolled out, how, you know, we expect data centers to work with the grid, how we expect them to, you know, be part of hopefully cleaning up the grid. This all, you know, if these electricity estimates come to pass, that's hundreds of millions of metric tons of carbon emissions every year. So, you know, it's a little bit daunting, but I hope that we can look at this as an opportunity, that these companies can be part of, you know, trying to decarbonize our grid and try to be part of the energy transition. I guess just finally, before I let you go, is your sense that those companies, I mean, Google released this information, you have governments that are shoveling money into AI and creating these data centers, luring the companies to build their data centers there, are governments and these tech companies actually paying attention to the energy use that's associated with this technology? I think so. And I think that, you know, ultimately this came from a lot of questions. Google said, you know, we've been getting a lot of questions, including from our pestering at MIT Technology Review. So I think that, you know, if people care about this,
Starting point is 00:09:01 they should continue asking, you know, the governments and companies to, you know, be more transparent, to share more information and to ultimately, like I said, be a driving force for the energy transition through this. Thank you for your pestering. And thank you for talking to us about the results of that.
Starting point is 00:09:17 Casey, thank you. Thank you. Casey Crownhart is Senior Climate and Energy Reporter for the MIT Tech Review. A lot of news podcasts give you information, the basic facts of a story. What's different about Your World Tonight is that we actually take you there.
Starting point is 00:09:33 Paul Hunter, CBC News, Washington. Margaret Evans, CBC News, Aleppo. Jerusalem. Prince Albert. Susan Ormiston, CBC News in Admiralty Bay, Antarctica. Correspondents around the world, where news is happening. So don't just know, go. I'm Susan Bonner.
Starting point is 00:09:49 Host of Your World Tonight from CBC News. Find us wherever you get your podcasts. Here in Canada, there has been a lot of talk in recent months about the need to build our own data centers and get in on that AI boom. But there are also questions about what that means for our electrical grid and climate. Phil Harris is president and CEO of a company called Cerio. It works with tech companies to help make data centers more efficient. He's with me in studio. Good morning.
Starting point is 00:10:16 Good morning, Matt. How difficult is it? I mean, Casey was talking about this disclosure from Google, but how difficult is it to get a grip on how much energy these data centers are actually using? Yeah, I think this is, and I try to put this in sort of terms that everyone can understand. Let's take ChatGPT; we were talking about Gemini, but ChatGPT is probably the more well-known generative AI platform. If we think from December of '24 to today, about nine months, from ChatGPT to ChatGPT-5, the usage has gone up from one billion of those prompts a day to about two and a half billion prompts a day. The power required, though, has gone up 60-fold.
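The arithmetic behind that last point is worth making explicit: if daily prompts grew about 2.5-fold while total power grew 60-fold, then the energy per prompt must have grown roughly 24-fold. A quick sketch of that division, using only the figures stated in the conversation:

```python
# Energy-per-prompt growth implied by the figures quoted above.
prompts_before = 1.0e9        # about one billion prompts a day
prompts_after = 2.5e9         # about two and a half billion prompts a day
power_growth = 60             # stated 60-fold increase in power required

prompt_growth = prompts_after / prompts_before
per_prompt_growth = power_growth / prompt_growth   # power grew much faster than usage

print(f"Prompts grew {prompt_growth:.1f}x; energy per prompt grew about {per_prompt_growth:.0f}x")
```

In other words, most of the growth in power demand is coming from each prompt getting more expensive to serve, not from more people prompting.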
Starting point is 00:10:54 Why? Because AI is becoming more complex. There's agentic AI. There's different types of AI now that are being required in these platforms. So I think we have to look at the overall impact and think, what's this scale going forward and how do we address that? Just to put that in context, if we dropped ChatGPT-5 into Ontario today with our power grid, it's the equivalent of about a million very efficient gasoline-powered cars a day. That's a lot. That's a lot. And that's going to grow more tomorrow, more after that, more after that. And that's just ChatGPT. In this country, are we thinking, I mean, Casey was talking a little bit about how these countries and jurisdictions that are luring AI have to think about this.
Starting point is 00:11:37 Are we thinking about this enough, do you think, here? We are. And I think it's something that, both at the provincial level here in Ontario, but I think across other provinces too, and certainly at the federal level, is being taken very seriously. You know, the appointments of Evan Solomon as the AI minister at the federal level and Stephen Crawford here at the provincial level show that. I'm working with companies like Core data centers in Brampton, here again in Ontario,
Starting point is 00:12:00 who are really trying to figure out how to build that really efficient data center that is not only effective but competitive, because we're competing with some very, very deep pockets. Is it possible that that power can be sustainable? That what's fueling these centers can be sustainable? Sustainable is a word that different people have different interpretations of. But is that possible? Well, I think first of all, we've got to figure out,
Starting point is 00:12:22 are we building efficient data centers to start with? And that's what Cerio's mission is: to try and make sure that we're using the technology that makes AI happen. AI's been around since the 1950s. The first AI languages that were used in software came around in the late 50s. Then GPU technology from companies like Nvidia
Starting point is 00:12:37 came around in the late 90s. That only really came together in the last five, six years. So as we've put these technologies together, we're using a 30-year-old system model to power AI. We've got to look at that system model and that's what we've attempted to do
Starting point is 00:12:51 at Cerio: to build technology that can really optimize those data centers. How are you doing that? How does your company help these companies try and use less energy, if that's possible? The first thing is: are we using these very expensive resources in the most effective way? If you look at a typical data center, we spoke about these huge data centers. Typically, they're about 60% efficient, meaning at any one time, 60% of the equipment in there is being used; 40% is sitting idle, or redundant, or stranded is another way of looking at it. What if we could make that 100% available? That's what we do.
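The 60% figure implies a simple stranded-capacity calculation. The sketch below uses a hypothetical 1,000-server fleet, invented purely for illustration; only the 60/40 split comes from the conversation.

```python
# Stranded capacity at 60% utilization, on a made-up 1,000-server fleet.
total_servers = 1000          # hypothetical fleet size, for illustration only
utilization = 0.60            # 60% of equipment in use at any one time (as described)

busy = total_servers * utilization
stranded = total_servers - busy     # idle, redundant, or stranded equipment

print(f"{stranded:.0f} of {total_servers} servers are effectively stranded")
```

That stranded 40% is the headroom the disaggregation argument is pointing at: serving the same useful load at full utilization would need substantially less hardware, and therefore less power.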
Starting point is 00:13:22 We make sure that all the resources are 100% available at any point in time. So whatever you're going to be using, use it to its most efficient ability. And use it to the point where the system itself can evolve fast, using new technologies, without us having to do these big forklift upgrades, as we call them, every time. Because they're expensive, they're disruptive. And quite frankly, it means that data centers tend to become ever more inefficient over time. We should make them ever more efficient as time goes by. Are those companies actually thinking in that way?
Starting point is 00:13:53 I mean, there have been nightmare stories of, you know, these enormous data centers. And water is a part of this, right? Suddenly people's wells go dry because we're sucking up all the water to help cool and fuel the technology of the future. People worry about, you know, whether the lights are going to be able to come on because we're using so much energy to help power this tech. With those companies, there's a lot of money here. And so you wonder whether those companies are actually thinking that this is important
Starting point is 00:14:18 or whether the technology will figure it out down the line. So look at the investments. I mean, look at Microsoft. They've just taken a 20-year lease on Three Mile Island and its nuclear power generation. So basically these companies are becoming power generation companies themselves, because they're recognizing the grid won't keep up, however much investment we put into the national grid to generate and distribute power.
Starting point is 00:14:39 By the way, Canada is an immensely efficient place to build data centers compared to, say, California. We're about a tenth of the emissions of carbon dioxide into the air compared to California, which is one of the most efficient in the U.S. You mentioned Evan Solomon. There's a lot of talk in this country right now. The fact that we have a minister of AI is one thing, but there's a lot of talk of digital sovereignty, that we need to be able to control our own.
Starting point is 00:15:03 And you're nodding as I'm saying that. We need to be able to control our own data and be in the race, be in the game. Yes. How do you square that, in terms of how much energy is going to be used, if we are going to be aggressive as a nation in trying to build these data centers? Well, I think, and I've spoken to Minister Solomon about this, small modular reactors, so we can distribute more energy and create more scale of energy, are certainly critical. But if I look at data centers themselves, I think, again, we've got to
Starting point is 00:15:33 figure out: are we going to build a small number of very large, concentrated data centers, or are we going to distribute them? And I think the question is, can we distribute data centers across Canada? Because data sovereignty isn't just about where the data is. It's your accessibility to the data. It's the reliability of that data. Reliability becomes really important here. We've got to rethink that. And I do think the provincial and federal governments take this seriously. Governments don't work as fast as we would like them to, certainly at the pace of technology. They're trying hard, and I give them credit for the pace that they're trying to get to, but there's a lot of catch-up to do here. What do you make, just finally, of the
Starting point is 00:16:08 arguments, I mean, we're talking about the climate impact of this energy consumption. The arguments that, you know, Eric Schmidt, former CEO of Google, and others make: he says we're not going to make the climate goals anyway, but he's betting on AI to help try and solve the climate crisis. I mean, he has a number of layers of skin in that game. One is the investment that he has put in, and the concern that this is creating and exacerbating the climate crisis. But is that possible? Do you think that this technology will help us? You know, it's a bit of a recursive argument to use AI to solve the problems of AI. We'll only exacerbate the problems of AI if we're not thinking about how we build AI differently.
Starting point is 00:16:48 So I think he's right if we take the right approach, which we haven't been taking up to now, but we can if we invest in the right technology. And you honestly believe that that's possible. Again, to build at scale, but also to find efficiencies that are really going to make a difference. If we don't, then the digital divide that we saw with the internet in the early 2000s is going to be exponentially worse. If we think about what AI is doing for the world, we have to make this a win story. And to do that, again, we've got to take a step back and say, where are we putting investments? Both private and public investments have to be hand in hand to do this right. And not destroy ourselves in the long run.
Starting point is 00:17:24 Exactly. Phil, thank you very much. Thank you, Matt. Phil Harris is president and CEO of Cerio. It's a company based in Ontario. It helps tech companies make their data centers more efficient. Kate Harland does research on clean growth at the Canadian Climate Institute. She is in Langley, British Columbia. Kate, good morning to you. Good morning. You've been listening.
Starting point is 00:17:44 How well equipped do you think this country is to deal with a potential boom in data center construction here? Well, obviously the demand for energy use from AI is growing. It's kind of a step up in our digital world, in our potential footprint and how much electricity we need. But I guess the real question we have is: how do we choose to power any data centers here in Canada?
Starting point is 00:18:12 We're seeing increasing interest in that for a host of reasons. But these data centers, they typically run 24-7. And so there's a lot of electricity consumption that's needed to supply these centers. I asked Phil about that word sustainable. Green is another one of those words that, I mean, the definition is in the eye of the beholder in some ways. How much of that energy being used to power data centers could be considered green energy?
Starting point is 00:18:33 Yeah, I mean, across Canada, the grid is more than 80% non-emitting. There's different definitions of clean and green, but in terms of emissions, we have a lot of hydropower, and what we've actually seen is provinces across Canada moving off coal-fired power faster than expected and building out wind and solar. And so that's been a bit of a success story in Canada, that we've reduced our emissions substantially over the past decade in terms of the emissions from our electricity sector. The question now is, though, if we have new large loads coming on, like data centers that are supporting AI, and if we build new gas-fired power stations to support those centers, and they use that power all day, every day, not just for kind of peak needs,
Starting point is 00:19:22 we could lock in quite a large amount of emissions for decades, because a new plant would be expected to last over 40 years. I mean, this is happening in Alberta right now, where you have a data center strategy that revolves around electricity that's powered by natural gas, right? Yeah. I mean, there's a lot of interest in Alberta for data centers, and there's a long queue of potential projects. So if you built, for example, all of those projects, and they were all fired with new gas
Starting point is 00:19:51 plants, then that would erase the kind of hard-won gains that we've seen in terms of phasing out coal in Alberta. So there's a real issue here, but fortunately, we do have a clean path forward. There are alternatives. We can pair new wind and solar with batteries so the data centres can run for the most part on clean electricity, maybe not for all peak times, but for the bulk of their use. And we can also look at how we better connect provinces. So we connect Alberta with B.C., or Quebec and Manitoba, which also have good hydro resources, with their neighboring provinces as well. Is that realistic to rely on renewables for something that is going to be consuming
Starting point is 00:20:31 this much energy? Well, one of the advantages of renewables today is that they're really fast to build compared to, say, a new gas plant; there's actually quite a long wait time right now for new gas turbines. But just in terms of scale, just in terms of the amount of energy that's going to be gobbled up? Yeah. Well, one of the things we need to think about is how do we run our electricity systems?
Starting point is 00:20:55 How do we manage the electricity effectively? And what we have right now is big hydro dams in some parts of the country that can effectively act as batteries and can support wind and solar. So we can kind of play a little bit with our variable wind and solar and our hydro and use our grid more efficiently. And that's a way for us to manage an increased supply of electricity through more efficient use of the system that we do have across the country.
Starting point is 00:21:24 I mean, there are big decisions to be made here, right? Because more of these data centers will be built, and there are enormous economic opportunities in terms of where those centers end up. What do you want governments to be thinking about as they wrestle with that decision-making? Exactly. I mean, there is a question.
Starting point is 00:21:43 I think we want to make sure that the electricity remains reliable and remains affordable. And so one of the questions I have, from a system perspective, is: let's make sure that data centers pay their fair share of the costs of building out the system, and that will help make sure that that happens. And let's ask data centers to be good customers, effectively.
Starting point is 00:22:06 So maybe they can cut their use at peak times and shift some of their flexible computing to off-peak times. We've actually seen this happen a couple of times. Microsoft had an example in Quebec, with some agreements there about reducing their consumption on the coldest winter days, when the electricity system is most strained. And that really saves everybody. It saves the system money, and it helps with reliability as well. So we can ask them to be good citizens from the grid's perspective, too. What about, just finally, for individuals?
Starting point is 00:22:42 I mean, we started this conversation just talking about how much energy a simple query, or a video that you're creating, relies on. Do you think people think at all about the energy and climate impacts when they hammer something into ChatGPT? It's a question I've been posed a lot recently, actually. I think interest is growing. I think there is more awareness, even about the footprints of our emails and our photos. But I think there is a question here. There are some bigger levers here: we can ask how we are powering these centres, and that will make a difference.
Starting point is 00:23:20 We need more electricity in the energy transition. Even if these data centers become more efficient over time, we'll still need that electricity for other uses as well. So from a listener perspective, let's ask the new electricity supply to be clean, and that will help us as we transition, and also make sure that we have more available power, whether it's for data centers or for other uses, when those data centers become more efficient, hopefully, in future. Kate, good to speak with you. Thank you very much. Thank you.
Starting point is 00:23:53 Kate Harland researches clean growth at the Canadian Climate Institute. She was in Langley, British Columbia. You've been listening to The Current podcast. My name is Matt Galloway. Thanks for listening. I'll talk to you soon. For more CBC podcasts, go to cbc.ca slash podcasts.
