Science Vs - AI: Is It Ruining the Environment?
Episode Date: November 13, 2025

The internet is abuzz with accusations that artificial intelligence is using up tons of energy and water. People are even protesting the building of new AI data centers, saying they'll put a huge strain on local resources. But some AI defenders say that this fear is overblown and that AI isn't actually that bad for the environment. So who's right? We talk to science and tech reporters Casey Crownhart and James O'Donnell, and computer scientist Prof. Shaolei Ren.

Find our transcript here: https://bit.ly/ScienceVsAIEnvironment
Read James and Casey's article here: https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
Check out the Mythbusters GPU/CPU demonstration here: https://www.youtube.com/watch?v=WmW6SD-EHVY

In this episode, we cover:
(0:00) Chapter One: No More AI For Dank Memes?!
(3:34) Chapter Two: How Much Energy Does Your AI Query Use?
(15:37) Chapter Three: How Much Energy Does AI Use Total?
(21:18) Chapter Four: Is AI Drinking All Our Water?
(29:29) Chapter Five: Should You Quit Using AI?

This episode was produced by Rose Rimler and Blythe Terrell, with help from Meryl Horn and Michelle Dang. We're edited by Blythe Terrell. Fact checking by Diane Kelly. Mix and sound design by Bobby Lord. Music written by Emma Munger, So Wylie, Peter Leonard, Bumi Hidaka and Bobby Lord. Thanks to all the researchers we reached out to, including Prof. Melissa Scanlan, and special thanks to Andrew Pouliot and Jesse Rimler.

Science Vs is a Spotify Studios Original. Listen for free on Spotify or wherever you get your podcasts. Follow us and tap the bell for new episode notifications.

Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Discussion (0)
Hi, I'm Rose Rimler. I'm filling in for Wendy Zuckerman, and this is Science Versus.
This is the show that pits facts against filling the world with AI data centers.
Today on the show, AI and the environment.
Lately, we've been hearing a lot about how power-hungry AI is.
AI uses a shit ton of electricity.
Straining the nation's aging power grid and creating more planet-warming emissions.
And how thirsty it is.
The amount of water that AI uses is astonishing.
Asking ChatGPT to write one email is the equivalent of pouring out an entire water bottle.
One bottle of water.
Let that sink in.
And the major culprit here is the data centers, warehouses, full of computer servers that AI needs to function.
And tech companies are trying to build more of these data centers.
But people who live nearby are protesting them, often saying that they're going to compete for their electricity and use up their water.
We do not want a data center built in St. Charles City.
Because of all this, around some corners of the internet, using AI has become kind of a faux
pas, especially if you use it for something silly.
You are actively contributing to global warming and climate change, all because you want to
Photoshop Chris Brown into your picture.
So the next time you use AI to generate an image.
for a meme, think about the impact on the environment first.
Stop ruining your planet for a fucking Instagram post.
But on the flip side, we've got people pushing back against this idea.
They say that these reports are skewed or misleading,
and that the impact of AI on the environment isn't nearly as bad
as a bunch of other stuff we're already doing,
like eating meat or taking international flights.
In fact, recently, some of the big AI companies have said
that their products only use a tiny bit of power and a few drops of water for each prompt.
So what's really going on here? Is AI actually ruining the planet? Or have the bots been framed?
Because when it comes to AI and the environment, there's a lot of...
Stop ruining your planet for a fucking Instagram post.
But then there's science.
And that's coming up after the break.
Boarding for flight 246 to Toronto is delayed 50 minutes.
Ugh, what?
Sounds like Ojo time.
Play Ojo? Great idea.
Feel the fun with all the latest slots and live casino games, and with no wagering requirements,
what you win is yours to keep. Groovy!
Hey, I won!
Boarding will begin when passenger Fisher is done celebrating.
19 plus Ontario only. Please play responsibly.
Concerned by your gambling or that of someone close to you? Call 1-866-531-2600 or visit connexontario.ca.
You may have heard of the sex cult NXIVM
and the famous actress who went to prison
for her involvement, Allison Mack.
But she's never told her side of the story until now.
People assume that I'm like this pervert.
My name is Natalie Robehmed, and in my new podcast
I talked to Allison to try to understand
how she went from TV actor to cult member.
How do you feel about having been involved
in bringing sexual trauma to other people?
I don't even know how to answer that question.
Allison After NXIVM
from CBC's Uncover is available now on Spotify.
Welcome back. I'm Rose Rimler. I'm a senior producer at Science Versus, and I'm here with
our editor, Blythe Terrell. Hi, Blythe. Hey, Rose.
Blythe, it seems like you're my AI buddy. I invite you to talk to me about controversies when
it comes to AI. It's because I'm part robot. I would say that of the team, you're the person who is
like most tuned into this idea about AI, using up all the energy, using up all the water.
You've been into this for a while.
Yes, this is actually, I am one of those people who probably shared a meme, Rose,
without knowing if it was true on the water use or whatever.
Like, I do remember seeing those memes and being like, is this true?
And then honestly, for me, it did make me take a step back from AI and be like,
before I get involved in this, I do want to know the truth.
Like, is this actually terrible for the environment?
Is this actually terrible for the water?
Because why would I want to integrate it into my life if it is?
Right.
So the first question is, why do we think AI would use so much more energy
than all the other stuff that we do in our digital lives?
Just like messing around on the computer, posting on Instagram, watching Netflix.
Like, this is all stuff that we do pretty routinely and don't think a lot about the, like, footprint of that behavior.
Right.
Like looking at pictures of Jeff Goldblum.
How much energy is that using?
Just as an example, hypothetically, photoshopping Jeff Goldblum as your prom date.
You know, I don't know who you might be talking about, Rose, but that sounds like a pretty good use of electricity and energy, no matter, and water, no matter how much it takes.
So, that kind of stuff also requires data centers and energy to run them.
Mm-hmm.
But the thing that's different about AI is that their servers are using a different kind of computer chip.
So normal computing uses a CPU, but AI uses a GPU.
And if you're familiar with video games, you might think of this as like a graphics card.
But it's actually become the powerhouse behind machine learning.
This is an extremely visual and potentially copyrighted analogy, but I was looking around on YouTube.
I love a copyrighted analogy.
The MythBusters guys, they did a demonstration of a CPU versus a GPU.
And in their demonstration, they used paintball guns.
Okay.
The CPU was like programming one paintball gun to draw a happy face with like one paintball
pellet at a time, like firing at a piece of paper on a wall.
Okay.
That's a CPU.
A GPU was like 200 paintball guns all bound together.
Making it like a mega paintball gun.
And like one switch is hit and they all fire at once,
and the image that they create is the Mona Lisa.
Oh, man, those guys are good.
MythBusters.
So the point is that while CPUs are good at doing one task after another, the GPUs are good
at doing a bunch of tasks at once.
And that requires a lot more energy.
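That sequential-versus-parallel difference can be sketched in a few lines of code. This is only a toy illustration (NumPy's vectorized arithmetic is a software stand-in for what a GPU does with thousands of hardware threads, and the function names are made up for the example):

```python
import numpy as np

# CPU-style: fire one "paintball" at a time (a sequential loop).
def brighten_sequential(pixels, amount):
    out = []
    for p in pixels:              # one pixel processed per step
        out.append(p + amount)
    return out

# GPU-style: all "paintballs" at once (one instruction applied to many data).
def brighten_parallel(pixels, amount):
    return np.asarray(pixels) + amount  # every element updated in one vectorized op

pixels = [10, 20, 30, 40]
print(brighten_sequential(pixels, 5))         # [15, 25, 35, 45]
print(brighten_parallel(pixels, 5).tolist())  # [15, 25, 35, 45]
```

The loop does one operation per step; the vectorized version issues a single operation over the whole array, which is the same one-versus-many split as the paintball demo.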
Okay.
How much energy?
Are you ready for that?
Yeah.
We are ready.
And I got a bit of an assist here.
I talked to some journalists who have covered the stuff for years.
My name is James O'Donnell.
I'm a senior reporter for AI
at MIT Technology Review.
I'm Casey Crownhart.
I'm a senior climate reporter
at MIT Technology Review.
So James and Casey both report a lot
on AI and energy use.
And about a year ago,
they started a project
trying to figure out
like how much does an average query
or prompt to, say,
ChatGPT,
how much energy does that use?
And they were inspired to do that
because they were seeing
all these numbers out there
floating around
that just didn't seem
all that reliable.
Here's Casey.
These kind of wild estimates of, you know, oh, a query to something like ChatGPT uses this much water and this much energy and isn't that so much.
And so I think that that started to kind of get our gears turning and wondering, you know, is that right?
How can we add all of this up?
What does it all add up to?
So Casey and James looked around for the real number, but...
We learned very quickly that it's not going to be so easy to know that number.
Companies, they're not particularly willing to share the details of how much energy
their AI models require to answer one question. And so, you know, we weren't going to get it
from them. And so... What did they say when you reached out and asked? They said, in so many
words, no. So James and Casey went a different route. There are AI models that are not
proprietary. Anybody can use them, even download them, host them on their own
computer as long as they have the power to do that. These are open source models. And you can kind of
open the hood, poke and prod them. And so James and Casey teamed up with experts, including
academics at the University of Michigan, to measure it themselves. So they ran a bunch of different
prompts through an open source large language model called Llama. And then they were able to actually
measure how much energy those requests required. And so they got some answers.
Are you curious?
Yes, I would love to know. Give me the answers.
Well, there's a range here.
I was really struck throughout this project of, I think we went in and I was looking for kind of one definitive answer.
You know, like what is AI's energy burden?
And I think that one of my biggest takeaways was just how much it depends.
It depends on the model.
It depends what you're asking.
And so there's just this really big range.
That was one of my biggest takeaways.
So, okay.
You know, basically when people like us say, I asked AI, you know, we kind of act like
AI is this one thing.
And it's totally not.
There are all these different models.
And these models come in different sizes.
So a model, sorry, a model is like a ChatGPT or a Gemini or a Claude or whatever.
Yeah.
And then within ChatGPT, Gemini, Claude,
there are multiple models.
Within Llama, there are multiple.
And some are bigger,
some are smaller.
If you imagine that this AI model,
like imagine it's the command deck of a spaceship
or actually my favorite is like a switchboard
with tons of knobs and dials,
you can kind of imagine that's what these parameters are.
James says...
Each of those knobs is helping the AI come up with a better answer,
but also each of those knobs requires energy to operate.
So the smallest model that the team looked at for this analysis had 8 billion parameters.
So 8 billion knobs.
That sounds big.
The biggest one they looked at had 400 billion parameters.
400 billion?
And when it comes to the big players here, we actually don't know how many parameters they have.
But James said if he had to guess, it's...
You know, in the order of trillions.
Whoa.
Really big.
A lot of knobs.
A lot of knobs.
I mean, I'm imagining basically like a switchboard, but now I have to completely change it
because it's like a switchboard that goes on for miles.
Yes, that's right.
So these parameters you're talking about, which is like sort of the, what underpins the model, I guess.
They are, yeah, they're numbers, values, and the more of them that there are,
the better the model is at learning patterns and making predictions, which is how,
large language models work.
Okay.
Okay, so say I have like one request
and I pop it into a model
with 8 billion parameters.
Uh-huh.
And then I pop that like same request
into a model with like 400 billion parameters.
Uh-huh.
That same request is going to use
different amounts of energy
based on the model that I'm using.
Yes.
And actually we can move on,
we don't even have to hypothesize here.
I have real numbers for you.
Oh, nice.
Okay.
So the smallest
Llama model that the team used, they fed in some prompts like, teach me about quantum computing
or suggest some travel tips. And then they measured how much energy that used. The smallest model
when it spit out an answer used, on average, 114 joules. Oh, great, great. That's very helpful.
Are you being sarcastic? Do you want some other way to think about this? Yes, please give me something else.
Well, I didn't, you know, it was James and Casey.
So they came up with something for some context.
One thing they converted these energy units into is a fun new type of unit called microwave seconds.
I love the microwave seconds unit.
It's so much more relatable than joules or watt hours.
Casey gets me.
All right, so 114 joules is roughly a tenth of a second in a microwave.
Okay.
So that's one query, small model, a tenth of a second.
In the microwave.
Yeah.
Okay.
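The "microwave seconds" unit is just the physics identity energy = power × time, rearranged. A quick sketch, assuming a roughly 1,000-watt microwave (a typical rating; the exact wattage the reporters assumed isn't stated in the episode):

```python
MICROWAVE_WATTS = 1000  # assumed typical microwave power draw

def microwave_seconds(joules, watts=MICROWAVE_WATTS):
    # time (s) = energy (J) / power (W)
    return joules / watts

# The smallest Llama model averaged 114 joules per answer:
print(microwave_seconds(114))   # 0.114 -> about a tenth of a second, as above
```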
The biggest model, which was 50 times bigger, that was like zapping something in a microwave
for eight seconds.
Oh.
So that's the biggest model was still only, for one query, the biggest model was still only
eight seconds.
Okay, that doesn't even get my rice remotely hot.
Right.
I mean, and after James and Casey published their article, frustratingly for them,
OpenAI and Google did release a little bit of information on how much energy
their text prompts use on average, and what they said suggests that a text query is equivalent to
one or two seconds in the microwave. Okay. So basically what we can tell you is like a text prompt
to a large language model is probably on the order of zapping something in the microwave for
less than 10 seconds. Okay. And then for images, this is a different kind of machine learning,
but it also uses a fair amount of energy. And I would have assumed that this image-making
thing is inherently more energy-sucking than text-making. But as it turns out, that is not
necessarily the case. Here's James. If you have a really big large language model that's generating
text and answers, it may actually use more energy than generating an image. And that was
kind of counterintuitive for me because you know, you think about like these AI models that
come up with fantastical images that we've all seen over the past few years. And it just
seems like such an intense process to kind of create that from scratch.
Yeah, because it always takes longer, too, than getting your text back.
Yeah, exactly.
But what we found was that if you have a really large text model, it has so many parameters,
so it has so many knobs and dials that it actually can use up more energy than generating,
you know, certain types of images.
They found that making an image was like running a microwave for five and a half seconds.
So like a big language model can be like eight seconds of microwave
time. So that's a little less. Okay. Right. I'm with you. And you know, for all this stuff, if you don't
like microwave time, you could also think about it in light bulb time. So it's like running an
LED light bulb for somewhere between 10 seconds and two minutes. So I guess, Rose, what this maybe
tells me is that if I wanted to make my Jeff Goldblum prom picture with AI, that is a slightly
less energy-intensive process than perhaps using AI to write romantic Jeff Goldblum fan fiction.
Possibly if you use a really big model to write your Jeff Goldblum fan fiction.
Obviously, I would need a very large model for this work, Rose.
Okay, got it.
And then there's video generation.
This might not surprise you to hear that that used the most energy.
So the team, what they did was they looked at an open-source video generation model.
And they just made like a crap video.
It was 16 frames a second, five seconds long.
They compared it to like the quality of a silent-film-era type film.
Mm, okay.
And that one...
That would be the equivalent of over an hour in the microwave.
I don't think I've ever microwaved anything for an hour.
I don't think I have either.
A long time in the microwave, for sure.
That one scares me more because I'm seeing a lot of AI-generated videos out there.
Yeah, that's what's huge right now and getting bigger, right?
There's a ton of, what is it, Sora?
Yeah, Sora is big right now.
Yeah.
And they can look incredibly realistic, right?
Right.
We don't know if that's also using up as much energy.
We asked OpenAI, which makes Sora,
and they didn't give us any information on Sora's energy use.
And James and Casey didn't want to speculate.
It's a different model.
But it's probably using a fair amount of electricity.
I think that's safe to assume.
So, I mean, but, okay, so what we have so far is all about, like,
individual use.
Yeah.
But what I want to know, though, is like, obviously lots of us are doing this, lots of
us are using this.
Like, what is the impact if you, like, add it all up?
If you, like, scoop up all the AI use that we're doing, like, what do we know about
that?
Right.
It's interesting.
Like, on an individual level, certainly like the texting and image generation stuff,
they're not that crazy energy intensive.
But that doesn't leave AI off the hook because when you do zoom out and to answer your
question. It really adds up fast because OpenAI says it receives two and a half billion prompts
per day from people around the world. Wow. And AI in general is getting integrated into all
these institutions, which I think a lot of us are noticing. In fact, one survey of a variety of
organizations around the world found that 78% of them are now using AI to some extent. That is a lot.
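Those per-prompt figures can be scaled up into a rough daily total. A back-of-the-envelope sketch, assuming every one of those two and a half billion daily prompts costs about 1,000 joules; that per-prompt cost is a made-up round number in the range discussed earlier, since the true average isn't public:

```python
PROMPTS_PER_DAY = 2.5e9    # OpenAI's reported figure
JOULES_PER_PROMPT = 1000   # assumed round number; the real average is not public

daily_joules = PROMPTS_PER_DAY * JOULES_PER_PROMPT
daily_mwh = daily_joules / 3.6e9   # 1 MWh = 3.6e9 joules

print(daily_mwh)  # roughly 694 MWh per day, under these assumptions
```

Small per-query numbers times billions of queries is how "a tenth of a second in the microwave" turns into utility-scale demand.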
And so nerds have looked at how much electricity is going to data
centers to see if AI has made an impact. And they saw that from 2014 to 2023, the electricity
consumption of data centers tripled. That's according to a report from the Lawrence Berkeley National
Laboratory. Oh, wow. So like in this, and that's the AI period. Like that's like sort of
that's like that yeah, basically the age of AI like taking off. Taking off. And the energy suck
is expected to keep sucking more and more. One analysis
predicts that by 2028, AI data centers will use as much electricity as a quarter of U.S. households
use per year.
Wow.
Imagine adding 25% more households to the U.S. in 2028.
That's what the prediction is that these AI data centers are going to use up.
That doesn't sound good.
Well, I mean, Casey, who's a climate reporter, she was like, you know, it's not the electricity per se.
That's the problem here.
That's kind of the crucial thing that I like to bring up and really harp on is that, you know, if we had abundant solar and wind power and batteries, you know, we might be less concerned about some of this energy demand.
But the reality is that grids around the world are still largely relying on fossil fuels.
So it's not good.
Right now in the U.S., only 9% of the country's power comes from renewable sources.
It's still mostly fossil fuels that we use to power our electric grid.
A third of our energy comes from petroleum.
A third comes from natural gas, which is another fossil fuel.
Both are greenhouse gas emitters.
And coal?
Coal is in the mix, too.
It's 8%.
So just a lot of this energy is dirty.
Mm-hmm.
And, of course, there are other countries with cleaner energy grids than the U.S.
But more than half of the data centers for the world are here in the U.S.
Well, and you know, I feel like some of the headlines I've seen around this,
Rose, have been, like, related to nuclear energy, because there were headlines a while back that one of these companies was going to reopen Three Mile Island, which is this nuclear plant that was shut down because of an accident. And so there was talk of that being reopened, and, like, you know, really a lot of these companies being very interested in what's going on with nuclear. So it does make me wonder, could nuclear help if we can get that ramped up?
I asked Casey about that. And she was like, the thing about reopening or building a
nuclear plant: it takes so long. The last nuclear plant that we built in the U.S. took 15
years to complete. Yeah. And companies are just not going to wait for that to happen. And they're
not. They're not waiting for it. I mean, look at xAI. They brought in gas-burning generators
to run their data center in Tennessee. Right. Okay. Okay. Well, that sucks.
Yeah. So I reached out to xAI and I didn't hear back. I also contacted Google and Anthropic just to ask about all this stuff that we've been talking about. I didn't get answers from them by our deadline.
OpenAI did get back to me. They mostly pointed me to stuff that's already publicly available, open letters and blog posts, that kind of thing, talking about their energy use and how they see that in the future. And basically what OpenAI is saying is that they want to work with the government
to add capacity to the grid.
And they say that they want that energy
to come from all kinds of sources, including renewables.
Uh-huh. Okay.
And just overall, I will say,
there might be some changes coming for the positive.
So the energy that AI requires
to answer your query or make your image or your video,
that could be going down
because a lot of the tech companies
are trying to make their models more efficient.
One way they're doing that
is by turning off some of the parameters
that we talked about earlier
when they don't necessarily need them
to answer a particular question
or do a task.
So that's like shrinking
the switchboard essentially as needed.
So there is some evidence
that like the tech companies
are like trying to adjust
to make this thing.
Yes.
It might get better.
Okay.
So that's energy, Rose,
but I know there's another piece to this.
Yeah.
Mm-hmm.
What about water?
What is going on with water?
Right.
So we're going to talk about that
after the break.
Get No Frills delivered.
Shop the same in-store prices online
and enjoy unlimited delivery with PC Express Pass.
Get your first year for $2.50 a month.
Learn more at pcexpress.ca.
Welcome back.
I'm Rose Rimler.
I'm here with Blythe Terrell.
Hello.
So let's talk about water and AI.
And if you want to talk about water and AI, you call up Shaolei Ren.
He is a professor at UC Riverside.
He's actually in the computer engineering department, but he focuses on sustainability.
Most people in his field look at, you know, energy, greenhouse gases like we were just talking about.
But Shaolei has kind of forged his own path because he's thought about conserving water for just a lot of his life.
I spent my first few years in a small town back in China.
We just had access to fresh water, drinking water, for half an hour each day.
During those half an hour, we had to use a big bucket to collect the water and use it for the rest of the day.
So in my memory, I never thought water is something unlimited.
It's just a finite resource.
You've never taken it for granted.
Right.
And so one reason AI uses a lot of water is something that you've probably
heard before. The data centers get really hot because they're running all these fancy chips
doing all this computation like we were talking about earlier. And so these buildings,
they often use a cooling tower that uses water to cool everything down. Just like our human
bodies, we sweat and we feel cooler. For a data center, if you use water evaporation,
you can take away the heat very naturally, very efficiently. And where do they get that water
from? Most typically is from the municipal water infrastructure
system. So the same as where I'd get it if I lived there, if I turned my tap on. Yeah. Yeah.
So they get water from where everyone gets water, from the faucet, basically. And the reason for that
is they want, like, clean filtered water because if there was salt or minerals or like gunk in it,
then it could gum up this system, basically. Mm, okay. So as the water cools the data centers,
it, you know, it evaporates away. It evaporates. And if I remember my, like, kindergarten, you know,
the water cycle, when water evaporates, it eventually comes back as rain, right?
So why do we need to worry about this?
So the evaporated water, yeah, it still stays within our global water cycle system.
It doesn't go away from the earth.
But still, when the water will be coming back and where it will be coming back, that's highly uncertain.
And it's very unevenly distributed across the globe.
So due to the long-term climate change, we're seeing more and more uneven distribution of the water resources.
So essentially, the wetter regions are getting wetter and drier regions are getting drier.
So even if the water is evaporated in, say, Arizona, that doesn't mean it'll come back as rain in Arizona, at least not any time soon?
Correct.
Okay.
Okay.
So the argument is it's using a bunch of water.
It's drawing it out from where everyone else is getting their water.
And it's not necessarily going to be replenished that easily.
Well, yeah, yeah.
It's going to evaporate the drinking water in Tucson.
And that water might next show up as a flood in Shanghai.
Right. Okay.
So let's talk about how much water is actually getting used here.
Shaulay and his team, they went down this rabbit hole fairly recently.
And they published a paper that kind of went viral.
In fact, a lot of people turned their results into a meme, basically saying that every time you use AI, and they'll say it in different ways.
Like, every time you chat with ChatGPT, every time you write an email with AI, you're consuming a bottle of water.
Have you seen this, Blythe?
Yes, yes.
This was one of the memes I first saw and shared without evidence.
I've seen videos of people filming themselves, like with a nice, beautiful, fresh bottle of water from the
store, opening it up and like pouring it down the drain and saying, like, this is what you're
doing when you use AI or someone will be like dressed up and pretending to be AI, like dress
as a robot and they're just like guzzling water. But that's not quite accurate.
Yeah. So that's a, yeah, that's a distortion of the message that we show in the paper.
A distortion.
Here's what they actually found. So they found that if you have a back and forth conversation
with, in this case, the model they looked at was ChatGPT-3.
It's a slightly older model, but if you have a back and forth with ChatGPT-3, medium-length
messages, if you go back and forth for, on average, about 30 times, that uses essentially
the volume of a bottle of water, a half-liter of water.
Okay.
So it's like a, it's a decent conversation that gets you to that half-liter.
Yeah, and that's where the meme comes from.
So it's not super-duper wrong, but what they're getting wrong or misunderstanding
is that the fresh drinking water that's used to cool the data center,
that's actually only a small part of this calculation.
So out of this half-liter of water that we're talking about,
only about 12% of it is drinking water that's used directly by the data center for cooling.
Oh.
And the rest of it is non-potable water from elsewhere.
It's drawn out of rivers, lakes, whatever.
It's used in the process of making electricity.
So that brings us back again to the power plants, you know, that old chestnut.
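The arithmetic behind that split is easy to check. A sketch using the figures above (a 500 mL bottle, roughly 30 exchanges, and about 12% used directly for cooling):

```python
BOTTLE_ML = 500              # the half-liter "bottle" from the paper
EXCHANGES = 30               # back-and-forth messages behind that figure
DIRECT_COOLING_SHARE = 0.12  # fraction that is potable water used on-site

per_exchange_ml = BOTTLE_ML / EXCHANGES
direct_ml = BOTTLE_ML * DIRECT_COOLING_SHARE
indirect_ml = BOTTLE_ML - direct_ml

print(round(per_exchange_ml, 1))  # ~16.7 mL per message exchange
print(direct_ml)                  # 60.0 mL of drinking water used by the data center
print(indirect_ml)                # 440.0 mL tied to generating the electricity
```

So the viral "bottle per email" framing overstates the per-message cost by an order of magnitude, and most of the bottle is water consumed at the power plant, not at the data center's tap.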
Okay, but wait.
So it's talking, so some of this is drinking water, but some of this is like...
But most of it is not.
But most of it's not.
But, I mean, but still, like, that's water in the environment could eventually become drinking water, right?
So, like, why does that distinction actually really matter?
Well, if you think the data center moving into your town is a threat because it's going to turn on a bigger tap than yours,
that's not quite right.
And I asked Shelley about that.
Do you think it's possible that a town will accept a data center
and it uses up all the town's water, essentially?
Like you live next to a data center,
you turn your tap and no water comes out?
I think in certain towns, it could be possible.
But in most of the towns,
I think the U.S. infrastructure tends to be,
at least for the water infrastructure,
is they should be able to have the capacity available for data centers.
He said that the biggest problems here might be likely to happen in really small towns
with older, more limited water infrastructure.
Okay.
So when I see people talking about how data centers are using up water,
I think, like, we might be ignoring the bigger issue here,
which is the water used by power plants.
And by the way, if we had more wind and solar on the grid,
the water use would go down.
Uh-huh.
But anyway, as of right now, overall,
taking into account the water used by power plants
and the water use for cooling,
we know that data centers consume 0.3%
of the nation's water supply.
I asked Shaolei about this.
I don't know what to make of that.
Is that a lot or is that a little? 0.3%.
So it's roughly the same amount of total public water supply
in Rhode Island.
So whether this 0.3% is high or not, I would say it's modest.
It's not that much.
Brings up the question, should we be letting Rhode Island use all that water?
I mean, what has Rhode Island done for anyone else lately, you know?
It's, you know, finally the podcast is getting around to that question, which I've also had for years.
What is the point of Rhode Island?
Yeah, I mean, and the water used for the data centers for power generation and cooling is projected to go up.
It's actually expected to double in the next few years.
But ultimately, Shaolei and another expert I spoke to
said that whether or not this becomes a problem is a regional question.
It makes more sense to be granular about this.
Like, is the water being taken from an area that doesn't have the capacity?
You just can't paint with a broad brush here.
So complicated, I guess,
is where we so often land.
Okay, so taking all this together, Rose, where do you land?
Like, how evil is AI when it comes to the environment?
I asked all of our guests basically that same question.
I kind of put it in terms of, like, well, do you personally use AI, knowing about all these
environmental impacts?
Because these are people, all these people care a lot about the environment and these
issues. And all of them, Casey, Shaolei, James, they all said that, yes, they do still use AI.
I'm awful at planning trips. So asking for an itinerary for going on a road trip or something,
I found that that's really helpful. I use it to polish my text writing to help me answer some
questions. And also my students use AI to generate paper summaries. So I've used AI for technical
things, like how to do certain repairs on my bike.
But I've also used it for seeing what people have said on a certain topic, like hikes in New England with the best views.
But everybody agreed that we should be thoughtful about how we use it, given this energy and water requirement as well.
So it's annoying, because part of me is like, you know, the companies that make this, and that are using it for their products and services that I'm using, like, they're the ones whose AI use I want to think about, right? I want them to be thinking about whether they really need to use this or not. And I want them to be thinking about that in the context of energy use, water use, climate change, right? Like, that's my dream.
Yes, it's on the companies. It's on the government. I mean, I think my takeaway here is that, like, I'm not sure AI is the villain. I think the villain is our reprehensible and baffling inability
to switch to renewable energy
and to put any kind of real effort into getting off of fucking fossil fuels.
Right.
It's the same enemy we've been fighting for 50 years or whatever.
Right, right.
Also, I think that one reason AI is getting people riled up
as opposed to, like, those old climate offenders flying, eating meat, you know, that kind of thing,
is people see the value in the trade-off of the environmental impact of something like taking a flight
or eating a burger.
There's an obvious benefit to those things.
With AI, yes, some people have found it really useful,
but a lot of people haven't,
and they just don't think it has much value at all.
In fact, one survey found that 61% of people in the U.S.
think that AI has more drawbacks than it has benefits.
Okay, so more than half of us are just like, overall.
We ain't into this shit.
Yeah.
No, thank you.
Exactly.
Okay.
I think that's one reason AI is our current villain,
when, in fact, I think the villain is, I think that's like a nostril on the larger villain,
which is the evil monster that is keeping us glued to fossil fuels.
Right, the nostril.
Okay, I appreciate that picture.
Okay, so I do want to know one last thing, though.
Has learning this and digging into all of this AI and energy and water stuff, has it changed
how you use AI?
Yeah, a little bit.
Also, I think the novelty is wearing off a bit.
And I was never, like, using it a ton.
But, mm-hmm.
I don't know.
I was asking people about, like,
what's some stupid stuff that you've seen generated by AI?
And you're like, oh my God, that wasn't worth the energy.
And I thought of my own playing around with it.
And I was like, remember that time I had AI generate an image
of my boyfriend cuddling with my cat because my cat doesn't like him so I was like
oh this is what it would be like if you guys got along you know and I sent it to him and I was
like, I don't think I would do that again. I don't think that was worth the energy. So it has
changed, a little bit, how you'd make that value assessment, kind of. Is using
AI for this thing going to, like, add value? Is it actually really useful for this? Or could I
just glue salmon to his fingers, and then the cat would actually maybe come over.
Yeah, you know, yes, Rose, let's go back to the...
Let's go back to basics.
Let's go back to the basics of gluing salmon to our boyfriend's fingers to get our cat to like him.
Yep.
That's Science Versus.
Thanks, fly.
Thanks, Rose.
Oh, and while we're here, how many citations are in this week's episode?
There are 66 citations.
Mm-hmm.
Where can people
find them? They can find them in our transcript. The link to the transcript is in our show notes.
Also in our show notes, we'll put a link to the article that James and Casey wrote for MIT
Technology Review. It's really good. People should go read it. And people should also check out
our Instagram. We've got some interesting stuff there, maybe even a little Jeff Goldblum content
for you. Give the people what they want. Exactly. Great. Love it.
This episode was produced by Rose Rimler and Blythe Terrell
with help from Meryl Horn and Michelle Dang.
We're edited by Blythe Terrell.
Fact-checking by Diane Kelly.
Mix and sound design by Bobby Lord.
Music written by Emma Munger, So Wylie, Peter Leonard, Bumi Hidaka, and Bobby Lord.
Thanks to all the researchers we reached out to, including Professor Melissa Scanlan,
and special thanks to Andrew Pouliot and Jesse Rimler.
Science Vs is a Spotify Studios Original.
Listen for free on Spotify or wherever you get your podcasts.
Follow us and tap the bell for new episode notifications.
We'll fact you soon.
