Consider This from NPR - AI and the Environment
Episode Date: March 30, 2025

The AI boom has caused a huge surge in energy consumption, so how is the tech industry thinking about its environmental footprint as it invests in new AI models? Emily Kwong, host and reporter for NPR's Short Wave podcast, finds out what solutions are being considered that might meet consumer demand while addressing climate concerns.
Transcript
In 2018, Sasha Luccioni started a new job: AI researcher for Morgan Stanley.
She was excited to learn something new in the field of AI, but she couldn't shake this worry.
I essentially was getting more and more climate anxiety. I was really feeling this profound
disconnect between my job and my values and the things that I cared about. And so essentially,
I was like, oh, I should quit my job and go plant trees.
I should, you know, I should do something that's really making a difference in the world.
And then my partner was like, well, you have a PhD in AI.
Maybe you can use that to make a difference in the world.
So Luccioni quit her job and joined a growing movement to make AI more sustainable.
Since 2022, AI has boomed and it's caused a surge in energy consumption.
Tech companies are racing to build data centers to keep up: huge buildings filled with
hundreds of thousands of computers that require a lot of energy.
By 2028, Lawrence Berkeley National Laboratory forecasts that data centers could consume as
much as 12 percent of the nation's electricity.
And AI is also leading a surge in water consumption.
It's a concern echoed all over social media.
The amount of water that AI uses is astonishing.
AI needs water.
People are saying that every time you use ChatGPT...
ChatGPT uses this much water for a hundred-word email.
So where will that water come from?
And the four big data center operators with a growing water and carbon footprint are Google,
Microsoft, Amazon, and Meta.
And to be clear, all four of those are among NPR's financial supporters and pay to distribute
some of our content.
Before generative AI came along in late 2022, there was hope among these data center operators
that they could go to net zero.
Benjamin Lee studies computer architecture at the University of Pennsylvania.
Generative AI refers to the AI that uses large language models.
So I don't see how, under current infrastructure investment plans,
you could possibly achieve those net zero goals.
And data center construction is only going to increase.
On January 21st, the day after his second inauguration, President Trump announced a
private joint venture to build 20 large data centers across the country, as heard here
on NBC.
A new American company that will invest $500 billion at least in AI infrastructure in the
United States and very quickly moving very rapidly.
This new project, known as Stargate, would together consume 15 gigawatts of power.
That would be like 15 new Philadelphia-sized cities consuming energy.
Consider this.
As much as big tech says they want to get to net zero, there are no regulations forcing
them to do so. So how is the industry thinking about its future
and its environmental footprint?
From NPR, I'm Emily Kwong.
This message comes from WISE,
the app for doing things in other currencies.
Sending or spending money abroad,
hidden fees may
be taking a cut.
With WISE, you can convert between up to 40 currencies at the mid-market exchange rate.
Visit WISE.com.
T&Cs apply.
This message comes from Mint Mobile.
Mint Mobile took what's wrong with wireless and made it right.
They offer premium wireless plans for less, and all plans include high-speed data, unlimited
talk and text, and nationwide coverage.
See for yourself at MintMobile.com slash Switch.
Let's consider this from NPR. Okay, so the four cloud giants, Google, Meta, Microsoft,
and Amazon, all have climate goals, goals for hitting net zero carbon emissions,
most by 2030, Amazon by 2040.
And there's a few ways they can get there.
Let's start with a very popular energy source for big tech, nuclear.
Because Amazon, Meta, and Alphabet, which runs Google, just signed an agreement, along
with other companies, that supports tripling the global nuclear supply by 2050. And along with
Microsoft, these four companies have signed agreements to purchase nuclear energy, an
industry that has been stagnant for years.
Microsoft has committed to buying power from an old nuclear plant on Three Mile Island
in Pennsylvania. You may remember that was the site of a partial nuclear meltdown in 1979,
and NPR's Nina Totenberg talked to kids in the Harrisburg area right after.
Do you know what evacuation is?
That everybody has to go.
Do you know why?
Because of radioactivity.
While some radioactive gas was released, thankfully, it wasn't enough to cause serious health
effects. And Microsoft now wants to build this nuclear site back.
In a way, AI companies are turning into energy brokers.
But my science desk colleague, Geoff Brumfiel, sees a discrepancy here between the AI
people and the nuclear energy people.
These are just two super different engineering cultures.
You know, and the way I've come to think about it is Silicon Valley loves to go fast
and break things. The nuclear industry has to move very, very, very slowly
because nothing can ever break.
Because of accidents like Three Mile Island, Geoff says that nothing in the nuclear industry
ever happens quickly. It's also extremely expensive. And while solar and wind energy
combined with batteries are quicker to build and less expensive
than nuclear or gas power plants, it still takes time to build and there are problems
hooking up new energy sources to the grid.
So in the meantime, many data centers will continue to use fossil fuels.
But there's another solution here, and that's to make data centers themselves more efficient
through better hardware, better chips, and more efficient
cooling systems.
One of the most innovative methods on the rise is liquid cooling.
Basically, running a synthetic fluid through the hottest parts of the server to take the
heat away, or immersing whole servers in a cool bath.
It's the same idea as running coolant through your car engine, and a much faster way to
cool off a hot computer.
Here's Benjamin Lee again at UPenn.
And as you can imagine, it's much more efficient
because now you're just cooling the surface
of whatever the cold plate is covering
rather than just blowing air through the entire machine.
One of the biggest providers of liquid cooling is Iceotope.
David Craig is their recently retired CEO
and is based in the UK.
I definitely come from the point of view
that we literally have just one planet and I cannot
understand why anybody would want to do anything other than care for it.
Craig says that the older way of cooling data centers (there are lots of methods,
but basically it's a daisy chain of moving heat with air and water) is consumptive.
With liquid cooling, a lot of the heat stays in the system, and
computers don't have these massive swings in temperature.
It's not got a constant thermal shock. It's got less vibration from fans and stuff like
that, so things last longer. And then what we're doing is we're capturing that heat
in a closed water loop.
Liquid cooling, however, is expensive, which makes it hard to scale.
But Iceotope has announced public partnerships with Hewlett-Packard and Intel, and a spokesperson
at Meta told me they anticipate some of the company's liquid cooling enabled data centers
will be up and running by 2026.
Throughout my many emails and seven hours of phone conversations with spokespersons
at Amazon, Google, and Microsoft too,
there was one innovation they were kind of quiet about.
And it's the one that scientists and engineers
outside of big tech were most excited about.
And that is smaller AI models.
Ones good enough to complete a lot of the tasks
we care about, but in a much less energy-intensive way.
Basically, a third and final solution
to AI's climate problem is using less AI.
One major disruptor in this space is DeepSeek,
the chatbot out of a company in China
claiming to use less energy.
We reached out to them for comment,
but they did not reply.
You see, large language models like ChatGPT are often trained using
large datasets, say by feeding the model over a million hours of YouTube content. But DeepSeek
was trained on data from other language models. Benjamin Lee at UPenn says this is called a mixture
of experts. The whole idea behind a mixture of experts is you don't need a single huge model with
a trillion parameters to answer every possible question under the sun.
Rather, you would like to have a collection of experts, smaller models, and then you just
sort of route the request to the right expert.
And because each expert is so much smaller, it's going to cost less energy to invoke.
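To make Lee's point concrete, here is a minimal, purely illustrative Python sketch of the routing idea he describes. The names (`EXPERTS`, `route`, the toy task functions) are invented for this example and are not from DeepSeek or any real library; a real mixture-of-experts system uses a learned gating network inside the model, but the energy logic is the same: each request activates a small specialist rather than one giant model.

```python
# Illustrative sketch of mixture-of-experts routing (hypothetical names).
# Each "expert" stands in for a small model specialized for one task.

def summarize(text: str) -> str:
    return f"summary of: {text[:30]}"

def write_poem(text: str) -> str:
    return f"poem about: {text[:30]}"

# Registry of small specialist models.
EXPERTS = {
    "summarize": summarize,
    "poetry": write_poem,
}

def route(task: str, text: str) -> str:
    """Send the request to the right expert instead of one huge model.

    Because each expert is far smaller, each call costs less energy
    than invoking a single trillion-parameter model for every query.
    """
    expert = EXPERTS.get(task)
    if expert is None:
        raise ValueError(f"no expert for task: {task}")
    return expert(text)

print(route("summarize", "AI data centers are using more electricity."))
```

The design choice mirrored here is the one Lee describes: the cost of a query scales with the size of the expert that answers it, not with the size of the whole collection.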
Even though DeepSeek was trained more efficiently this way, other scientists I spoke to pointed
out it's still a big model, and Sasha Luccioni at Hugging Face wants to walk away from those
entirely.
Since ChatGPT came out, people were like, oh, we want general purpose models.
We want models that can do everything at once, answer questions, write recipes, and poetry,
and whatever.
But nowadays, more and more, I think companies especially are like, well, actually, for our intents and purposes,
we want to do X, like, whatever, summarize PDFs.
What Sasha is talking about are small language models, which have far fewer parameters and
are trained for a specific task. And some tech companies are experimenting with this.
Last year, Meta announced a smaller quantized version of some of their models.
Microsoft announced a family of small models called Phi-3.
A spokesperson for Amazon said they're open
to considering a number of models
that can meet their customers' needs.
And a spokesperson for Google said they did not have
a comment about small language models at this time.
So meanwhile, the race to build infrastructure for large language models is very much underway.
Here's Kevin Miller, who runs global infrastructure
at Amazon Web Services.
I think you have to look at the world around us and say, we're moving towards a more
digital economy overall. And that is ultimately kind of the biggest driver for the need for
data centers and cloud computing.
If that is the level of computing we're headed for, Luccioni has one last idea:
an industry-wide score for AI models, just as Energy Star became a widely recognized program
for rating the energy efficiency of appliances. She says that tech companies, however, are far
from embracing something similar. So we're having a lot of trouble getting buy-in from companies.
There's like such a blanket ban on any kind of transparency
because it could either like make you look bad,
open you up for whatever legal action,
or just kind of give people a sneak peek behind the curtain.
So as a science reporter for NPR, my main question is,
do we really need all of this computing power
when we know
it could imperil climate goals? And David Craig, the recently retired CEO of Iceotope,
chuckled when I asked this. He said, Emily, you know, human nature is against us.
We are always that kid who does touch the very hot ring on the cooker when our mum said don't.
You know, we are always the people who touch the wet paint sign, and stuff, right?
That's human beings.
And the truth is with data, this stuff
has just grown up in the background.
People just haven't known about it.
But here's something I think we can all think about.
The AI revolution is still fairly new.
Google CEO Sundar Pichai compared AI
to the discovery of electricity.
Except unlike the people during the Industrial Revolution,
we know AI has a big climate cost.
And there's still time to adjust how and how much of it we use.
This episode was produced by Avery Keatley and Megan Lim, with audio engineering by Ted Mebane.
It was edited by Adam Rainey, Sarah Robbins, and Rebecca Ramirez.
Our executive producer is Sami Yenigun.
It's Consider This from NPR. I'm Emily Kwong. You can hear more science reporting
like this on the science podcast I co-host every week, Short Wave. Check it out.
This message comes from Mint Mobile. If you're tired of spending hundreds on big wireless
bills, bogus fees, and free perks, Mint Mobile might be right for you with plans starting
from 15 bucks a month. Shop plans today at mintmobile.com slash switch.
Upfront payment of $45 for 3-month 5GB plan required.
New customer offer for first 3 months only.
Then full price plan options available.
Taxes and fees extra. See Mint Mobile for details.