How is the tech industry thinking about AI's environmental impact?

EMILY KWONG, HOST:

Perhaps you have played around with artificial intelligence or have seen AI woven into your favorite apps. AI infrastructure is a big priority under the Trump administration, and tech companies are investing heavily. It's an exciting field, one that captivated Sasha Luccioni. In 2018, she started a new job - AI researcher at Morgan Stanley. She was excited to learn something new in the field of AI but couldn't shake this worry.

SASHA LUCCIONI: I, essentially, was getting more and more climate anxiety. I was really feeling this profound disconnect between my job and my values and the things that I cared about. And so, essentially, I was like, oh, I should quit my job and go plant trees. I should - you know, I should do something that's really making a difference in the world. And then my partner was like, well, you have a Ph.D. in AI. Maybe you can use that to make a difference in the world.

KWONG: So Luccioni quit her job and joined a growing movement to make AI more sustainable. Since 2022, AI has boomed, and it's caused a surge in energy consumption. Tech companies are racing to build data centers to keep up - these huge buildings filled with hundreds of thousands of computers that require a lot of energy. Lawrence Berkeley National Laboratory forecasts that by 2028, data centers could consume as much as 12% of the nation's electricity. And AI is also driving a surge in water consumption. It's a concern echoed all over social media.

(SOUNDBITE OF MONTAGE)

UNIDENTIFIED PERSON #1: The amount of water that AI uses is astonishing.

UNIDENTIFIED PERSON #2: AI needs water.

UNIDENTIFIED PERSON #3: A lot of people are saying that every time you use ChatGPT...

UNIDENTIFIED PERSON #4: ChatGPT uses this much water for 100-word emails.

UNIDENTIFIED PERSON #5: And where will that water come from?

KWONG: And the four big data center operators with a growing water and carbon footprint are Google, Microsoft, Amazon and Meta. And to be clear, all four of those are among NPR's financial supporters and pay to distribute some of our content.

BENJAMIN LEE: Before generative AI came along in late 2022, there was hope among these data center operators that they could go to net-zero.

KWONG: Benjamin Lee studies computer architecture at the University of Pennsylvania. Generative AI refers to AI systems built on large language models.

LEE: So I don't see how you can - under current infrastructure investment plans, you could possibly achieve those net-zero goals.

KWONG: And data center construction is only going to increase. On January 21, the day after his second inauguration, President Trump announced a private joint venture to build 20 large data centers across the country, as heard here on NBC.

(SOUNDBITE OF ARCHIVED RECORDING)

PRESIDENT DONALD TRUMP: A new American company that will invest $500 billion at least in AI infrastructure in the United States and very quickly, moving very rapidly.

KWONG: This new project, known as Stargate, would see its data centers together consume 15 gigawatts of power. That would be like 15 new Philadelphia-sized cities consuming energy. So how is the industry thinking about its future and its environmental footprint? The four cloud giants - Google, Meta, Microsoft and Amazon - all have climate goals, goals for hitting net-zero carbon emissions - most by 2030, Amazon by 2040. And there are a few ways they can get there.

Let's start with a very popular energy source for big tech, nuclear, because Amazon, Meta and Alphabet, which runs Google, just signed an agreement, along with other companies, that supports tripling global nuclear energy capacity by 2050. And along with Microsoft, these four companies have signed agreements to purchase nuclear energy, an industry that has been stagnant for years. Microsoft has committed to buying power from an old nuclear plant on Three Mile Island in Pennsylvania. You may remember that was the site of a partial nuclear meltdown in 1979, and NPR's Nina Totenberg talked to kids in the Harrisburg area right after.

(SOUNDBITE OF ARCHIVED NPR BROADCAST)

NINA TOTENBERG: You know what evacuation is?

UNIDENTIFIED CHILD: That everybody has to go.

NINA TOTENBERG: Do you know why?

UNIDENTIFIED CHILD: Because of radioactivity.

KWONG: While some radioactive gas was released, thankfully it wasn't enough to cause serious health effects. And now Microsoft wants to bring this nuclear site back online. In a way, AI companies are turning into energy brokers. But my science desk colleague, Geoff Brumfiel, sees a discrepancy in this between the AI people and the nuclear energy people.

GEOFF BRUMFIEL, BYLINE: These are just two super different engineering cultures, you know? And the way I've come to think about it is Silicon Valley loves to go fast and break things. The nuclear industry has to move very, very, very slowly because nothing can ever break.

KWONG: Because of accidents like Three Mile Island, Geoff says that nothing in the nuclear industry ever happens quickly. It's also extremely expensive. And while solar and wind energy, combined with batteries, are quicker and less expensive to build than nuclear or gas-fired power plants, they still take time to build, and there are problems hooking up new energy sources to the grid. So in the meantime, many data centers will continue to use fossil fuels.

But there's another solution here, and that's to make data centers themselves more efficient through better hardware, better chips and more efficient cooling systems. One of the most innovative methods on the rise is liquid cooling - basically, running a synthetic fluid through the hottest parts of the server to take the heat away or immersing whole servers in a cool bath. It's the same idea as running coolant through your car engine and a much faster way to cool off a hot computer. Here's Benjamin Lee again, at UPenn.

LEE: And as you can imagine, it's much more efficient because now you're just cooling the surface of whatever the cold plate is covering rather than just blowing air through the entire machine.

KWONG: One of the biggest providers of liquid cooling is Iceotope. David Craig is its recently retired CEO, based in the U.K.

DAVID CRAIG: I definitely come from the point of view that, you know, we literally have just one planet, and I cannot understand why anybody would want to do anything other than care for it.

KWONG: Craig says that the older ways of cooling data centers - there are lots of methods, but they amount to a daisy chain of moving heat with air and water - are consumptive. With liquid cooling, much of the heat stays in the system, and computers don't go through massive swings in temperature.

CRAIG: It's not got a constant thermal shock. It's got less vibration from fans and stuff like that, so things last longer. And then, what we're doing is we're capturing that heat in a closed water loop.

KWONG: Liquid cooling, however, is expensive, which makes it hard to scale. But Iceotope has announced public partnerships with Hewlett-Packard and Intel, and a spokesperson at Meta told me they anticipate some of the company's liquid-cooling-enabled data centers will be up and running by 2026. Throughout my many emails and seven hours of phone conversations with spokespeople at Amazon, Google and Microsoft, too, there was one innovation they were quiet about, and it's the one that scientists and engineers outside of big tech were most excited about. And that is smaller AI models, ones good enough to complete a lot of the tasks we care about but in a much less energy-intensive way. Basically, a third and final solution to AI's climate problem is using less AI.

One major disruptor in this space is DeepSeek, the chatbot out of a company in China claiming to use less energy. We reached out to them for comment, but they did not reply. You see, large language models, like ChatGPT, are often trained on large datasets, say, by feeding the model over a million hours of YouTube content. But DeepSeek was trained on data from other language models. Benjamin Lee at UPenn says this is called a mixture of experts.

LEE: The whole idea behind a mixture of expert is, you don't need a single, huge model with a trillion parameters to answer every possible question under the sun. But rather, you would like to have a collection of experts, smaller models, and then you just sort of route the request to the right expert. And because each expert is so much smaller, it's going to cost less energy to invoke.

KWONG: Even though DeepSeek was trained more efficiently this way, other scientists I spoke to pointed out it's still a big model, and Sasha Luccioni at Hugging Face wants to walk away from those entirely.

LUCCIONI: Since ChatGPT came out, people were like, oh, we want general purpose models. We want models that can do everything at once - answer questions and write recipes and poetry and whatever. But nowadays, more and more, I think companies especially are like, well, actually, for our intents and purposes, we want to do X, like - whatever - summarize PDFs.

KWONG: What Sasha is talking about are small language models, which have far fewer parameters and are trained for a specific task. And some tech companies are experimenting with this. Last year, Meta announced a smaller quantized version of some of their models. Microsoft announced a family of small models called Phi-3. A spokesperson for Amazon said they're open to considering a number of models that can meet their customers' needs. And a spokesperson for Google said they did not have a comment about small language models at this time. So meanwhile, the race to build infrastructure for large language models is very much underway. Here's Kevin Miller, who runs global infrastructure at Amazon Web Services.

KEVIN MILLER: I think you have to look at the world around us and say we're moving towards a more digital economy overall, and that is ultimately kind of the biggest driver for the need for data centers and cloud computing.

KWONG: If that is the level of computing we're headed for, Luccioni has one last idea, an industry-wide score for AI models, just like Energy Star became a widely recognized program for ranking the energy efficiency of appliances. She says that tech companies, however, are far from embracing something similar.

LUCCIONI: So we're having a lot of trouble getting buy-in from companies. There's, like, such a blanket ban on any kind of transparency because it could either, like, make you look bad, open you up for whatever legal action or just kind of give people a sneak peek behind the curtain.

KWONG: So as a science reporter for NPR, my main question is - do we really need all of this computing power when we know it could imperil climate goals? And David Craig, the recently retired CEO of Iceotope, chuckled when I asked this. He said, Emily, you know, human nature is against us.

CRAIG: We are always that kid who does touch the very hot ring on the cooker when our mum said don't, you know? We are always the people who touch the wet paint sign and stuff, right? That's human beings. And the truth is, with data, you know, this stuff has just grown up in the background. People just haven't known about it.

KWONG: But here's something I think we can all think about. The AI revolution is still fairly new. Google CEO Sundar Pichai compared AI to the discovery of electricity. Except, unlike the people during the Industrial Revolution, we know AI has a big climate cost, and there's still time to adjust how and how much of it we use.

(SOUNDBITE OF BEASTIE BOYS' "B FOR MY NAME")

KWONG: You can hear more science reporting like this on the science podcast I co-host every week, Short Wave. Check it out.

(SOUNDBITE OF BEASTIE BOYS' "B FOR MY NAME")

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.
