The Internet’s Next Great Power Suck

AI’s carbon emissions are about to be a problem.

Illustration by The Atlantic. Source: Getty.

In Facebook’s youth, most of the website was powered out of a single building in Prineville, Oregon. That data center, holding row upon row of refrigerator-size server racks packed with silicon chips, consumed huge amounts of electricity, outstripping the yearly power usage of more than 6,000 American homes. One day in the summer of 2011, as reported in The Register, a Facebook exec received an alarming call: “There’s a cloud in the data center … inside.” Following an equipment malfunction, the building had become so hot and humid from all the electricity that actual rain, from a literal cloud, briefly drenched the digital one.

Now Facebook, or rather Meta, operates well over a dozen data centers, each much bigger and more powerful than the one in Prineville used to be. Data centers have become the backbone of the internet, running Amazon promotions, TikTok videos, Google search results, and just about everything else online. The thousands of such buildings across the world run on a shocking amount of electricity—akin to the power usage of England—that is in part, if not mostly, generated by fossil fuels. While the internet accounts for just a sliver of global emissions, 4 percent at most, its footprint has steadily grown as more people have connected to the web and as the web itself has become more complex: streaming, social-media feeds, targeted ads, and more.

All of that was before the generative-AI boom. Compared with many other things we use online, ChatGPT and its brethren are unique in their power usage. AI risks making every search, scroll, click, and purchase a bit more energy intensive as Silicon Valley rushes to stuff the technology into search engines, photo-editing software, shopping and financial and writing and customer-service assistants, and just about every other digital crevice. Compounded over nearly 5 billion internet users, the toll on the climate could be enormous. “Within the near future, at least the next five years, we will see a big increase in the carbon footprint of AI,” Shaolei Ren, a computer scientist at UC Riverside, told me. Not all of the 13 experts I spoke with agreed that AI poses a major problem for the planet, but even a moderate emissions bump could be destructive. With so many of the biggest sources of emissions finally slowing as governments crack down on fossil fuels, the internet was already moving in the wrong direction. Now AI threatens to push the web’s emissions to a tipping point.

That hasn’t quite happened yet, as far as anyone can tell. Almost no data are available for how much carbon popular models such as ChatGPT emit (a spokesperson for OpenAI declined to comment for this article). The emissions from AI are hard to calculate, depending on the computing power used in a data center, the amount of electricity it requires, and how that electricity is generated. Some signs suggest that electricity usage is already ticking upward during the AI boom. Water usage is a rough proxy for electricity demand, because data centers use water to stay cool, and their water usage across the globe is increasing quickly; Google’s on-site water use rose roughly 20 percent in 2022, Ren said, driven in part by investments in AI that are only growing.

Generative AI produces emissions in three ways. First, manufacturing the computer chips and constructing the data centers that AI runs on generates carbon emissions. Second, training a large language or other AI model requires power. Training a system like ChatGPT, for instance, can produce carbon emissions equivalent to those of several, if not several dozen, U.S. homes in a year, Jesse Dodge, a research scientist at the Allen Institute for AI, told me. Third, the chatbot or any other end product requires electricity every time it is used. A language model from Hugging Face emitted about 42 pounds of carbon a day during an 18-day stretch in which it received 558 requests an hour, for a total equivalent to driving about 900 miles.
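The driving comparison can be checked with a quick back-of-envelope calculation. A sketch, assuming the EPA's average per-mile emissions figure for a passenger car (roughly 0.89 pounds of CO2 per mile, a number not stated in the article):

```python
# Back-of-envelope check of the Hugging Face model's inference emissions.
DAYS = 18
POUNDS_PER_DAY = 42       # reported average daily emissions during deployment
LB_CO2_PER_MILE = 0.89    # ~404 g CO2/mile, EPA average passenger car (assumption)

total_pounds = POUNDS_PER_DAY * DAYS              # 756 lb of CO2 overall
equivalent_miles = total_pounds / LB_CO2_PER_MILE

print(f"{total_pounds} lb of CO2, roughly {equivalent_miles:.0f} miles of driving")
```

With those assumed figures the total comes out to around 850 miles, consistent with the article's "about 900 miles."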

That might seem small, but those numbers could compound quickly as many billions of dollars continue pouring into generative AI. These programs are getting larger and more complex, with training datasets ballooning exponentially and models doubling in size as frequently as every three months. New models are constantly released, old ones frequently retrained. Even if a single chatbot message uses a tiny amount of energy, “we want to chat with anything and everything, and so these unit costs are going to really add up,” Sasha Luccioni, a research scientist at Hugging Face who studies AI and sustainability, told me. As generative AI begins to fully saturate the web, deployment of bots could account for three-fifths of the technology’s emissions, if not far more.

Consider Google Search, which is already in the process of getting chatbot functionality. Google receives an average of 150 million search queries an hour, and each AI-powered search result might require five to 10 times as much computing power as a traditional one, Karin Verspoor, the dean of the School of Computing Technologies at RMIT University, in Australia, told me. Data centers are already seeing their power consumption jump due to AI, and McKinsey predicts that data centers’ electricity use will more than double by 2030. Exactly how much of an emissions bump this would be is unclear, but “the bottom line is we have more people doing more sophisticated things on the internet, and that is going to lead to a significant increase in the overall energy,” Vijay Gadepally, a computer scientist at MIT’s Lincoln Laboratory, told me.
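The scale of that multiplier is easier to feel with a rough calculation. A sketch, assuming a conventional search costs about 0.3 watt-hours, an old and widely cited Google estimate; every number here is illustrative, not a measurement:

```python
# Illustrative scaling of Google Search energy use under Verspoor's
# 5-10x per-query compute multiplier for AI-powered results.
QUERIES_PER_HOUR = 150_000_000   # average search volume cited in the article
WH_PER_SEARCH = 0.3              # assumed energy per conventional search (Wh)

for multiplier in (1, 5, 10):
    mwh_per_hour = QUERIES_PER_HOUR * WH_PER_SEARCH * multiplier / 1_000_000
    print(f"{multiplier:>2}x compute: ~{mwh_per_hour:,.0f} MWh per hour")
```

Under those assumptions, a tenfold increase per query would push hourly consumption from roughly 45 to 450 megawatt-hours, which is why even modest per-search costs matter at Google's volume.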

That the chatbots will be a carbon bomb is far from guaranteed. Even without generative AI, global internet traffic has expanded 25-fold since 2010, but electricity use has climbed more slowly because of improvements in the efficiency of data centers, computer chips, and software. Data centers are asked to do more and more, but “the efficiency of how we produce the computing also goes up pretty fast,” Jonathan Koomey, a former researcher at Stanford who is an expert on the environment and digital technology, told me. While Google has expanded its machine-learning research in recent years, its electricity use has not outpaced the rest of the company’s, according to research from David Patterson, an emeritus professor of computer science at UC Berkeley. Some efficiency improvements will simply be economically necessary to turn a profit. OpenAI CEO Sam Altman has described the computing costs of ChatGPT as “eye-watering.” On its current path, AI could burn itself out before it burns up the planet.

In other words, although generative AI will require more computation, it may not proportionally increase electricity demand. Nor is rising power usage guaranteed to increase emissions as the world turns to renewable energy, Mark Dyson, the managing director of the carbon-free-electricity program at the think tank RMI, told me. Spokespeople at Meta, Google, and Microsoft all pointed me to the investments they are making in renewable energy and reduced power and water use at their data centers as part of ambitious emissions-reduction targets. But those improvements could take years, and the generative-AI boom has already started. The need for data centers running AI to have a lot of power at all times could lead them to stick with at least some, if not substantial, fossil-fuel sources, Luccioni said. You can easily burn more coal or natural gas when needed, but you can’t make the wind blow harder.

Even if all of these efficiency improvements continue—in hardware, software, and the grid—they may not entirely cancel out the growing computational intensity of AI, Luccioni said, a phenomenon sometimes known as the rebound effect. When a technology grows more efficient, the resources it frees up tend to fuel more demand. More efficient coal-burning in the 19th century only accelerated industrialization, resulting in more factories running on coal; wider highways don’t ease congestion but lead more people to drive and can create even more traffic. Data centers and AI programs that use less electricity might just allow tech companies to cram generative AI into more websites and software. Silicon Valley’s business model, after all, relies on getting people to spend as much time as possible on websites and apps. A chatbot that emits less carbon per message, multiplied over exponentially more messages, would still increase emissions.

The carbon footprint of generative AI doesn’t need to grow exponentially to threaten the planet. Meeting our ambitious climate targets will require decreasing emissions across every sector, and AI makes it much harder to stabilize, let alone shrink, the internet’s share. Even if the tonnage of carbon the internet pumps into the atmosphere didn’t budge for decades—an improbably optimistic scenario—while every other sector cut emissions fast enough to stop warming at 1.5 degrees Celsius, the goal of the Paris agreement, merely holding steady would still be “nowhere near enough” to meet that target, as one 2020 opinion paper in the journal Patterns put it. And as AI and other digital tools help other sectors become greener—improving the efficiency of the grid, enhancing renewable-energy design, optimizing flight routes—the internet’s own emissions may continue creeping up. “If we’re using AI, and AI is being sold as pro-environment, we’re going to increase our use of AI throughout all sectors,” Gabrielle Samuel, a lecturer in environmental justice and health at King’s College London, told me.

Perhaps the most troubling aspect of AI’s carbon footprint is that, because the internet’s emissions have always been relatively small, almost no one is prepared to deal with them. The Inflation Reduction Act, the historic climate law Congress passed last year, doesn’t mention the web; activists don’t chain themselves to data centers; we don’t teach children to limit their search queries or chatbot conversations for the sake of future generations. With so little research or attention given to the issue, it’s not clear that anybody should. Ideally AI, like coal-fired power plants and combustion-engine cars, would face the economic and regulatory pressure to become emissions-free. Similar to how the EPA sets emissions requirements for new vehicles, the government could create ratings or impose standards for AI model efficiency and the industry’s use of renewable-energy sources, Luccioni said. If a user asks Google to decide whether a photo is of a cat or a dog, a less energy-intensive model that is 96 percent accurate, instead of 98 percent, might suffice, Devesh Tiwari, an engineer at Northeastern University, has shown. And does the world really need AI-powered beer brewing?

The internet can appear untethered from the physical world: digital and virtual, two-dimensional, in cyberspace instead of material space. A chatbot is not visibly plugged into a smokestack belching gray plumes, does not secrete the acrid smell of gasoline from an exhaust pipe. But the data centers and computer chips it connects to, and the electricity they consume and the carbon that consumption releases, are of our world—and our problem.

Matteo Wong is an associate editor at The Atlantic.