- AI’s rapid growth is driving increasing energy consumption
- Telcos and hyperscalers will add significantly to this surge in power demand and emissions
- Current solutions, like efficient GPUs and renewable energy, are not enough to meet the growing power demands of AI
Artificial intelligence (AI) is the tech world’s golden child, promising everything from curing diseases to revolutionizing our global networks. But beneath the hype lies a nagging problem: the energy required to power AI’s ascension is growing faster than solutions to manage it. Telcos will need to come up with a better fix.
AI doesn’t just think fast—it eats electricity for breakfast. With the global AI market expected to grow at a breakneck pace, that hunger will only increase. Data centers already consume about 1% of the world’s electricity, and that figure is expected to skyrocket as AI use scales.
With power consumption comes emissions. Today, training a single large AI model can emit as much carbon as five cars over their lifetimes.
So as the tech industry talks up AI sustainability, it’s worth asking whether its efforts match the scale of the problem.
Sure, progress is happening. GPUs, the backbone of AI computing, are becoming more energy efficient. Nvidia, for instance, has touted making its chips better at delivering performance per watt. But what does that mean when the demand for these chips is growing exponentially? It’s like making cars slightly less polluting while simultaneously putting millions more on the road. The net result isn’t exactly climate-friendly.
The truth is, many of the industry's sustainable solutions feel like we're trying to bail out a sinking ship with a teaspoon.
Telcos sit at the heart of the AI power demand issue, acting as both enablers of AI-driven innovation and significant energy consumers themselves. As they increasingly rely on public cloud providers like AWS, Google Cloud and Microsoft Azure for AI deployments and run more of their own on-prem data centers for AI, telcos also inherit the cloud industry's substantial energy demands. They often tout their sustainability goals without addressing the AI-specific challenges posed by these hyperscaler partnerships.
Many telcos have committed to reducing their carbon footprints, with operators like AT&T and Deutsche Telekom adopting renewable energy strategies for their data centers and networks. Buying carbon offset credits is another common corporate sustainability tool, but shuffling emissions around like Monopoly money doesn’t exactly seem like a breakthrough. The climate isn’t impressed by those receipts. Maybe a bigger chunk of the industry’s R&D budget would help?
Telcos are also integrating AI for network optimization, leveraging intelligent routing and predictive maintenance to minimize energy use across their sprawling infrastructure. But these steps, while meaningful, mostly focus on improving efficiency rather than addressing the root cause of their growing energy appetite.
It's a bit like comedian Marc Maron's sarcastic take on consumer environmentalism: we're not more upset about the looming environmental apocalypse because we somehow believe that we're doing everything we can to stop it. "Think about it," Maron's bit goes, "we brought our own bags to the supermarket... Yeah, that's about it… And it just wasn't enough, it turns out.
"But maybe the 'no straw thing' will help."
Even as the environmental movement has seen bigger wins—electric vehicles, more plant-based diets and global agreements like the Paris Accord—emissions aren’t falling fast enough. Adding AI’s power demands to the equation makes the challenge even steeper.
Some are betting on nuclear energy as the savior of the AI age. It’s an intriguing idea—clean, high-capacity and theoretically capable of meeting the power demands of massive AI workloads. But nuclear power has its own hurdles: regulatory red tape, public skepticism and timelines that stretch into decades. We are already deep into climate catastrophe; by the time nuclear power becomes viable, it may be too late.
Other solutions? Renewable energy for data centers, like Google’s effort to power its operations entirely with wind and solar, is commendable but limited by geography and infrastructure. Some startups are experimenting with more radical ideas, like liquid cooling for servers to reduce energy waste.
Still, these aren’t breakthroughs that will close the gap—they’re tweaks at the margins.
The conversation around AI sustainability feels eerily familiar: a series of small victories that don’t address the systemic problems. I’m no engineer, and I know it’s easier said than done, but if AI is to fulfill its promises without destroying the planet, we need solutions that match its scale of impact. Paper straws and tote bags won’t cut it.
Op-eds from industry experts, analysts or our editorial staff are opinion pieces that do not represent the opinions of Fierce Network.