
When you ask an AI model to draft a social media post or generate a Picasso-inspired animal graphic, there is a small but quantifiable energy cost, along with emissions released into the atmosphere. Because each individual query typically consumes less energy than briefly running a deep fryer, the effect can seem negligible. But as more people rely on AI tools, these cumulative impacts become increasingly significant. According to Morgan Stanley, power demand from generative AI will increase at an annual average of 70% through 2027, driven mostly by the growth of AI-dependent data centers.
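To get a feel for how fast a 70% annual rate compounds, consider a quick back-of-envelope calculation, assuming for illustration that the rate holds over three full years:

$$1.70^{3} \approx 4.9$$

In other words, demand would roughly quintuple over that window.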
AI's growing dependence on energy-intensive processes is not only reshaping infrastructure at an unprecedented scale but also challenging the global push toward sustainability. As AI systems become more pervasive, their energy consumption is accelerating, and with it come questions about the long-term implications for the environment, energy resources, and the technology itself. From data centers to inference workloads, the power demands of AI are poised to grow exponentially in the coming years, and experts are taking note.
Some of AI's best minds gathered in Cambridge, Massachusetts, May 5-7, 2025, at the EmTech AI Conference to discuss the sustainability implications of artificial intelligence. Editors from MIT Technology Review explored the scope of AI's energy impact, where it is heading, and why the decisions made today must balance innovation with responsibility. Ultimately, these experts warn that we are at a critical juncture, one that will determine whether future AI innovations can coexist with global efforts to combat climate change.

MIT Technology Review, from left to right: Will Douglas Heaven, Senior Editor; James O'Donnell, AI Reporter; Mat Honan, Editor in Chief
The Growing Energy Demands of AI
AI's energy usage comes from two primary operations: training models and running inference. Training a complex AI model, such as the large language models behind ChatGPT, Gemini, Llama, and Claude, or an image generation system, requires massive computational power and energy. Training runs for advanced models can occupy fleets of high-performance GPUs for weeks or even months as they churn through vast datasets. The carbon costs of this process are significant, and they represent only part of the challenge.
Inference, the step in which AI models respond to user inputs or requests (e.g., generating text, answering search queries, or creating art), is set to consume an even greater share of energy in the near future. Unlike training, inference happens on-demand and at scale. Every interaction with an AI system, whether a personal assistant like ChatGPT or a visual design tool, requires computing resources, creating a cumulative energy burden as these interactions multiply.
"We're moving to this world where multiple AI models are answering billions of queries daily, and those inferences are already a massive and growing energy hog," explained Mat Honan, MIT Technology Review's editor in chief. "And if we want to keep up with the expected demand, we're going to need a lot more power than we currently have."
The magnitude of this shift cannot be overstated: as AI tools become more integrated into consumer and enterprise systems, the energy used to power them will continue to grow, raising complex questions about how much demand our infrastructure can handle and what that means for the environment. So, where will this power come from?
Just How Much Energy Do We Need?
Pinning down just how much energy will be required to meet future AI demand is tricky. Even the experts are largely speculating.
"I can't emphasize enough, we truly don't know anything about how much energy is used up when you ask a query or generate a video," MIT Technology Review AI Reporter, James O'Donnell said. "We make estimates about how much energy is used specifically in inferencing, and there's all sorts of ways that you can do that, but it's all very indirect."
And those back-of-envelope numbers only cover individual queries. At the data-center level, it is still being worked out which energy sources will power this AI revolution.
The Role of Data Centers: Feeding AI's Energy Hunger
Data centers serve as the backbone of AI systems. These sprawling facilities house the servers and processors that enable AI models to function. However, their scale, complexity, and energy consumption are raising alarms globally.
Take Nevada, for example, where tech giants like Google and Apple have built vast data centers in what is already one of America's driest states. These facilities consume massive amounts of energy to keep high-performance servers operating at optimal temperatures. Cooling technologies are vital in these desert locations, requiring significant water and power to sustain operations. Nearly half of a data center's energy demand can come from the need to cool it.
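The industry's standard yardstick for this overhead is power usage effectiveness (PUE), the ratio of a facility's total energy draw to the energy that actually reaches its IT equipment:

$$\mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}}$$

A facility where cooling and other overhead consume nearly as much energy as the servers themselves has a PUE approaching 2.0; the most efficient hyperscale sites report figures closer to 1.1.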

Apple solar data center, Nevada
The problem isn't limited to Nevada. In Louisiana, Meta is constructing its largest data center to date, a project intended to support its AI ambitions. However, this venture relies heavily on power from natural gas plants, undermining clean energy goals. Such developments reveal troubling contradictions: while companies acknowledge the importance of sustainability, they remain tied to energy sources that perpetuate dependence on fossil fuels. "Renewables, for the most part, lack the kind of consistency a data center requires," Honan pointed out.
A surge in AI-driven electricity demand also places additional pressure on power grids, inadvertently accelerating the buildout of natural gas-based energy facilities. This reliance on fossil fuels, driven in part by AI's electricity needs, presents a serious challenge to climate objectives, as industries struggle to scale renewable solutions quickly enough to meet demand.
Can Nuclear Power Solve AI's Energy Crisis?
To meet the mounting energy demand posed by AI, many are turning to nuclear power as a potential solution. Viewed as a cleaner alternative to fossil fuels, nuclear energy offers a steady, reliable source of electricity that could, in theory, power future AI operations sustainably. However, there are significant barriers to this transition. "While nuclear may sound great, it takes years, sometimes a decade or more, to bring a new plant online," Honan remarked.
Additionally, policy hurdles, public resistance, and high costs further complicate nuclear adoption. As a result, it may be years, if not decades, before nuclear energy can meaningfully contribute to fueling AI's growth at scale. While the idea holds promise, it is far from an immediate solution to AI's energy dilemma.

Source: Nuclear Innovation Alliance
Companies are investing heavily in nuclear options as part of their long-term plans to decarbonize data infrastructure, and the push is especially visible overseas. China's first nuclear power plant was connected to the grid in 1991; in the decades since, the country has built the third-largest nuclear fleet globally, behind only France and the United States. This year, China is expected to bring four large reactors online, with several more planned for commissioning in 2026.
Why AI's Energy Problem Matters
At present, AI's energy footprint constitutes a relatively small portion of global electricity use. However, its trajectory is concerning. Analysts warn that if unchecked, AI's growing power consumption could have a cascading effect on broader electrification plans and efforts to combat climate change. The energy requirements of data centers represent a microcosm of the broader electrification challenges humanity faces: as society adopts new technologies reliant on electricity, ensuring sustainable energy practices becomes increasingly critical.
Moreover, AI's energy needs expose existing inequities in resource allocation. Communities hosting energy-intensive data centers may bear disproportionate environmental impacts, from water shortages to higher emissions associated with local power generation. This creates a moral imperative for policymakers and corporations to adopt equitable, sustainable strategies.
The Case for Optimism: Toward Energy-Efficient AI
Despite these challenges, there are reasons to feel optimistic about AI's energy future. Emerging technologies and innovations could significantly reduce the energy intensity of AI systems in years to come. Improvements in software and hardware, including the development of more efficient processors, are already underway.
"On the software side, there's always going to be efficiencies made in the way AI models themselves are made," MIT Technology Review Senior Editor, Will Douglas Heaven, said. "We can be smarter about training with more curated, specialized data where you don't need to throw everything in it and then run models for weeks and weeks on it."
Companies can also adopt better monitoring and reporting practices to enhance accountability. Transparent reporting on AI's energy use, coupled with clear targets for reducing environmental impact, could represent a critical step toward aligning technological innovation with climate goals. Renewable energy investments and technologies like liquid cooling are further reasons for cautious optimism, as they could minimize the environmental strain of data centers.
Lastly, the increasing emphasis on sustainable infrastructure aligns AI innovation with global electrification policies. Governments and the private sector alike are exploring pathways to design AI systems that contribute to renewable energy networks rather than strain them. According to AON, the future of AI in renewable energy is promising, with advancements and applications poised to reshape the sector.

Source: AON Website
Looking Forward
AI holds the potential to transform society in unprecedented ways. Yet, its rapid growth is a double-edged sword. The power-hungry nature of AI systems presents a litmus test for humanity's ability to manage technological progress responsibly while safeguarding our planet.
As the energy demands of AI mount, urgent steps must be taken to align innovation with sustainability. Solutions such as energy-efficient systems, equitable infrastructure planning, and investments in renewable energy are critical to mitigating AI's environmental impact. Importantly, collaboration between industry leaders, governments, and scientists will be essential to ensuring that AI's immense potential benefits humanity without undermining our shared future.
The coming years will define not just the trajectory of AI but also the legacy of our response to the challenges it poses to our planet. By addressing AI's energy consumption today, we can help ensure that the technology powering the future is as sustainable as it is transformative.
Feature image compliments of MIT EmTech