AI boom pushing power grid to limits as data centers triple energy demand

Posted by Kurt Knutsson, CyberGuy Report | Fox News



Every time you ask ChatGPT a question, to generate an image or let artificial intelligence summarize your email, something big is happening behind the scenes. Not on your device, but in sprawling data centers filled with servers, GPUs and cooling systems that require massive amounts of electricity. 

The modern AI boom is pushing our power grid to its limits. ChatGPT alone processes roughly 1 billion queries per day, each requiring data center resources far beyond what’s on your device.

In fact, the energy needed to support artificial intelligence is rising so quickly that it has already delayed the retirement of several coal plants in the U.S., with more delays expected. Some experts warn that the AI arms race is outpacing the infrastructure meant to support it. Others argue it could spark long-overdue clean energy innovation.

AI isn’t just reshaping apps and search engines. It’s also reshaping how we build, fuel and regulate the digital world. The race to scale up AI capabilities is accelerating faster than most infrastructure can handle, and energy is becoming the next major bottleneck.


Here’s a look at how AI is changing the energy equation, and what it might mean for our climate future.


ChatGPT on a computer (Kurt “CyberGuy” Knutsson)

Why AI uses so much power, and what drives the demand

Running artificial intelligence at scale requires enormous computational power. Unlike traditional internet activity, which mostly involves pulling up stored information, AI tools perform intensive real-time processing. Whether training massive language models or responding to user prompts, AI systems rely on specialized hardware like GPUs (graphics processing units) that consume far more power than legacy servers. GPUs are designed to handle many calculations in parallel, which is perfect for the matrix-heavy workloads that power generative AI and deep learning systems.

To give you an idea of scale: one Nvidia H100 GPU, commonly used in AI training, consumes up to 700 watts on its own. Training a single large AI model like GPT-4 may require thousands of these GPUs running continuously for weeks. Multiply that across dozens of models and hundreds of data centers, and the numbers escalate quickly. A traditional data center rack might use around 8 kilowatts (kW) of power. An AI-optimized rack using GPUs can demand 45-55 kW or more. Multiply that across an entire building or campus of racks, and the difference is staggering.
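If you want a rough sense of how those numbers multiply, here is a quick back-of-envelope sketch in Python. The 700-watt and rack figures come from the paragraph above; the cluster size and training length are illustrative assumptions, not reported values for any specific model.

```python
# Back-of-envelope training energy estimate using the figures cited above.
# The cluster size and duration are illustrative assumptions, not reported numbers.

H100_WATTS = 700            # per-GPU draw cited above
ASSUMED_GPU_COUNT = 10_000  # hypothetical cluster size for a large model
ASSUMED_WEEKS = 4           # hypothetical continuous training run

hours = ASSUMED_WEEKS * 7 * 24
gpu_energy_kwh = H100_WATTS * ASSUMED_GPU_COUNT * hours / 1_000  # watts -> kilowatt-hours

print(f"GPU-only training energy: {gpu_energy_kwh:,.0f} kWh "
      f"(about {gpu_energy_kwh / 1e6:.1f} GWh), before cooling and networking")

# Rack-level comparison from the article: ~8 kW traditional vs. 45-55 kW AI-optimized
traditional_rack_kw, ai_rack_kw = 8, 50
print(f"An AI rack draws roughly {ai_rack_kw / traditional_rack_kw:.0f}x "
      f"a traditional rack ({ai_rack_kw} kW vs. {traditional_rack_kw} kW)")
```

Under those assumptions, the GPUs alone burn through several gigawatt-hours for a single training run, and that is before counting cooling and networking overhead.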


Cooling all that hardware adds another layer of energy demand. Keeping AI servers from overheating accounts for 30-55% of a data center’s total power use. Advanced cooling methods like liquid immersion are helping, but scaling those across the industry will take time.

On the upside, AI researchers are developing more efficient ways to run these systems. One promising approach is the “mixture of experts” model architecture, which activates only a portion of the full model for each task. This method can significantly reduce the amount of energy required without sacrificing performance.

How much power are we talking about?

In 2023, global data centers consumed about 500 terawatt-hours (TWh) of electricity. That is enough to power every home in California, Texas and Florida combined for an entire year. By 2030, the number could triple, with AI as the main driver.

To put it into perspective, the average home uses about 30 kilowatt-hours per day. One terawatt-hour is a billion times larger than a kilowatt-hour. That means 1 TWh could power 33 million homes for a day. 
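For readers who want to double-check the conversion, this short sketch runs the same arithmetic, using the 30 kWh-per-day household figure cited above.

```python
# Unit check for the figures above: 1 TWh equals 1 billion kWh.
KWH_PER_TWH = 1_000_000_000
HOME_KWH_PER_DAY = 30        # average household use cited above

homes_for_one_day = KWH_PER_TWH / HOME_KWH_PER_DAY
print(f"1 TWh covers about {homes_for_one_day / 1e6:.0f} million homes for a day")

global_dc_twh_2023 = 500     # global data center consumption cited for 2023
home_years = global_dc_twh_2023 * KWH_PER_TWH / (HOME_KWH_PER_DAY * 365)
print(f"500 TWh covers roughly {home_years / 1e6:.0f} million homes for a full year")
```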

Data center (Kurt “CyberGuy” Knutsson)


AI’s energy demand is outpacing the power grid

The demand for AI is growing faster than the energy grid can adapt. In the U.S., data center electricity use is expected to surpass 600 TWh by 2030, tripling current levels. Meeting that demand requires the equivalent of adding 14 large power plants to the grid. Large AI data centers can each require 100–500 megawatts (MW), and the largest facilities may soon exceed 1 gigawatt (GW), which is about as much as a nuclear power plant or a small U.S. state. A single 1 GW data center could consume more power than the entire city of San Francisco. Multiply that by a few dozen campuses across the country, and you start to see how quickly this demand adds up.
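To put those megawatt and gigawatt figures in the same frame as the 600 TWh projection, here is a rough sketch. It assumes a campus running flat out around the clock, which overstates real-world usage, so treat it as an upper bound rather than a forecast.

```python
# Rough scale check for the 2030 projection above.
# Assumes continuous full-power operation, which overstates real-world load.

HOURS_PER_YEAR = 8_760
facility_gw = 1.0                    # a single gigawatt-scale campus
facility_twh_per_year = facility_gw * HOURS_PER_YEAR / 1_000

projected_us_dc_twh_2030 = 600       # U.S. projection cited above
equivalent_campuses = projected_us_dc_twh_2030 / facility_twh_per_year

print(f"A {facility_gw:.0f} GW campus running nonstop uses about "
      f"{facility_twh_per_year:.1f} TWh per year")
print(f"600 TWh per year equals roughly {equivalent_campuses:.0f} such campuses at full tilt")
```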

To keep up, utilities across the country are delaying coal plant retirements, expanding natural gas infrastructure and shelving clean energy projects. In states like Utah, Georgia and Wisconsin, energy regulators have approved new fossil fuel investments directly linked to data center growth. By 2035, data centers could account for 8.6% of all U.S. electricity demand, up from 3.5% today.

Despite public pledges to support sustainability, tech companies are inadvertently driving a fossil fuel resurgence. For the average person, this shift could increase electricity costs, strain regional energy supplies and complicate state-level clean energy goals.

Power grid facility (Kurt “CyberGuy” Knutsson)

Can big tech keep its green energy promises?

Tech giants Microsoft, Google, Amazon and Meta all claim they are working toward a net-zero emissions future. In simple terms, this means balancing the amount of greenhouse gases they emit with the amount they remove or offset, ideally bringing their net contribution to climate change down to zero.

These companies purchase large amounts of renewable energy to offset their usage and invest in next-generation energy solutions. For example, Microsoft has a contract with fusion start-up Helion to supply clean electricity by 2028.

However, critics argue these clean energy purchases do not reflect the reality on the ground. Because the grid is shared, even if a tech company buys solar or wind power on paper, fossil fuels often fill the gap for everyone else.

Some researchers say this model is more beneficial for company accounting than for climate progress. While the numbers might look clean on a corporate emissions report, the actual energy powering the grid still includes coal and gas. Microsoft, Google and Amazon have all pledged to power their data centers with 100% renewable energy, but when renewables aren’t available, the shared grid falls back on fossil fuels.

Some critics argue that voluntary pledges alone are not enough. Unlike traditional industries, there is no standardized regulatory framework requiring tech companies to disclose detailed energy usage from AI operations. This lack of transparency makes it harder to track whether green pledges are translating into meaningful action, especially as workloads shift to third-party contractors or overseas operations.

A wind energy farm (Kurt “CyberGuy” Knutsson)


The future of clean energy for AI and its limits

To meet soaring energy needs without worsening emissions, tech companies are investing in advanced energy projects. These include small nuclear reactors built directly next to data centers, deep geothermal systems and nuclear fusion.

While promising, these technologies face enormous technical and regulatory hurdles. Fusion, for example, has never reached commercial break-even, meaning no fusion plant has yet produced more usable energy than it consumes to operate. Even the most optimistic experts say we may not see scalable fusion before the 2030s.

Beyond the technical barriers, many people have concerns about the safety, cost and long-term waste management of new nuclear systems. While proponents argue these designs are safer and more efficient, public skepticism remains a real hurdle. Community resistance is also a factor. In some regions, proposals for nuclear microreactors or geothermal drilling have faced delays due to concerns over safety, noise and environmental harm. Building new data centers and associated power infrastructure can take up to seven years, due to permitting, land acquisition and construction challenges.

Google recently activated a geothermal project in Nevada, but it only generates enough power for a few thousand homes. The next phase may be able to power a single data center by 2028. Meanwhile, companies like Amazon and Microsoft continue building sites that consume more power than entire cities.


Will AI help or harm the environment?

This is the central debate. Advocates argue that AI could ultimately help accelerate climate progress by optimizing energy grids, modeling emissions patterns and inventing better clean technology. Microsoft and Google have both cited these uses in their public statements. But critics warn that the current trajectory is unsustainable. Without major breakthroughs or stricter policy frameworks, the energy cost of AI may overwhelm climate gains. A recent forecast estimated that AI could add 1.7 gigatons of carbon dioxide to global emissions between 2025 and 2030, roughly 4% more than the entire annual emissions of the U.S.

Water use, rare mineral demand and land-use conflicts are also emerging concerns as AI infrastructure expands. Large data centers often require millions of gallons of water for cooling each year, which can strain local water supplies. The demand for critical minerals like lithium, cobalt and rare earth elements — used in servers, cooling systems and power electronics — creates additional pressure on supply chains and mining operations. In some areas, communities are pushing back against land being rezoned for large-scale tech development.

Rapid hardware turnover is also adding to the environmental toll. As AI systems evolve quickly, older GPUs and accelerators are replaced more frequently, creating significant electronic waste. Without strong recycling programs in place, much of this equipment ends up in landfills or is exported to developing countries.

The question isn’t just whether AI can become cleaner over time. It’s whether we can scale the infrastructure needed to support it without falling back on fossil fuels. Meeting that challenge will require tighter collaboration between tech companies, utilities and policymakers. Some experts warn that AI could either help fight climate change or make it worse, and the outcome depends entirely on how we choose to power the future of computing.


Kurt’s key takeaways

AI is revolutionizing how we work, but it is also transforming how we use energy. Data centers powering AI systems are becoming some of the world’s largest electricity consumers. Tech companies are betting big on futuristic solutions, but the reality is that many fossil fuel plants are staying online longer just to meet AI’s rising energy demand. Whether AI ends up helping or hurting the climate may depend on how quickly clean energy breakthroughs catch up and how honestly we measure progress.


Is artificial intelligence worth the real-world cost of a fossil fuel resurgence? Let us know your thoughts by writing to us at Cyberguy.com/Contact

For more of my tech tips and security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter

Ask Kurt a question or let us know what stories you’d like us to cover

Follow Kurt on his social channels


Copyright 2025 CyberGuy.com. All rights reserved.



