
- AI data centers are overwhelming national grids and driving energy costs higher
- Companies are turning to nuclear options to sustain power-hungry AI workloads
- OpenAI urges the government to massively expand national energy generation capacity
Microsoft CEO Satya Nadella has drawn attention to a less discussed obstacle in the AI race – a shortage not of processors but of power.
Speaking on a podcast alongside OpenAI CEO Sam Altman, Nadella said Microsoft has “a bunch of chips sitting in inventory that I can’t plug in.”
“The biggest issue we are now having is not a compute glut, but it’s power — it’s sort of the ability to get the builds done fast enough close to power,” Nadella added, “in fact, that is my problem today. It’s not a supply issue of chips; it’s actually the fact that I don’t have warm shells to plug into.”
Energy limitations reshape the AI landscape
Nadella explained that while the supply of GPUs is currently sufficient, the lack of suitable facilities to power them has become a critical issue.
In this context, he described “warm shells” as empty data center buildings ready to house hardware but dependent on access to adequate energy infrastructure.
The explosive growth of AI tools has exposed a structural vulnerability: demand for computing capacity has outpaced the industry’s ability to construct and power new data center sites.
Energy planning has become a pressing issue across the technology sector, and even a company with Microsoft’s vast resources is struggling to keep up.
In response, some firms, including major cloud providers, are exploring nuclear power options to sustain their rapid expansion.
Nadella’s comments reflect a broader concern that AI infrastructure is stretching national electricity grids to their limits.
As data center construction accelerates across the United States, power-intensive AI workloads have already begun influencing consumer electricity prices.
OpenAI has even urged the US government to commit to building 100 gigawatts of new power generation capacity annually.
It argues that energy security is becoming as important as semiconductor access in the competition with China.
Analysts have pointed out that Beijing’s head start in hydropower and nuclear development could give it an advantage in operating AI infrastructure at scale.
Altman also hinted at a potential shift toward more capable consumer devices that could one day run advanced models such as GPT-5 or GPT-6 locally.
If chip innovation enables such low-power local systems, much of the projected demand for cloud-based AI processing could evaporate.
This possibility presents a long-term risk for companies investing heavily in massive data center networks.
Some experts believe such a shift could accelerate the bursting of what they describe as an AI-driven economic bubble, putting trillions of dollars in market value at risk should expectations collapse.
Via TomsHardware