Debate continues over the pros and cons of artificial intelligence and the guardrails needed to manage its evolution, yet it’s clear that the technology’s expansion is accelerating and will demand ever more energy. Data centers, the facilities that house the physical infrastructure for computing power, are proliferating, especially in areas where high-speed internet is available.
“The computing power needed for the applications of what people call AI is immense. These chips are pretty power hungry, and companies like Google, Microsoft, and Amazon are spending billions of dollars every quarter on hardware to meet the demand,” said Nicholas “Nick” Tsinoremas, vice provost for Research Computing and Data and founding director of the University of Miami Frost Institute for Data Science and Computing (IDSC).
Both Tsinoremas and David Kelly, economics professor and co-chair of the Miami Herbert Business School Sustainable Business Research Cluster, emphasized that the power needed for “cloud computing” is anything but an airy abstraction.
“There’s a kind of misnomer, because the ‘cloud’ is this huge, physical plant—it’s not a cloud at all,” Kelly explained. “Anything done on the cloud is basically powered by data centers. Any time you say, ‘spell check,’ for a Google Doc, the electrons are beamed all the way to the data center. There’s a big computer that does the spell check, and the answer is beamed back.”
And Tsinoremas noted that, by some calculations, a ChatGPT query takes 15 times the computing power of a Google search.
“So, it’s just a ginormous amount of electricity required for these centers. It’s cloud computing, but it’s also crypto and AI—the three of them together are the big consumers,” said Kelly, also the academic director of the Master of Science in Sustainable Business.
Kelly cited data from 2023 indicating that data centers are consuming 4.4 percent of all electricity in the country, or about 176 terawatt-hours (TWh). By 2028, the centers are projected to consume as much as 12 percent of all electricity, or 580 TWh.
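As a back-of-envelope check, the cited figures can be cross-multiplied: the 2023 share and TWh value imply a total for U.S. electricity consumption, and comparing the two TWh figures gives the projected growth in data-center demand. The sketch below uses only the numbers quoted in the article; the derived values (total consumption, growth factor) are illustrative inferences, not additional reported data.

```python
# Figures as cited in the article
dc_2023_twh = 176      # data-center consumption, 2023 (TWh)
dc_2023_share = 0.044  # 4.4 percent of U.S. electricity
dc_2028_twh = 580      # projected data-center consumption, 2028 (TWh)

# Implied total U.S. electricity consumption in 2023
total_2023_twh = dc_2023_twh / dc_2023_share

# Growth factor in data-center demand over roughly five years
growth = dc_2028_twh / dc_2023_twh

print(f"Implied total U.S. consumption (2023): {total_2023_twh:.0f} TWh")
print(f"Data-center demand growth, 2023 to 2028: {growth:.1f}x")
```

The implied 2023 total of roughly 4,000 TWh is consistent with the article’s percentages, and the projection amounts to more than a tripling of data-center electricity demand in about five years.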
Tsinoremas referred to the current trend of building specialized data centers for AI as the “second boom.”
“The proliferation of the ‘mining farms’ needed for Bitcoin and crypto—also extremely power hungry—began in 2019-2020, though that’s more of a niche,” he said. He pointed out that in 2019, Bitcoin mining used the same amount of power as the country of Argentina. “And with AI, we’re talking about the need for a lot more.”
To supply the energy needed to power the data centers, the big firms—Microsoft, Amazon, and Google—secure power purchase agreements with utility companies, Kelly explained. These agreements affect local communities.
“You can’t just build a data center and plug it into the wall; you have to go to the electric power utility and develop an agreement for how they’re going to come up with the extra electricity that this data center needs,” Kelly said.
He referenced Microsoft’s initiative to reopen the nuclear facility at Three Mile Island as an example.
The plant’s entire output is to power data centers. But the data centers do not connect directly to the Three Mile Island generators, Kelly explained. Under the agreement, they draw from the grid supplied by Constellation Energy, a competitive supplier of electricity, natural gas, and renewable energy for the area.
“So, they restart Three Mile Island, supply the extra energy to the grid, and then the data center soaks that up,” Kelly said.
Both specialists caution that the trend to simply build more and more data centers is not sustainable.
“We do need this computing power because we are becoming more and more digital. But we can’t just scale these things; that’s unsustainable,” Tsinoremas said. “We also need better software and hardware to reduce the demand for the power that we will need.”
He referenced software innovations announced by the Chinese firm DeepSeek that purportedly require one-fifth the computing power of ChatGPT, its U.S. competitor.
Tsinoremas highlighted that while some companies such as Nvidia are exploring these innovations, private firms generally lack the incentive to invest long-term in research.
“All the innovation and research over the years came from academic research. Where was the Internet developed? It wasn’t developed by a private company,” he said. “These companies live quarter to quarter, and some of the research may take years—and nobody can wait that long in the private sector.
“There has to be some innovations where they reduce the power requirements; it can’t just be more and more building because that’s not sustainable—and there’s going to be a breaking point,” he added.
That breaking point would likely mean higher utility costs for consumers.
“There’s a huge amount of power that will be needed in the next few years, and if we build enough power to supply that electricity, then rates should stay the same,” Kelly said.
“But what happens if we can’t, if we’re unable to build them and connect them to the grid with all the transmission lines and get the permits, and everything else?” he pondered. “If we can’t do that, then there’s more demand, and the price has to go up.”