"AI Consumes Ten Times More Energy Than Search Engines"
Stargate Project Places Heavy Burden on Power Grid
Google, Microsoft, and Amazon Accelerate Power Plant Development
"We will build our own power generation facilities right next to the artificial intelligence (AI) factories (data centers)."
In a speech at the World Economic Forum (WEF) held in Davos, Switzerland, last January, U.S. President Donald Trump said that AI data centers require massive amounts of electricity. Earlier that month, on January 21, President Trump announced the $500 billion (approximately 720 trillion KRW) AI data center project "Stargate" at the White House, involving OpenAI, the developer of ChatGPT, Japan's SoftBank, and software company Oracle. If the plan proceeds as intended, Stargate will be the world's largest data center project. The first data center is expected to be built in Abilene, Texas.
Since the Stargate project was unveiled, the first question raised has been how to supply the electricity needed for the data centers. Bloomberg reported that SB Energy, a SoftBank subsidiary, will provide digital infrastructure and power, with some of the power sources being solar and batteries. In response, an OpenAI spokesperson stated that they are considering various options that could help the U.S. power grid, ranging from nuclear power to batteries.
"Global Data Center Electricity Demand Comparable to India"
As AI spreads, how to supply the enormous electricity required for data centers has become a challenge for countries worldwide. At the Davos Forum, President Trump said, "The biggest problem is that to make the AI we want a reality, we need twice the energy that the U.S. currently has." According to the International Energy Agency (IEA), ChatGPT requires 2.9 watt-hours (Wh) of electricity to process one question, while a Google search uses 0.3 Wh. In other words, an AI query consumes roughly ten times the energy of a search.
The Nvidia AI graphics processing unit (GPU) H100, widely used in AI data centers, consumes 700 watts (W) of electricity, which is eight times that of a 60-inch flat-screen TV. xAI, founded by Elon Musk, reportedly uses 100,000 H100 units. Arithmetically, the GPUs used by xAI alone require 70 MW of power, which is roughly equivalent to the electricity consumption of 23,000 households. Nvidia's latest GPU, the B200, consumes nearly twice as much power as the H100.
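The energy comparisons above reduce to simple arithmetic. The sketch below rechecks them using only the numbers quoted in the article; note that the final per-household figure is an inference from the article's 23,000-household equivalence, not a number the article states directly.

```python
# Back-of-the-envelope checks of the energy figures quoted in this article.
# All inputs are the article's own numbers.

CHATGPT_WH_PER_QUERY = 2.9   # IEA estimate cited above
SEARCH_WH_PER_QUERY = 0.3    # Google search, per the same source
ratio = CHATGPT_WH_PER_QUERY / SEARCH_WH_PER_QUERY
print(f"AI vs. search energy per query: {ratio:.1f}x")  # ~9.7x, i.e. roughly 10x

H100_POWER_W = 700           # per-GPU draw cited for Nvidia's H100
NUM_GPUS = 100_000           # H100 count reported for xAI
total_mw = H100_POWER_W * NUM_GPUS / 1_000_000
print(f"xAI GPU fleet: {total_mw:.0f} MW")              # 70 MW, as stated

HOUSEHOLDS = 23_000          # the article's household equivalence
implied_kw = total_mw * 1_000 / HOUSEHOLDS
# ~3 kW per household is what the comparison implicitly assumes
print(f"Implied average draw per household: {implied_kw:.1f} kW")
```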
Goldman Sachs released a forecast report in May last year predicting that global data center electricity demand will increase by 160% by 2030. Currently, data centers consume about 1-2% of global electricity, but this is expected to rise to 3-4% by 2030. Goldman Sachs projected that in the U.S., data centers will account for 8% of total electricity demand. To meet this demand, U.S. power companies would need to invest $50 billion (approximately 72 trillion KRW).
The U.S. Department of Energy (DOE) outlook aligns with this. In December last year, the DOE cited the Lawrence Berkeley National Laboratory's "2024 U.S. Data Center Energy Usage Report," stating that data center loads have tripled over the past decade and are expected to double or triple again by 2028. President Trump's statement that "twice the energy is needed" appears to reference this report.
The surge in electricity demand for AI data centers is a global issue. According to Bloomberg, Sweden's data center electricity demand is expected to double by 2030. UK energy company National Grid predicted that data center demand will increase by 500% over the next decade due to the AI boom. Bloomberg forecasts that by 2034, global data center electricity consumption will reach 1,580 TWh, comparable to the entire electricity usage of India.
U.S. Big Tech Turns Back to Fossil Fuels
The Stargate project is expected to place a significant burden on the U.S. power grid. Shortly after taking office, President Trump signed an executive order declaring an "energy emergency," citing the surge in electricity demand for data centers and digital infrastructure as a major reason. It appears that President Trump will use this executive order to address the data center electricity demand issue.
Attention is inevitably focused on power supply. During former President Joe Biden's administration, there was a strong movement to use eco-friendly renewable energy sources such as solar and wind power for data center electricity. In contrast, the second Trump administration is expected to consider various energy sources, including nuclear and fossil fuels. Barron's forecasted, "Whether nuclear or fossil fuel power plants, AI infrastructure construction will accelerate."
U.S. big tech companies have begun showing renewed interest in fossil fuels to cope with the surging electricity demand. In December last year, Meta announced plans to build a 4 million square foot data center in Louisiana, which will be powered by a natural gas power plant. According to The Washington Post (WP), Microsoft (MS) is seeking gas-fired power for a $3.3 billion data center it is building in Wisconsin. Google, MS, and Amazon have also signed power purchase agreements with nuclear power plants to supply electricity for their data centers.
U.S. oil and gas companies are rushing to build gas-fired power plants. Chevron announced plans to build a 4 GW gas power plant with Engine No. 1 to supply power to data centers in the southeastern, midwestern, and western U.S. ExxonMobil revealed plans late last year to develop a 1.5 GW gas-fired power plant to supply electricity to data centers.
Dan Brouillette, who served as U.S. Secretary of Energy during Trump's first term, told WP, "The important point is that current technology cannot and will not provide the amount of energy they want from renewable sources."
China's DeepSeek Shock Leaves Power Demand Forecasts Unchanged
Last January, Chinese startup DeepSeek caused a global stir by developing and announcing "DeepSeek R1," which performs similarly to OpenAI's ChatGPT but was developed at significantly lower cost. It also reportedly consumes one-tenth to one-fortieth of the electricity used by comparable U.S. AI models.
After DeepSeek was unveiled, some questioned whether the electricity demand for AI had been exaggerated. However, there is also considerable pushback arguing that even if AI models with low electricity consumption are developed, it will have little impact on the overall increase in usage.
Ayse Coskun, a computer engineering professor at Boston University, said, "(DeepSeek) will bring efficiency, but it is questionable whether AI energy demand can be continuously reduced." Olivier Blum, CEO of Schneider Electric, told the UK's Financial Times (FT) in an interview that despite DeepSeek, the company's data center growth forecasts have not changed.
© The Asia Business Daily(www.asiae.co.kr). All rights reserved.
![[AI Era, Electric Power Is National Power] ② "Currently Need Twice the U.S. Energy"... Nuclear and Thermal Power Make a Comeback](https://cphoto.asiae.co.kr/listimglink/1/2025030410163353004_1741050993.png)

