OpenAI's gargantuan data center is even bigger than Elon Musk's xAI Colossus — the world's largest 300 MW AI data center could reach a record 1-gigawatt scale by next year, threatening grid stability
It's the world's largest single data center building

Elon Musk's xAI made quite a splash when it built a data center with 200,000 GPUs that consumes approximately 250 MW of power. However, it appears that OpenAI operates an even larger data center in Texas, one that consumes 300 MW and houses hundreds of thousands of AI GPUs whose exact specifications have not been disclosed. Furthermore, the company is expanding the site and aims to reach gigawatt scale by mid-2026, according to SemiAnalysis. Such gargantuan AI clusters are creating challenges for power companies, not only in power generation but also in power grid safety.
OpenAI appears to operate what is described as the world's largest single data center building, with an IT load capacity of around 300 MW and a maximum power capacity of approximately 500 MW. This facility includes 210 air-cooled substations and a massive on-site electrical substation, which further highlights its immense scale. A second identical building is already under construction on the same site as of January 2025. When completed, this expansion will bring the total capacity of the campus to around a gigawatt, a record.
These developments have drawn attention from the Electric Reliability Council of Texas (ERCOT), the organization responsible for overseeing the Texas power grid, because of the unprecedented size and energy demand of such sites. The power consumption profile of these data centers, combined with their rapid growth, presents serious challenges for energy supply companies for several reasons.
Firstly, hundreds of thousands of AI accelerators (such as Nvidia's H100 or B200) and the servers built around them consume an immense amount of power and require a huge, continuous supply of electricity, often equivalent to what a mid-sized city consumes. Supplying this kind of load forces power companies to build or upgrade substations, transmission lines, and generation capacity far faster than usual. This stretches both financial and physical infrastructure planning, especially in regions that were not prepared for such rapid growth.
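For a rough sense of what the mid-sized-city comparison means in practice, here is a quick back-of-the-envelope check; the average per-home draw is an assumed round figure for illustration, not a number from the report:

```python
# Back-of-the-envelope scale check for the 300 MW figure in the article.
# The average continuous draw per home is an assumed round number used
# only for illustration.
AVERAGE_HOME_KW = 1.2      # assumed average continuous household draw, kW
IT_LOAD_MW = 300           # reported IT load of the OpenAI building

homes = IT_LOAD_MW * 1000 / AVERAGE_HOME_KW
print(f"{IT_LOAD_MW} MW is roughly the average draw of {homes:,.0f} homes")
# -> 300 MW is roughly the average draw of 250,000 homes
```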
Secondly, the way these data centers draw power is highly erratic. Unlike traditional factories or office buildings that draw power steadily, AI-focused data centers can swing from maximum demand to minimal usage in moments, for example when thousands of GPUs start or finish a synchronized training step at once. This kind of behavior places enormous stress on grid management, as even slight imbalances between supply and demand can cause voltage and frequency issues.
Specifically, when more electricity is produced than needed, both voltage and frequency rise above their normal levels. If demand outpaces supply, they drop below standard values. Even a 10% deviation in either direction can damage electronics or trigger circuit protection. It is the grid operator’s responsibility to keep these parameters within safe limits to ensure system stability. However, if several large data centers (or one giant data center, such as the one used by OpenAI) suddenly reduce their power draw, it could send shockwaves through the rest of the grid, causing other power consumers or generators to shut down, and potentially triggering a chain of failures.
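A minimal sketch of this supply-demand-frequency coupling, using the textbook aggregate swing equation plus a simple governor droop; all of the numbers here (online generation, inertia constant, droop, size of the load loss) are illustrative assumptions, not ERCOT or OpenAI figures:

```python
# Minimal sketch of the supply/demand -> frequency coupling described above.
# A sudden load drop leaves surplus generation, so frequency rises until
# governors back the generators off. All inputs are assumptions.
F0 = 60.0            # nominal grid frequency, Hz
SYSTEM_GW = 70.0     # assumed online generation
H = 4.0              # assumed aggregate inertia constant, seconds
DROOP = 0.05         # assumed 5% governor droop
LOAD_DROP_GW = 1.0   # a gigawatt-class campus suddenly backing off

dt = 0.1                                   # integration step, seconds
imbalance_pu = LOAD_DROP_GW / SYSTEM_GW    # surplus generation, per unit
freq = F0

for step in range(101):                    # simulate 10 seconds
    if step % 10 == 0:
        print(f"t = {step * dt:4.1f} s   f = {freq:7.4f} Hz")
    governor_pu = -(freq - F0) / (DROOP * F0)   # generators back off as f rises
    net_pu = imbalance_pu + governor_pu         # remaining imbalance
    freq += (F0 / (2 * H)) * net_pu * dt        # swing equation (Euler step)
```

Under these assumptions the frequency settles only a few hundredths of a hertz above nominal, which is why operators worry far more about many large loads swinging together, or about a swing on a grid with less inertia, than about any single event.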
Thirdly, integrating these data centers into the grid requires complex coordination with regional planning authorities, which typically conduct studies to understand the effects on transmission stability and to prevent conflicts with other grid users. However, these studies are time-consuming and often lag behind the speed at which data centers are built.
Finally, there is an economic challenge, as power companies may need to spend billions to satisfy the demands of large data centers. However, the unpredictable nature of the AI industry means the return on that investment is hard to model. At the same time, if the grid is not upgraded fast enough, there is a risk of blackouts or of turning away industrial customers who cannot compete for limited grid capacity.

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.
bit_user
The article said: "However, if several large data centers (or one giant data center, such as the one used by OpenAI) suddenly reduce their power draw, it could send shockwaves through the rest of the grid, causing other power consumers or generators to shut down, and potentially triggering a chain of failures."
Imagine someone does a DDOS attack that's basically just submitting a whole bunch of queries during a period that's normally low activity. Then, as suddenly as they started, they all stop. Do that several times and maybe trigger a grid outage. If the datacenter is used to service any government or military contracts, or other datacenters on the same power network, then it could be part of a larger cyberattack to undermine military readiness or responsiveness.
Unless they find an exploit allowing them to send free queries, it might require having paid accounts, but a state actor could certainly afford that.
derekullo
Dr. Emmett Brown: "No, no, no, no, no, this sucker's electrical, but I need a nuclear reaction to generate the 1.21 gigawatts of electricity I need."
Guess we made it full circle!
derekullo
bit_user said: "Imagine someone does a DDOS attack that's basically just submitting a whole bunch of queries during a period that's normally low activity. Then, as suddenly as they started, they all stop. Do that several times and maybe trigger a grid outage. If the datacenter is used to service any government or military contracts, or other datacenters on the same power network, then it could be part of a larger cyberattack to undermine military readiness or responsiveness. Unless they find an exploit allowing them to send free queries, it might require having paid accounts, but a state actor could certainly afford that."
I'd imagine they have some way to isolate the load from the grid, such as a large bank of capacitors in the "massive on-site electrical substation".
Or you could simply guarantee a minimum load by doing other calculations in the background if the load becomes too low ... mine crypto.
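The "floor load" idea in that comment can be sketched in a few lines; this is a hypothetical toy controller with simulated power readings, not anything known to run at the actual site:

```python
# Hypothetical sketch of the "floor load" idea above: if a site's measured
# draw falls below a floor, enable throwaway background work so the total
# demand the grid sees stays roughly flat. Power readings are simulated here;
# a real controller would read facility telemetry instead.
import random

FLOOR_MW = 200.0    # assumed minimum draw to present to the grid
PEAK_MW = 300.0     # reported IT load of the building, used as the ceiling

for step in range(12):                          # pretend each step is one interval
    real_draw = random.uniform(50.0, PEAK_MW)   # simulated AI-workload demand
    filler_mw = max(0.0, FLOOR_MW - real_draw)  # filler compute only when needed
    total = real_draw + filler_mw
    print(f"step {step:2d}: workload {real_draw:6.1f} MW, "
          f"filler {filler_mw:6.1f} MW, grid sees {total:6.1f} MW")
```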
Zescion
Not an expert, but I don't think datacenters are connected directly to the grid. To my knowledge, large capacitors are used to smooth energy demand fluctuations.
The problem still stands: who is going to pay for those large systems?
JRStern
Zescion said: "Not an expert, but I don't think datacenters are connected directly to the grid. To my knowledge, large capacitors are used to smooth energy demand fluctuations. The problem still stands: who is going to pay for those large systems?"
There are big voltage stabilizers, generally mechanical (spinning), which were in the news recently because Spain's grid collapsed through insufficient regulation (probably), perhaps intentionally being tested (ROFLMAO). Something like that might indeed be appropriate here. In theory it might even be done more electronically, including giant ultra-capacitors, but I don't think that has ever actually been done on such a scale.
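For a rough sense of what such an on-site buffer would have to store, here is some illustrative arithmetic; the swing size, ride-through time, and supercapacitor energy density are assumptions for the sketch, not figures for any real installation:

```python
# Illustrative sizing of an on-site buffer that rides through a sudden swing
# while generators or the wider grid catch up. All inputs are assumptions.
SWING_MW = 300          # size of the load step to smooth out
RIDE_THROUGH_S = 5      # seconds of bridging assumed to be enough
SUPERCAP_WH_PER_KG = 5  # rough supercapacitor specific energy, Wh/kg

energy_mj = SWING_MW * RIDE_THROUGH_S      # MW * s = MJ
energy_kwh = energy_mj / 3.6               # 1 kWh = 3.6 MJ
mass_tonnes = energy_kwh * 1000 / SUPERCAP_WH_PER_KG / 1000

print(f"Bridging {SWING_MW} MW for {RIDE_THROUGH_S} s needs about "
      f"{energy_mj:.0f} MJ ({energy_kwh:.0f} kWh)")
print(f"At ~{SUPERCAP_WH_PER_KG} Wh/kg, that is roughly {mass_tonnes:.0f} tonnes "
      f"of supercapacitors, before converters and other overhead")
```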
SomeoneElse23
Maybe a reality check of the power requirements will bring an end to this "AI" insanity.
Maybe.
DS426
Congress needs to be brought up to speed on this and pass regulation, as the uncertainties are just too high for all parties -- private corporations, utility providers, and the public. I'm typically not one for regulation, btw... Anyways, start with mandating that any datacenter or any other kind of electricity-consuming entity over N megawatts be isolated to its own power system, e.g. >= 100 MW requires an isolated power system. This would ensure reliability on both sides, as a failure in the rest of the grid wouldn't bring the isolated system down, and vice versa. A private utility provider would be required -- forget putting this kind of burden on a public utility that already has enough problems of its own in terms of grid stability and cybersecurity-related resilience.
Water consumption would also need to be addressed. Evaporative cooling is efficient but over-relied on by some corps like Microsoft. Geothermal seems to remain rare for datacenter cooling, granted I understand it's more efficient for heating than for cooling. Guessing it's considered a cost issue?
I'm all for innovation, but it has to be responsible. It doesn't matter how great the opportunity of AI is; reckless growth can and does have serious consequences.