China's AI data center boom goes bust: Rush leaves billions of dollars in idle infrastructure
Tens of billions invested, billions lost.

Triggered by the rise of generative AI applications, China rapidly expanded its AI infrastructure in 2023 – 2024, building hundreds of new data centers with both state and private funding. But the boom has since lost momentum. Facilities that cost billions of dollars now sit underused, returns are falling, and the market for GPU rentals has collapsed. To make matters worse, shifting market conditions left many data centers outdated before they were even fully operational, according to MIT Technology Review.
Rushed data centers do not earn money
The sudden drop in real estate activity following the 2020 COVID-19 pandemic increased pressure to find new economic drivers, and the rise of ChatGPT in late 2022 made AI seem like the next big thing. In 2023 alone, more than 500 data center projects were proposed nationwide, according to KZ Consulting. By late 2024, at least 150 of them were reportedly operational. Local authorities promoted these projects in hopes of boosting their regional economies, and state-owned companies, government-linked investment funds, and private investors alike were eager to back them.
But as usually happens with rushed projects, poor planning was their downfall. Many facilities were built without regard for actual demand or technical standards, according to MIT Technology Review's sources among project leads and executives. This is not particularly surprising: engineers with the relevant experience are rare, and many executives depended on middlemen who inflated projections or exploited procurement to obtain subsidies. As a consequence, many new data centers fell short of expectations: they are expensive to run, difficult to fill, and technically ill-suited to contemporary AI workloads.
To complicate matters further, some projects never planned to profit from computation at all. According to multiple reports and industry insiders cited by MIT Technology Review, certain companies used AI data centers to qualify for government-subsidized green energy or land deals. In some cases, electricity earmarked for AI workloads was sold back into the grid at a markup. Others secured loans and tax incentives while leaving buildings unused. By late 2024, most people still in the business were aiming to benefit from policy incentives rather than actual AI work, the report claims.
The AI data center market is changing
When these massive AI data centers were planned and built in 2023 – 2024, the anticipated demand for AI training and inference capacity was very different from the actual demand the market sees today.
Nowadays, demand is shifting toward inference, as this is what makes money for owners of AI models. Inference workloads do not necessarily require the massive clusters of tens of thousands of high-end Nvidia GPUs that are used for training. Instead, they can benefit from specialized accelerators with lower cost and power consumption and faster response times. As a result, monthly rental prices for an eight-GPU H100 server designed for training have plummeted from ¥180,000 ($24,000) to ¥75,000 ($10,000). Interestingly, despite export restrictions, H100 GPUs continue to flow into China steadily.
As a result, massive facilities in rural or inland locations are now far less attractive despite their lower costs. Some data centers now offer free computing vouchers to local tech firms, yet remain underused. Other operators shut down facilities entirely rather than risk losses from electricity and maintenance costs that partial rental income cannot cover.
One of the biggest shifts came with the rise of DeepSeek, which released a reasoning model called R1 that achieved performance comparable to OpenAI's o1 but at significantly lower cost. This made many AI companies rethink their requirements for hardware and scale.
Despite the setbacks, central authorities remain committed to AI development. A government symposium held in early 2025 reaffirmed the need for national self-reliance in this area. Major firms have followed suit: Alibaba announced over $50 billion in planned investments for cloud and AI infrastructure, and ByteDance committed another $20 billion.
Insiders believe Chinese officials will not abandon these projects, viewing them as growing pains rather than failures. The government is expected to take over floundering centers and assign them to more capable operators. However, for operators that cannot rent their capacity to paying clients, the bubble has clearly gone bust.

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.
-
dalek1234 What?! Are my eyes deceiving me? THD has actually published an article about something bad in China, instead of peddling the Chinese government's sponsored state propaganda full of Bee-Ehss.
Hell just froze over.
It's actually worth clicking on the article link now. -
SomeoneElse23 I'm still in the "AI is a bubble" boat.
It'll turn out to be Search 2.0.
Which is a whole lot better than what we have today, but it's far from "intelligent". -
Lieutenant Barclay
SomeoneElse23 said: I'm still in the "AI is a bubble" boat. It'll turn out to be Search 2.0. Which is a whole lot better than what we have today, but it's far from "intelligent".
That's how I feel. The idea that these machines are "thinking for themselves" or have any kind of agenda is ridiculous. The hype level reminds me of "the God particle" and all the crap about the Large Hadron Collider (apparently dyslexia wasn't considered when naming that thing). -
husker
Lieutenant Barclay said: That's how I feel. The idea that these machines are "thinking for themselves" or have any kind of agenda is ridiculous. The hype level reminds me of "the God particle" and all the crap about the large hadron collider (apparently dyslexia wasn't considered when naming that thing)
The AI industry doesn't claim that AI is in any way sentient or "thinking for itself". As for AI having some kind of agenda: please save that for sci-fi entertainment. Many people know what E=mc^2 means, but did not derive or think of the equation on their own. Nor did they derive the vast majority of their knowledge "for themselves". Instead, they learned it from some source material and have the ability to apply it in some way as requested. Sounds a lot like AI to me.
Edit: If computers ever do reach a point of sentience and thinking for themselves with an agenda, then it won't be called "Artificial Intelligence"; it will just be "Non-biological Intelligence", because there won't be anything artificial about it. A twist on a phrase from Forrest Gump: intelligence is as intelligence does. -
DS426 So what I'm hearing is that Alibaba and ByteDance should purchase some or many of these AI data centers and modify them for their needs, paying pennies on the dollar (well, yuan of course), as their current owners would surely love to get bailed out at this point (besides the CCP just bailing them out).