Nearly 7,000 of the world’s 8,808 data centers are built in the wrong climate, analysis finds — the vast majority are located outside the optimal temperature range for cooling, and 600 are in locations considered too hot
Most facilities sit outside the temperature range recommended for efficient operation, as AI growth pushes data centers into hotter regions.
Nearly 7,000 of the world’s 8,808 operational data centers are located in climates that fall outside the temperature range recommended for efficient operation, according to a new analysis that maps global data center locations against long-term climate data. While only a minority are in regions that are persistently too hot, the findings underline how economic, political, and network realities often outweigh environmental suitability when companies decide where to build.
The analysis, published by Rest of World, combines location data for 8,808 operational data centers with historical temperature records from the Copernicus Climate Data Store. It compares those locations against guidance from the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE), which recommends inlet air temperatures between 18 C and 27 C for the most efficient operation. Above that band, cooling systems work harder, energy use rises, and costs increase; below it, condensation and reliability become concerns.
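The classification logic the analysis describes can be sketched in a few lines. The thresholds below are ASHRAE's recommended band; the example sites and mean-temperature values are hypothetical placeholders for illustration, not figures from the Rest of World dataset.

```python
# Illustrative sketch of the site-classification logic described above.
# Thresholds are ASHRAE's recommended inlet-air band (18-27 C); the
# sample sites and temperatures are hypothetical, not the study's data.

ASHRAE_MIN_C = 18.0
ASHRAE_MAX_C = 27.0

def classify_site(mean_annual_temp_c: float) -> str:
    """Bucket a site by its long-term mean annual temperature."""
    if mean_annual_temp_c > ASHRAE_MAX_C:
        return "too hot"
    if mean_annual_temp_c < ASHRAE_MIN_C:
        return "too cold"
    return "within band"

# Hypothetical example sites: (name, mean annual temperature in C)
sites = [("Singapore", 27.5), ("Stockholm", 7.4), ("Dallas", 19.3)]
for name, temp in sites:
    print(f"{name}: {classify_site(temp)}")
```

In practice the study works with long-term climate records rather than a single annual mean, but the bucketing idea is the same.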
Based on that definition, nearly 7,000 data centers worldwide sit outside the recommended temperature range. The majority of those are in cooler regions below 18 C, where temperature is less of a constraint, but humidity management and airflow are more important. Around 600 facilities, or under 10% of the total, are located in areas with average annual temperatures above 27 C, where heat is a persistent challenge.
In 21 nations, including Singapore, Thailand, Nigeria, and the United Arab Emirates, every operational data center is located in a zone classified as too hot under the ASHRAE recommendation. Nearly all facilities in Saudi Arabia and Malaysia fall into the same category. In Indonesia, close to half of the country’s roughly 170 data centers are in overly hot regions, while in India, about 30% of its more than 200 sites are exposed to sustained high temperatures.
Singapore, with average daily temperatures hovering near 33 C and humidity frequently above 80%, has one of the densest concentrations of data centers in the world, with more than 1.4 gigawatts of capacity already online, and the government plans to allow several hundred additional megawatts under tighter efficiency rules. Data centers accounted for about 7% of Singapore’s electricity use in 2020, a share projected to rise sharply without intervention.
The pressure to build in less-than-ideal, borderline unsuitable climates is only accelerating globally. Demand for data centers to support cloud services and generative AI is rising fast, particularly in regions that are also among the hottest. At the same time, governments increasingly require data to be stored within national borders, limiting the option to centralize workloads in cooler locations such as Scandinavia. As a result, data centers are spreading geographically rather than clustering only where cooling is cheapest.
Even when data-residency laws don't apply, decisions about data center locations are often guided by the availability (and cost) of power and water. Other factors include the price of land, the frequency of natural disasters, and local governance considerations such as tax exemptions and building permits. Ambient temperature, in short, is just one of many factors steering data center build-outs, which helps explain why so many sit outside ASHRAE's optimal temperature range.
Higher ambient temperatures bring compounding risks, with increased cooling loads putting strain on local power grids while also reducing the efficiency of electricity transmission. According to the International Energy Agency, data centers consumed about 415 TWh of electricity in 2024, roughly 1.5% of global demand. That figure is expected to more than double by 2030 as AI workloads scale, intensifying the impact of where new capacity is built.
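The IEA figures above can be sanity-checked with simple arithmetic. The numbers below are taken directly from the cited estimates; the implied global total is a back-of-the-envelope derivation, not an IEA figure.

```python
# Back-of-the-envelope check of the IEA figures cited above.
data_center_twh_2024 = 415   # IEA estimate for data centers in 2024
share_of_global = 0.015      # roughly 1.5% of global electricity demand

# Implied total global electricity demand
implied_global_twh = data_center_twh_2024 / share_of_global
print(f"Implied global demand: ~{implied_global_twh:,.0f} TWh")

# "More than double by 2030" implies at least this much:
floor_2030 = 2 * data_center_twh_2024
print(f"2030 floor: >{floor_2030} TWh")
```

So "more than double" puts data center demand above roughly 830 TWh by 2030, against an implied global total on the order of 27,000-28,000 TWh today.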
Operators are responding by rethinking how facilities are cooled. Air cooling still dominates globally, representing 54% of the market, but liquid-based alternatives are catching up, particularly for dense AI racks, where a rack of Nvidia Blackwell Ultra GPUs can draw as much as 140 kilowatts.
Even so, retrofitting existing facilities is expensive, and many of the world’s hottest data center markets are also those with the most constrained power and water resources. Risk analysts warn that by 2040, extreme heat could materially affect two-thirds of major data center hubs worldwide, including all major hubs in the Asia-Pacific and the Middle East.

Luke James is a freelance writer and journalist. Although his background is in law, he has a personal interest in all things tech, especially hardware and microelectronics, and anything regulatory.
Notton: The analysis report in this article is so weird.
Yeah, obviously you have to build in sub-optimal locations, because you'd have to invade and colonize another nation to build in the "sweet spot".
It's a valid strategy in Civ6, but IDK how I'd feel about it in the real world.
SnowdawgyCan: They are, not surprisingly, in countries where the rules are soft and labor is available at slave-like prices... go figure. Not a shock to anyone who actually thinks about the issues. Just like why North American manufacturers seem to choose third-world countries for their factories.
King_V: Ultimately, it's, I think, a situation of "But it's cheaper to do it like this RIGHT NOW, and to hell with the long-term consequences."
That said, assuming all other factors are equal, I do wonder which problem is easier to mitigate when building in a sub-optimal temperature zone: the need for extra cooling in a too-hot area, or the need to de-humidify and avoid condensation in a too-cold area?
My gut instinct, with no REAL analysis, is the latter, as cooling feels like the harder problem to solve, and a cold climate's humidity and condensation issues should be easier. I'd be curious to hear thoughts from those who know more about dealing with such issues.
ezst036: Idealism is not always logistically practical if a country does not have land in a cooler climate, nor the manpower or other resources to make it happen.
Perhaps building data centers in Greenland would be ideal, but does Greenland have a population large enough to sustain a workforce for them? Does it have nuclear power plants to keep them running? Would Greenlanders protest the building of a nuclear power plant (which would thus be a blocker for data centers), cutting off their noses to spite their faces, not just by killing future jobs for themselves but by being bad environmental stewards, forcing data centers into hot climates? (You see what I did there? It's not entirely wrong.)
The U.S., at least, might do well to see more data centers built in suitable Alaskan locations, perhaps Anchorage or Fairbanks. But how is the UAE going to do anything like that, other than building underwater data centers as Microsoft attempted to do?
Are hot countries with no sea border supposed to build data centers at the tops of mountains, where elevation helps temperatures?
It just makes little sense and lacks pragmatic realism.
SkyBill40: Not even remotely surprising to read this. There's a ton of data centers being built here in AZ and, well, we're not exactly a cool climate. The increased strain on power grids is a significant problem as well, and it's a cost the data folks aren't paying. We, regular consumers, are having to foot that hike in rates, and it's not acceptable.
Persister: Location involves many things, but one not mentioned is transmitting data to and from the site. A remote or underwater location must increase this cost, but IDK where the expense ranks compared to others. There are media stories about data centers in space, so wireless transmission must be easy...
kenroyal: A few things the industry has overlooked.
1) Your Tier uptime rating is for a concrete shell, not for the most expensive and important part of the AI factory: the GPU, CPU, TPU, etc. The racks fail at 1%, or rather the GPU cooling to the rack is failing at 1% or more per year. What is a 1% failure in a Tier rating? Go ahead, look it up; it's bad.
2) You can run lower temps, below the dew point, as long as you protect the PCBA. An atomized 3D gel-state coating has been used on 400 VDC cold-plate PCBAs for 10 years in the EV market. You can run cooler and get 5% more GPU performance; one hyperscaler is in trials right now.
bill001g: This is what is almost silly about air conditioning: it works most effectively in areas that don't need it in the first place.
They could just take massive fans and pull outside air into the data centers. In areas where summer temps are only 27 C, they then need to worry about freezing in the winter; it is very likely there are many weeks where the temperature is well below freezing. Not likely a huge issue, since they have lots of heat to get rid of; they just have to take in less air and mix it with the inside air first so it is not extremely cold in parts of the data center.
Sam Hobbs: Another consideration is the cost of communications. If there is insufficient copper or fiber cabling for the data, then laying lines for data communication would add cost.
Also, the cost of personnel is a consideration. Will there be a permanent need for on-site personnel?
wwenze1:
SkyBill40 said: "Not even remotely surprising to read this. There's a ton of data centers being built here in AZ and, well, we're not exactly a cool climate. The increased strain on power grids is a significant problem as well, and it's a cost the data folks aren't paying. We, regular consumers, are having to foot that hike in rates, and it's not acceptable."
Insert King of the Hill Phoenix, Arizona joke.
But honestly, why does AZ exist?