Microsoft announces 'world's most powerful' AI data center — 315-acre site to house 'hundreds of thousands' of Nvidia GPUs and enough fiber to circle the Earth 4.5 times

Aerial view of Microsoft's Fairwater datacenter in Mount Pleasant, Wisconsin
(Image credit: Microsoft)

Microsoft is planning to bring the "world's most powerful" AI datacenter online in early 2026, the company announced today. The Mount Pleasant, Wisconsin-based datacenter, dubbed Fairwater, is meant specifically for training AI models as well as running large-scale models. It will sit on 315 acres of land, with 1.2 million square feet across three buildings housing "hundreds of thousands" of Nvidia GB200 and GB300 GPUs.


Andrew E. Freedman

Andrew E. Freedman is a senior editor at Tom's Hardware focusing on laptops, desktops and gaming. He also keeps up with the latest news. A lover of all things gaming and tech, his previous work has shown up in Tom's Guide, Laptop Mag, Kotaku, PCMag and Complex, among others. Follow him on Threads @FreedmanAE and BlueSky @andrewfreedman.net. You can send him tips on Signal: andrewfreedman.01

  • DS426
    300 MW of heat actually being dissipated up north by somewhat cooler air and non-evaporative cooling... Microsoft actually showing some sanity for once!?
  • hotaru251
    Alternative title: Wisconsin's population is about to see a big jump in their power bills due to a mega corp setting up shop
  • m3city
    I wonder if MS actually found AI stuff financially feasible. If not, this is going to sting when it comes to a simple .xls file with two columns: expense and income. And why not use the heat for citizens? I know solutions like this have been used in Europe, at a much smaller scale naturally.
  • iEatBalut
    Just go with the GB300s. You’ll be better off in the long run. The GB200s are the biggest pieces of crap Nvidia ever made.
  • bill001g
    This smells of an AI article, or one where no effort was made to research the announcements. It even states at the bottom "Mount Pleasant," which is on the other side of the state.

    It makes very little sense to put a data center out in the middle of nowhere where you cannot get electrical connections. Pleasantville, WI is not really even a city; it is part of a town called Hale that has a population of less than 1,000.

    This is more likely related to other news articles you see from more reliable publications, like those in Milwaukee, WI, talking about a second data center in "Mount Pleasant," which is just north of Milwaukee. They were using water from Lake Michigan to cool the first data center.
  • jp7189
    Wait a tick. Nadella recently said they were slowing down new datacenter deployments because he expected the AI hype would cause others to overbuild, and that they expected to rent the extra capacity at a fraction of the cost.
  • JC5000
    As a nuclear engineer who can use ChatGPT... 2 GW of gas power makes a whopping 7,200,000 TONS of CO2 per year. Starting when the first power plant was built in 1882, it took the human race 20 years of burning coal to produce what this does in 1 year... for a single data center. About 6 people die from lung disease per million tons, so this single data center will cause 1,000+ people to die prematurely over its lifetime (see the back-of-envelope check after this thread).
  • vanadiel007
    Sometimes you have to truly admire nature. We consume food, turn it into energy, run our brains, and produce intelligence far beyond AI.
    AI, meanwhile, needs gigawatts of power and hundreds of thousands of GPUs, and still cannot reach that intelligence.

    Nature is amazing indeed.
  • JRStern
    Nadella said:
    If intelligence is the log of compute… it starts with a lot of compute! And that’s why we’re scaling our GPU fleet faster than anyone else.
    But that's kind of busted now: intelligence is NOT the log of compute, or it's an s-curve that hits an asymptote, and we're already there. All the progress in LLMs in the last three years has been about breaking that log scale down even further, coordinating large modules, and at least attempting to push work off to inference time to avoid exponentials and logs over the entire universe.

    It's still exponential/logarithmic even at inference time, but now you're doing it just on Taylor Swift and not also on quantum physics and the Peloponnesian Wars, so you save 99.999% of the work.
  • Arkitekt78
    This is the same MS that claims to care so much about the environment, yet built this monstrosity and is forcing hundreds of millions of computers into the e-waste stream next month???
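JC5000's figures are easy to sanity-check. Below is a minimal back-of-envelope sketch in Python, assuming round-the-clock operation of 2 GW of gas generation, an emission factor of roughly 0.41 kg CO2 per kWh for natural gas, and a 25-year lifetime; the emission factor and lifetime are illustrative assumptions, not figures from the article or the comment.

```python
# Rough check of the CO2 and mortality arithmetic in JC5000's comment.
# Assumptions (illustrative, not from the article): continuous operation,
# ~0.41 kg CO2/kWh for natural gas, ~6 premature deaths per million tons
# of CO2 (the figure quoted in the comment), and a 25-year lifetime.

POWER_GW = 2.0                       # assumed gas-fired capacity
HOURS_PER_YEAR = 8760
EMISSION_FACTOR_KG_PER_KWH = 0.41    # assumed natural-gas emission factor
DEATHS_PER_MILLION_TONS = 6          # figure quoted in the comment
LIFETIME_YEARS = 25                  # assumed lifetime

energy_kwh_per_year = POWER_GW * 1e6 * HOURS_PER_YEAR           # 2 GW -> 2e6 kW
co2_tons_per_year = energy_kwh_per_year * EMISSION_FACTOR_KG_PER_KWH / 1000

deaths_per_year = co2_tons_per_year / 1e6 * DEATHS_PER_MILLION_TONS
deaths_over_lifetime = deaths_per_year * LIFETIME_YEARS

print(f"CO2 per year: {co2_tons_per_year:,.0f} tons")            # ~7.2 million
print(f"Premature deaths over {LIFETIME_YEARS} years: {deaths_over_lifetime:,.0f}")  # ~1,000
```

Under those assumptions the result lands close to the comment's 7,200,000 tons per year and 1,000+ premature deaths, so the arithmetic is internally consistent; the conclusion still depends entirely on the assumed emission factor, capacity factor, and mortality coefficient.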