Oracle has reportedly placed an order for $40 billion in Nvidia AI GPUs for a new OpenAI data center

Oracle logo on building
(Image credit: Shutterstock)

Oracle has reportedly purchased about 400,000 Nvidia GB200 AI chips, worth about $40 billion, for deployment in Abilene, Texas. According to the Financial Times, this site will be the first to host the U.S. Stargate project, a $500 billion investment in AI infrastructure by OpenAI, Oracle, SoftBank, and Abu Dhabi sovereign wealth fund MGX that President Trump announced earlier this year.
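
For scale, those two figures imply a rough per-unit price, though the report does not spell out whether a "chip" here means a single Blackwell die, a GB200 superchip, or a share of a full rack. A minimal back-of-the-envelope sketch, using only the reported order size and value:

```python
# Implied per-unit price from the reported figures.
# What counts as one "GB200 chip" (die, superchip, or rack share) is not
# specified in the report, so treat the result as a rough indication only.
total_order_usd = 40e9    # reported order value (~$40 billion)
units_ordered = 400_000   # reported chip count (~400,000)

implied_price_per_unit = total_order_usd / units_ordered
print(f"Implied price per unit: ${implied_price_per_unit:,.0f}")  # ~$100,000
```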

Upon completion, the site is estimated to deliver up to 1.2 gigawatts of computing power, making it one of the most powerful data centers in the world and competing against Elon Musk’s Colossus in Memphis, Tennessee.
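
To put 1.2 gigawatts in context, here is a hedged sketch of how much GB200-class hardware that figure could notionally power. The roughly 120 kW draw assumed for a GB200 NVL72 rack and the 1.3x allowance for cooling, networking, and conversion losses are assumptions for illustration, not numbers from the report:

```python
# Rough capacity sketch for a 1.2 GW facility filled with GB200 NVL72 racks.
# The per-rack draw and the facility overhead factor are assumptions.
facility_power_w = 1.2e9     # reported facility figure
rack_it_power_w = 120e3      # assumed draw of one GB200 NVL72 rack
facility_overhead = 1.3      # assumed overhead for cooling, networking, losses

racks = facility_power_w / (rack_it_power_w * facility_overhead)
gpus = racks * 72            # an NVL72 rack houses 72 Blackwell GPUs
print(f"~{racks:,.0f} racks, ~{gpus:,.0f} GPUs")
```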

The project’s ownership is a bit complicated, though. The Abilene site is owned by Crusoe, an AI infrastructure company, and Blue Owl Capital, a U.S. investment firm. Together, they have poured more than $15 billion into the site through debt and equity financing.

Notably, these equity and investment figures are commitments by the individual companies; the Financial Times said that Stargate itself has yet to commit any money to a data center. Nevertheless, OpenAI also recently announced Stargate UAE, under which it plans to deploy a 1 GW cluster in Abu Dhabi in coordination with the U.S. government. The cluster will be part of a larger 5 GW data center built by G42 that is envisioned to use more than 2 million Nvidia GB200 chips.

Jowi Morales
Contributing Writer

Jowi Morales is a tech enthusiast with years of experience working in the industry. He has been writing for several tech publications since 2021, covering tech hardware and consumer electronics.

  • A Stoner
    These kinds of sales will keep us gamers as secondary for the foreseeable future. Nvidia cannot ignore these customers, which is why the 50 series pretty much only got AI enhancements that were developed while serving the AI buyers' needs.
    Reply
  • Findecanor
    Computing power is not measured in gigawatts; its power draw is.
    How will this data centre get its power?

    It wasn't long ago that Texas suffered rolling blackouts in its electricity grid.
    Grid infrastructure can take decades to prospect and build.
    Reply
  • Stomx
    A couple of months ago, just a week before the DeepSeek R1 release, Larry E was advertising AI and trying to convince everyone to join him in spending $500B on it. No one responded?
    Reply
  • rluker5
    $40,000,000,000/400,000 = $100,000. Is that the going rate for a GB200 nowadays? When you are buying 400,000 of them? Either Nvidia is printing money, some accountants aren't completely honest or this article doesn't have the full context.
    Reply
  • rluker5
    A Stoner said:
    These kinds of sales will keep us gamers as secondary for the foreseeable future. Nvidia cannot ignore these customers, which is why the 50 series pretty much only got AI enhancements that were developed while serving the AI buyers' needs.
    Somebody has figured out the equation:
    Build extremely expensive AI focused datacenters + ????? = $profit!
    Any day now we will learn what the ????? stands for.
    Reply
  • 3en88
    rluker5 said:
    Somebody has figured out the equation:
    Build extremely expensive AI focused datacenters + ????? = $profit!
    Any day now we will learn what the ????? stands for.
    Ads.
    Reply
  • lmcnabney
    TLDR - gaming GPUs will only get minimal production because Nvidia can make 10-100x the money using those wafers to make AI chips.

    So enjoy those currently sky-high GPU prices. They are going to get a lot higher as production slows to a trickle.
    Reply
  • SomeoneElse23
    Findecanor said:
    Computing power is not measured in gigawatts; its power draw is.
    How will this data centre get its power?

    It wasn't long ago that Texas suffered rolling blackouts in its electricity grid.
    Grid infrastructure can take decades to prospect and build.
    "going green" used to be popular.

    Now the corporations don't seem to care.

    Maybe they never cared; it just sounded good?
    Reply
  • stuff and nonesense
    lmcnabney said:
    TLDR - gaming GPUs will only get minimal production because Nvidia can make 10-100x the money using those wafers to make AI chips.

    So enjoy those currently sky-high GPU prices. They are going to get a lot higher as production slows to a trickle.
    Perhaps devs could target AMD levels of performance using DX12 and not using the Nvidia special sauce. Make Nvidia irrelevant?
    Reply
  • lmcnabney
    stuff and nonesense said:
    Perhaps devs could target AMD levels of performance using DX12 and not using the Nvidia special sauce. Make Nvidia irrelevant?
    It really doesn't make much difference. There simply isn't enough fab capacity on Earth to meet the needs of the market. GPU chips are HUGE. AMD only has so much contracted capacity with TSMC. A 9950X3D needs 70 sq mm and sells for $700. The 9070XT uses 357 sq mm and requires RAM and more hardware and sells for the same - and AMD shares that money with AIB partners.
    There really is no reason for either Nvidia or AMD to make ANY GPUs. They are both fabless and every wafer they have contracts for can make them a lot more money being turned into something else.
    Reply
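
As an editorial footnote to the silicon-economics argument above, here is a hedged sketch of revenue per square millimetre of die area. The CPU and gaming-GPU rows reuse the figures quoted in the comment; the AI-accelerator row uses an assumed die size and selling price purely for illustration, and the whole comparison ignores memory, packaging, and yield:

```python
# Revenue-per-area comparison sketch. The first two rows use the die sizes and
# prices quoted in the comment above; the AI-accelerator row is an assumption
# (roughly reticle-sized die, assumed $30,000 selling price), not a reported figure.
parts = {
    "Ryzen 9950X3D (per comment)":  (70, 700),
    "Radeon 9070 XT (per comment)": (357, 700),
    "AI accelerator (assumed)":     (800, 30_000),
}

for name, (die_mm2, price_usd) in parts.items():
    print(f"{name}: ~${price_usd / die_mm2:,.0f} of revenue per sq mm")
```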