Nvidia Reveals RTX 6000 With 48GB GDDR6 ECC Memory

In a rather unexpected move, Nvidia introduced its new flagship graphics card for the professional visualization market alongside its new GeForce RTX 40-series graphics boards for gamers. The new Nvidia RTX 6000 48GB carries almost the same name as its predecessor, yet it is an all-new professional visualization (ProViz) solution based on the Ada Lovelace architecture.

Nvidia says that its RTX 6000 graphics card with 48GB of GDDR6 ECC memory onboard will offer 2x to 4x the performance of the previous-generation RTX A6000, thanks to a massively increased number of CUDA and RT cores as well as the brand-new Ada Lovelace architecture.

"The RTX 6000 GPU's larger L2 cache, a significant increase in number and performance of next-gen cores, and increased memory bandwidth will result in impressive performance gains for the broad Ansys application portfolio," said Dipankar Choudhury, Ansys Fellow and HPC Center of Excellence lead.

Nvidia said its new flagship graphics board for workstations would be available this December. Unfortunately, the company did not reveal the price of the new RTX 6000 48GB card. But given that Nvidia sells the previous-generation RTX A6000 48GB at an MSRP of $6,999, it is reasonable to expect the new, higher-performing card to be priced at least as high.

Anton Shilov
Contributing Writer

Anton Shilov is a contributing writer at Tom’s Hardware. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • Mandark
    Hahahaha. Can’t believe people think it’s actually worth the price for video cards. Have fun paying through the nose Suckers
  • TJ Hooker
    Mandark said:
    Hahahaha. Can’t believe people think it’s actually worth the price for video cards. Have fun paying through the nose Suckers
    Nvidia's workstation cards (formerly known as Quadro) have always been expensive AFAIK. If companies keep buying them I can only assume the cards pay for themselves through increased productivity, at least for certain professional workloads.

    Edit: Or the certified pro drivers are a requirement for their work, and they presumably factor in the cost of the pro graphics cards when billing clients.
  • Casper42
    Dumb Name.
    The RTX 6000 was a Turing card, not that old.
    The Ampere card was the RTX A6000
    And they already announced the L40, which is usually the data center cousin to the top "Quadro".
    So why not just call this the RTX L4000?
  • Phyzzi
    Mandark said:
    Hahahaha. Can’t believe people think it’s actually worth the price for video cards. Have fun paying through the nose Suckers
    People buying these cards in particular aren't using them for gaming, they are using them for rendering movies or simulating black holes or rapid AI training. At that point, the options are to rent space on a bigger computer, which is also expensive and a hassle, or get one or a few of these cards, which might be cheaper over the long haul and offers a lot more flexibility. It's not stupid, it's practical.
  • Exploding PSU
    Regardless of price, function, or use, that's a badass-looking card. I'd love to have one of these inside my PC.
  • MicroCenterIsPrettyGood
    Mandark said:
    Hahahaha. Can’t believe people think it’s actually worth the price for video cards. Have fun paying through the nose Suckers
    It's not for gaming. You're paying a CFD engineer $150,000 a year to make flow geometry decisions on a $2 million product design. If you can make him 10% more productive by cutting down his render times, you've made back the price of the card in 3 months. Half that if he hot-desks.
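For what it's worth, the payback reasoning in the comment above can be sketched as a quick back-of-the-envelope calculation. The figures below (card price, engineer cost, productivity gain) are illustrative assumptions, not numbers from the article:

```python
def payback_months(card_price, annual_engineer_cost, productivity_gain):
    """Months until the productivity savings cover the card's price.

    Assumes the productivity gain translates directly into recovered
    engineering cost, which is the commenter's (simplified) premise.
    """
    annual_savings = annual_engineer_cost * productivity_gain
    monthly_savings = annual_savings / 12
    return card_price / monthly_savings

# Illustrative: $7,000 card (roughly the RTX A6000's MSRP),
# $150,000/yr engineer, 10% productivity gain.
months = payback_months(7_000, 150_000, 0.10)
print(f"Payback: {months:.1f} months")  # Payback: 5.6 months
```

Under these particular assumptions the payback lands closer to six months than three; with a fully loaded employee cost (salary plus overhead) the break-even point shrinks accordingly.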
  • Phyzzi
    exploding_psu said:
    Regardless of price, function, or uses, I find that's a badass looking card. I'd love to have one of these inside my PC.

    I admit, I will probably pay through the nose for one of these, but I won't pay a scalper for one.