
Custom GeForce RTX 3090 Ti Features Quad-Slot Cooler

(Image credit: VideoCardz)

Nvidia's GeForce RTX 3090 Ti promises to vie for a spot among the best graphics cards for gaming when it becomes available in a week or two, but this card will require a lot of power and a lot of cooling. Early pictures of GeForce RTX 3090 Ti graphics cards from Nvidia's partners show massive cooling systems that are 3.5 – 4 slots wide.

Nvidia's GeForce RTX 3090 Ti reportedly uses a fully-fledged GA102 GPU with 10,752 CUDA cores (up from 10,496 on the GeForce RTX 3090) mated with 24GB of GDDR6X memory running at 21Gbps across a 384-bit interface. The card has every chance of becoming the world's fastest graphics card, but that also means rather extreme power consumption. Reports indicate that the GeForce RTX 3090 Ti will have a recommended total board power (TBP) of around 450W, and some of Nvidia's partners may go as high as a whopping 480W. It is better not to think about how much such a board will cost; keep in mind that Nvidia's current GeForce RTX 3090 has a power rating of 350W.

But 450W – 480W of thermal power needs some excellent cooling, and with air cooling, things get complicated. Nvidia's GA102 is a highly complex GPU, and with its 28.3 billion transistors, it is power-hungry. GDDR6X memory, with its PAM4 signaling, also consumes a lot of power, so the card emits vast amounts of heat.

This situation is where rather large cooling systems come into play. VideoCardz has gathered images of Nvidia's GeForce RTX 3090 Ti partner cards (including those from Colorful and EVGA); they either feature air coolers that are 3.5 – 4 slots wide or use hybrid liquid cooling.

A quad-slot cooling system is, without any doubt, remarkable. Yet it reminds us of some other graphics subsystems we have encountered before.

The year was 2006, and the battle for the best graphics subsystem between ATI Technologies and Nvidia was raging. It was then possible to increase graphics performance by adding GPUs to a subsystem, so Nvidia and ATI introduced their respective two-way multi-GPU technologies, SLI and CrossFire. But this was not enough for Nvidia, which figured there was room for ultra-high-end graphics subsystems involving four GPUs.

To make 4-way SLI possible, Nvidia developed its dual-GPU GeForce 7900 GX2 (for PC OEMs; for regular users, it followed up with the GeForce 7950 GX2). Quad SLI never worked quite right (though interest in the setup was extremely high), but it set the tone for extreme performance and power consumption.

Nvidia's GeForce 7900 GX2 consumed some 110W of power and required a hefty power supply unit (by the standards of 2006), and two such cards needed 220W of power. Today, a single chip in its top configuration running at high clocks can use some 450W or more. Funnily enough, the power consumption of a single modern graphics processor is comparable to that of four GPUs back in the day.

Then again, Nvidia's GA102 is faster and more capable than nearly every GPU ever designed. So maybe it is worth it?

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.

  • DRagor
    If it looks like brick and weights as brick ... is it brick? :)
    Reply
  • btmedic04
    yikes evga. you shouldve gone with the 3 slot rear io connector like you had on the rtx 2xxx series. i bet the sag is going to be baaaaad
    Reply
  • Neilbob
    I'm sure there are going to be a whole bunch of people who pair this (or something similar) with a 12900KS and then proceed to say 'look at how big I am'.

    It's that last 5% that makes all the difference, of course.
    Reply
  • Alvar "Miles" Udell
    And there's people like me who think it should have been liquid cooled. If you're paying that much for a GPU another $100 for a liquid cooling system is nothing.
    Reply
  • jacob249358
    it seems like they are trying to bridge the gap between rtx 3000 and 4000
    Reply
  • -Fran-
    I'm looking forward to the reviews (if nVidia dares) and the following mental gymnastics from the nVidia cult.

    Regards xD
    Reply
  • exploding_psu
    That thiccness of the GPU reminds me of old Chinese GPUs like when Colorful tried to passively cool a GTX 680 or when Yeston got their hands on the 7970. It was massive.
    Yes, I was a sucker for Chinese GPUs back then, probably still am

    EDIT : added some techpowerup link
    Reply
  • InvalidError
    btmedic04 said:
    yikes evga. you shouldve gone with the 3 slot rear io connector like you had on the rtx 2xxx series. i bet the sag is going to be baaaaad
    Since it is a 4-high HSF, it should probably have a 4-wide IO bracket with appropriate mechanical connection to the heatsink for support. A 4-wide bracket wouldn't do much good for support if it is only held onto the GPU by two PCB screws.
    Reply
  • jp7189
    InvalidError said:
    Since it is a 4-high HSF, it should probably have a 4-wide IO bracket with appropriate mechanical connection to the heatsink for support. A 4-wide bracket wouldn't do much good for support if it is only held onto the GPU by two PCB screws.
    It also would be nice to have vents spread across 4 slots to help get heat out of the case.
    Reply
  • jp7189
    Thanks for the trip down memory lane. It's funny how marketing can warp my mind over the years. I have a distinct memory of thinking the multi-GPU cards were outlandishly ridiculous for cost and power consumption ($600 , 110watts). Today a top tier halo card for $600 @110watts would be too good to be true.
    Reply