Nvidia GeForce 9800 GT And ATI Radeon 4870 Get A 225 Watt TDP

San Francisco (CA) - The battle for mainstream graphics supremacy will enter a new phase this month, with both Nvidia and AMD’s graphics unit ATI expected to introduce their latest contenders within two weeks. And if power supply is any indication, both products could swim against the green IT trend and turn out to be very power hungry.

Nvidia’s and ATI’s next graphics cards may be vastly different overall, but they will share some features, such as 512 MB of memory (GDDR3 on the GeForce, GDDR5 on the Radeon) as well as power and cooling requirements. We recently learned that the GeForce 9800 GT and Radeon 4870, both of which will aim at the sub-$300 segment, will come with two 6-pin PEG (PCI Express Graphics) connectors, each supplying 75 watts of power.

With the PCI Express slot providing 75 watts of juice and the two 6-pin connectors supplying another 150 watts, there is a theoretical supply of up to 225 watts. That is quite a jump in the sub-$300 segment and it makes you wonder what the reasons may be, but we are certain that overclockers won’t mind. One of the key limitations for extreme overclocking of previous-generation parts was the fact that the cards (3850, 3870, 8800 GT, 8800 GTS 512) had only one power connector.

And the reason why ATI was so popular with overclockers was the fact that you could tell the motherboard to supply 150 watts, and ATI’s 3870 would draw those 150 watts from the motherboard (total: 150 watts from the slot + 75 watts from the single connector = 225 watts). With two power connectors in place on the 4850 and 4870, overclockers do not need any hacks to get access to 225 watts. In fact, the 4870 will top out at 300 watts (150 watts from the motherboard plus 75+75 watts from the connectors). The 9800 GT will be restricted to 225 watts (75 watts from the motherboard plus 75+75 watts from the connectors).
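
A minimal sketch of those theoretical budgets, assuming nothing beyond the per-source limits cited above (the Python function is purely illustrative, not a real power model):

    # Theoretical PCI Express power budgets: 75 W from a standard PCIe x16
    # slot and 75 W from each 6-pin PEG connector. A 150 W slot value models
    # the overclocking hack described above.
    PCIE_SLOT_W = 75
    PEG_6PIN_W = 75

    def power_budget(slot_watts, num_connectors):
        """Maximum theoretical board power in watts."""
        return slot_watts + num_connectors * PEG_6PIN_W

    print(power_budget(PCIE_SLOT_W, 1))  # previous generation, one connector: 150
    print(power_budget(150, 1))          # 3870 with the slot hack: 225
    print(power_budget(PCIE_SLOT_W, 2))  # 9800 GT / 4870 at stock: 225
    print(power_budget(150, 2))          # 4870 with the slot hack: 300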

Overclocking numbers for the Radeon 4870 have already been surfacing on the web, and we are interested to see how the 55nm G92b chip will compete against ATI’s 55nm RV770XT GPUs. Actually, the RV770XT is now looking more and more like a competitor for the GTX 260 and 280 - with a much lower price tag. Keep in mind that you will be able to buy two 4870 boards for the price of a single GTX 280.

  • homerdog
    Just because the cards can be supplied with 225W does not mean they have a 225W TDP. I don't know much, but I do know this.
  • The 4870 is not supposed to compete with the GTX 200 series... at most it will try to compete with the GTX 260, but the GTX 280 will crush it. The 4870 X2, I think, will beat the GTX 280 (I thought differently at first). It just seems ATI is taking a step in the right direction while NVIDIA is still concentrating on one massive killer chip, when two smaller medium chips would win.
  • terror112
    But that will only happen if the game supports SLI or Crossfire. Since most new games now do, though, that problem has mostly been ruled out. I just can't wait for the upcoming series... it's been too long...
  • fulle
    I'm too paranoid about micro stutter to really want SLI or Crossfire. While SLI/Crossfire may deliver higher frame rates, they may not be perceived by the eye as more fluid... and at the end of the day, that's what matters.

    More on subject, I really hope these cards don't require that sort of wattage to run at stock speeds. Sounds hot, and unnecessary for a 55nm midrange GPU.
  • LAN_deRf_HA
    ATI claims to have fixed the micro stuttering issue. With the price, the performance, and the DX10.1 support (which we've seen does improve performance), it's hard for me to see the point in owning one of Nvidia's new cards. Sadly, I know people will still buy more of Nvidia's cards simply because of how long ATI would have to stay on top for the news to filter down to the average buyer, who doesn't research these products.
  • blppt
    I long for the days when a product like PowerVR2 could come out at a good price point with high efficiency (thanks to its tiling architecture) and relatively low power consumption. Too bad they were never able to design a hardware T&L engine to work with the PowerVR architecture.

    Pretty soon we are all going to need small nuclear reactors in our cases to power these "brute force" wonders from ATI and nVidia. What happened to design elegance?
  • fulle
    @blppt
    Interesting that someone brought up PowerVR2... I actually owned a Kyro II back in the day, and a Dreamcast. There were good concepts there, like not "wasting fillrate and texture bandwidth on pixels that are not visible in the final image." TBDR (tile-based deferred rendering) and HSR (hidden surface removal) were good ideas that squeezed more performance out of otherwise inferior hardware.

    I wish Nvidia and ATI/AMD would try to do things more efficiently, rather than try to figure out how to feed their beasts more watts. I suppose this is still great news to overclockers though.
  • Lozil
    Good to see two graphics giants fight... the benefit will surely go to customers...
    BTW, is GDDR5 for ATI versus GDDR3 for Nvidia going to make much of a difference...?

    http://free-and-useful.blogspot.com
  • hellwig
    I'm not up to date on motherboards and PCIe, but how do a motherboard and a PCIe slot supply 75W each to a card that is plugged directly into the PCIe slot (which resides on the motherboard)? I was pretty sure 75W was the total the board supplied through the PCIe slot. If the board itself supplies an additional 75W, where does it come from? And how does that 75W get to the card if not through the PCIe slot (which is apparently already supplying its own 75W)?
  • apache_lives
    GDDR5 versus GDDR3 - it all comes down to power consumption, maximum memory clock speeds and so on, not raw performance. The higher clock speeds do give the GPU more memory bandwidth, which can lead to more performance, but that alone will do nothing by itself (the rough calculation below illustrates the difference).
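
To put rough numbers on that last point: GDDR3 is double data rate (two transfers per pin per clock), while GDDR5 effectively performs four transfers per clock, so at comparable clock speeds it roughly doubles peak bandwidth. A minimal sketch, assuming hypothetical clock speeds and a 256-bit bus rather than confirmed specs:

    # Peak theoretical memory bandwidth = clock * transfers per clock * bus width.
    # GDDR3 runs at double data rate (2 transfers/clock); GDDR5 effectively
    # performs 4 transfers/clock. The clock speeds below are hypothetical.
    def bandwidth_gb_s(clock_mhz, transfers_per_clock, bus_width_bits):
        """Peak theoretical bandwidth in GB/s."""
        return clock_mhz * 1e6 * transfers_per_clock * bus_width_bits / 8 / 1e9

    print(bandwidth_gb_s(1000, 2, 256))  # GDDR3 at 1000 MHz: 64.0 GB/s
    print(bandwidth_gb_s(900, 4, 256))   # GDDR5 at 900 MHz: 115.2 GB/s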