Best Of The Best: High-End Graphics Card Roundup

Power Consumption, Noise Levels, And Temperature Readings

Power consumption is measured in watts for the complete test platform. The 2D value is taken at idle on a standard Windows desktop (with Aero turned off to establish a truly minimal baseline), while the 3D value is recorded with both the CPU and graphics card under heavy load to capture a peak figure. Readings are taken at the wall socket; the power supply we used has a vendor-quoted average efficiency of roughly 82.4%.
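Because the meter sits on the AC side of the power supply, the wall-socket numbers overstate what the components actually draw. As a rough illustration (not part of the original test procedure), the sketch below estimates the DC-side load by multiplying a wall reading by the vendor-quoted 82.4% average efficiency; the function name and example value are ours.

```python
# Rough sketch (not from the article): estimate DC-side component load from
# an AC wall-socket reading, assuming the vendor-quoted ~82.4% PSU efficiency.
PSU_EFFICIENCY = 0.824  # average figure; real efficiency varies with load

def estimated_dc_watts(ac_watts: float, efficiency: float = PSU_EFFICIENCY) -> float:
    """Approximate power actually delivered to the test platform's components."""
    return ac_watts * efficiency

# Example: the stock GeForce GTX 295 platform drew 468 W at the wall under 3D load.
print(f"{estimated_dc_watts(468):.0f} W")  # roughly 386 W for the whole platform
```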

Temperatures and Motherboard Slots
Graphics Card | 2D Temp (°C) | 3D Temp (°C) | Slots
EVGA GeForce GTX 295 Hydro Copper (GeForce GTX 295, 2 x 896 MB) | 39 | 68 | Double
GeForce GTX 295 (2 x 896 MB) | 39 | 66 | Double
Zotac GTX285 AMP Edition (GeForce GTX 285, 1,024 MB) | 45 | 85 | Double
MSI N285GTX SuperPipe OC (GeForce GTX 285, 1,024 MB) | 42 | 87 | Double
GeForce GTX 285 (1,024 MB) | 45 | 85 | Double
MSI N280GTX OC HydroGen (GeForce GTX 280, 1,024 MB) | 34 | 51 | Single
GeForce GTX 280 (1,024 MB) | 45 | 86 | Double
BFG GeForce GTX 275 (GeForce GTX 275, 896 MB) | 47 | 92 | Double
GeForce GTX 275 (896 MB) | 47 | 92 | Double
GeForce GTX 260 216 SPs (896 MB) | 45 | 81 | Double
GeForce GTX 260 (896 MB) | 48 | 90 | Double
GeForce 9800 GTX+ (512 MB) | 48 | 81 | Double
GeForce 9800 GTX (512 MB) | 55 | 74 | Double
Radeon HD 4890 (1,024 MB) | 60 | 80 | Double
Radeon HD 4870 X2 (2 x 1,024 MB) | 49 | 79 | Double
Radeon HD 4870 (512 MB) | 60 | 74 | Double
Radeon HD 4850 (512 MB) | 79 | 94 | Single
Radeon HD 4770 (512 MB) | 52 | 72 | Double
Power Consumption
Graphics Card | 2D Watts | 3D Watts | TDP Watts | Power Connectors
EVGA GeForce GTX 295 Hydro Copper (GeForce GTX 295, 2 x 896 MB) | 188 | 532 | 289 | 1 x 6-pin + 1 x 8-pin PCIe
GeForce GTX 295 (2 x 896 MB) | 188 | 468 | 289 | 1 x 6-pin + 1 x 8-pin PCIe
Zotac GTX285 AMP Edition (GeForce GTX 285, 1,024 MB) | 150 | 356 | 183 | 2 x 6-pin PCIe
MSI N285GTX SuperPipe OC (GeForce GTX 285, 1,024 MB) | 156 | 348 | 183 | 2 x 6-pin PCIe
GeForce GTX 285 (1,024 MB) | 150 | 348 | 183 | 2 x 6-pin PCIe
MSI N280GTX OC HydroGen (GeForce GTX 280, 1,024 MB) | 163 | 358 | 236 | 1 x 6-pin + 1 x 8-pin PCIe
GeForce GTX 280 (1,024 MB) | 155 | 347 | 236 | 1 x 6-pin + 1 x 8-pin PCIe
BFG GeForce GTX 275 (GeForce GTX 275, 896 MB) | 156 | 355 | 219 | 2 x 6-pin PCIe
GeForce GTX 275 (896 MB) | 156 | 351 | 219 | 2 x 6-pin PCIe
GeForce GTX 260 216 SPs (896 MB) | 150 | 295 | 182 | 2 x 6-pin PCIe
GeForce GTX 260 (896 MB) | 154 | 330 | 182 | 2 x 6-pin PCIe
GeForce 9800 GTX+ (512 MB) | 165 | 277 | 141 | 2 x 6-pin PCIe
GeForce 9800 GTX (512 MB) | 170 | 278 | 156 | 2 x 6-pin PCIe
Radeon HD 4890 (1,024 MB) | 182 | 312 | 190 | 2 x 6-pin PCIe
Radeon HD 4870 X2 (2 x 1,024 MB) | 234 | 465 | 286 | 1 x 6-pin + 1 x 8-pin PCIe
Radeon HD 4870 (512 MB) | 191 | 288 | 157 | 2 x 6-pin PCIe
Radeon HD 4850 (512 MB) | 166 | 270 | 114 | 1 x 6-pin PCIe
Radeon HD 4770 (512 MB) | 152 | 199 | 80 | 1 x 6-pin PCIe
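For context on the connector column, the PCI Express specification budgets 75 W for the x16 slot, 75 W per 6-pin plug, and 150 W per 8-pin plug. The short sketch below (an illustration added here, not part of the original article) sums that budget for a few configurations from the table and compares it with the listed TDPs.

```python
# Illustration: compare listed TDPs against the power a card's slot and PCIe
# plugs are specified to deliver (75 W slot, 75 W per 6-pin, 150 W per 8-pin).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def power_budget(six_pin: int, eight_pin: int) -> int:
    """Specified power budget in watts for a given connector configuration."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

# TDP and connector counts taken from the table above.
cards = {
    "GeForce GTX 295": (289, 1, 1),   # 1 x 6-pin + 1 x 8-pin
    "Radeon HD 4890":  (190, 2, 0),   # 2 x 6-pin
    "Radeon HD 4770":  (80, 1, 0),    # 1 x 6-pin
}

for name, (tdp, six, eight) in cards.items():
    budget = power_budget(six, eight)
    print(f"{name}: {tdp} W TDP vs. {budget} W budget ({budget - tdp} W headroom)")
```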
Noise Levels Measured
Graphics Card | 2D dB(A) | 3D dB(A) | Slots | Fan Diameter
EVGA GeForce GTX 295 Hydro Copper (GeForce GTX 295, 2 x 896 MB) | Water cooler | Water cooler | Double | Water cooler
Zotac GTX285 AMP Edition (GeForce GTX 285, 1,024 MB) | 37.9 | 51.4 | Double | 75 mm
MSI N285GTX SuperPipe OC (GeForce GTX 285, 1,024 MB) | 36.1 | 38.7 | Double | 2 x 65 mm
GeForce GTX 285 (1,024 MB) | 37.9 | 51.4 | Double | 75 mm
MSI N280GTX OC HydroGen (GeForce GTX 280, 1,024 MB) | Water cooler | Water cooler | Single | Water cooler
GeForce GTX 280 (1,024 MB) | 38.0 | 45.4 | Double | 75 mm
BFG GeForce GTX 275 (GeForce GTX 275, 896 MB) | 36.8 | 44.2 | Double | 75 mm
GeForce GTX 275 (896 MB) | 36.8 | 44.2 | Double | 75 mm
GeForce GTX 260 216 SPs (896 MB) | 37.5 | 41.2 | Double | 75 mm
GeForce GTX 260 (896 MB) | 37.8 | 53.8 | Double | 75 mm
GeForce 9800 GTX+ (512 MB) | 36.6 | 41.4 | Double | 73 mm
GeForce 9800 GTX (512 MB) | 37.2 | 44.8 | Double | 70 mm
Radeon HD 4890 (1,024 MB) | 36.7 | 48.4 | Double | 73 mm
Radeon HD 4870 X2 (2 x 1,024 MB) | 51.2 | 60.4 | Double | 73 mm
Radeon HD 4870 (512 MB) | 38.0 | 49.4 | Double | 73 mm
Radeon HD 4850 (512 MB) | 36.2 | 47.9 | Single | 60 mm
Radeon HD 4770 (512 MB) | 36.3 | 38.5 | Double | 70 mm
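As a quick way to read the dB(A) columns: every 3 dB roughly doubles the sound power, and a 10 dB increase is commonly perceived as about twice as loud. The snippet below (added for illustration, not part of the original measurements) converts a difference between two readings into a sound-power ratio.

```python
# Illustration: turn a dB(A) difference into a sound-power ratio.
def sound_power_ratio(louder_db: float, quieter_db: float) -> float:
    """How many times more sound power the louder reading represents."""
    return 10 ** ((louder_db - quieter_db) / 10)

# Example from the table: GeForce GTX 260 (53.8 dB(A) in 3D) vs. the
# 216-SP variant (41.2 dB(A)).
print(f"{sound_power_ratio(53.8, 41.2):.0f}x the sound power")  # about 18x
```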
Comments
  • Anonymous
    Only one ATi card? What happened to all those OC'd 4890s?
    23
  • Anonymous
    And those HAWX benchmarks look ridiculous. ATi should wipe the floor with Nvidia there. Of course you didn't turn DX10.1 support on. Bastard...
    8
  • cangelini
    quarz: Only one ATi card? What happened to all those OC'd 4890s?


    These are the same boards that were included in the recent charts update, and are largely contingent on what vendors submit for evaluation. We have a review upcoming comparing Sapphire's new 1 GHz Radeon HD 4890 versus the stock 4890. It'll be up in the next couple of weeks, though.
    1
  • ohim
    Am I the only one who finds this article awkward? Looking at the tests done on ATI cards in The Last Remnant makes me wonder what went wrong ... I mean, it's the UT3 engine ... why such low performance?
    4
  • curnel_D
    Ugh, please tell me that The Last Remnant hasn't been added to the benchmark suite.

    And I'm not exactly sure why the writer decided to bench EndWar instead of World in Conflict. Why is that, exactly?

    And despite Quarz2's apparent fanboyism, I think HAWX would have been better benched under DX10.1 for the ATI cards, using the highest stable settings instead of dropping back to DX9.
    10
  • anamaniac
    The EVGA 295 is the stuff gods game with.

    I would love that card. I would have to replace my whole system to run it properly, however.
    I want $1,500 now... i7 920 (why get better? They all seem to be godly overclockers) and an EVGA 295.

    How about a test of two EVGA GTX 295s in a quad-GPU configuration? I know there are driver issues, but it would be fun to see what it could do regardless, along with seeing how far Tom's can OC the EVGA GTX 295.
    Actually... Tom's just needs to do a new system-building recommendation roundup. I find them useful personally, and I would have used one myself had my cash source not lost his job...
    -12
  • Anonymous
    Weird test:
    1) Where are the overclocking results?
    2) Bad choice of benchmarks: too many old DX9-based graphics engines (F.E.A.R. 2, Fallout 3, Left 4 Dead at >100 FPS) or EndWar, which is limited to 30 FPS. Where is Crysis?
    3) 1920x1200 as the highest resolution for high-end cards?
    16
  • EQPlayer
    Seems that the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included in there... it's fairly obvious something odd is going on looking at the numbers for that game.
    4
  • armistitiu
    Worst article in a long time. Why compare how old games perform on NVIDIA's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good stuff from ATI too. So what, you just took the reference cards from ATI and tested them? :| That is just wrong.
    9
  • pulasky
    WOW, what a piece of s********** this """"""review"""""" is. Noobidia pays well these days.
    -7
  • darkpower45
    OK, I tried playing The Last Remnant on my comp with my 4870 X2 and it failed hardcore >.< The game itself is ridiculously boring, too. Sooo why is it added to the benching list?? *shakes head* Makes me sad...
    -1
  • guusdekler
    I find it a shortcoming that these tests do not include the 3DMark Vantage suite.
    OK, there aren't many games using DX10, but some very good ones do!

    That's the reason I've switched to Vista.

    And enough people have done so with me to justify a proper DX10 benchmark.
    1
  • Luscious
    No mention of the GTX 285 2GB version? I'm planning on picking up three of these for a tri-SLI Core i7 build, all water-cooled and overclocked.
    -6
  • Ellimist
    Well, I'm running a factory-overclocked GTX 285, only because I like solid drivers and DAAMIT doesn't seem to be able to provide those consistently. That's been my biggest problem with picking up an ATI card.

    This review, however, is terrible. The benchmark selection is dated, if nothing else. Even Tom's other recent reviews have used better benchmarks than this.
    -9
  • sosofm
    This benchmark is not fair to ATI!!!
    10
  • IronRyan21
    Let's see some 3DMark Vantage, please.
    3
  • drealar
    Like with car review magazines (like the one my friend works for), I THINK they only have cards that were submitted to them, and (not sure if this is the case with Tom's) they're only lent out for a limited number of days.

    Although I'm not very satisfied (because of the lack of ATI cards in your possession), I thank you for the review with Fallout, Left 4 Dead, and The Last Remnant under DX9. Yup, I'm still using XP because the bog-down with Vista is too noticeable on my rig.

    1920x1200 as the minimum threshold? Cool, as my 23" is limited to 1920x1080 anyway :P
    0
  • JeanLuc
    I think you guys should cut Tino Kreiss some slack; I believe this is his first publication? Saying things like "this is the worst article I've read in a long time" doesn't actually help. You can blame the choice of benchmark suites on the site's manager/editor, not the author, as he only writes what he is told to. So with that in mind............Cangelini your fired.

    I am curious, though. HAWX is a game sponsored by ATI, so why is the HD 4890 getting its backside tanned by the GTX 275? It's not just a few FPS behind, either; the difference is quite remarkable. And yes, I do realise the BFG GTX 275 is overclocked, but not by a lot.
    1
  • cangelini
    JeanLuc: Cangelini your fired.


    Perhaps you'll hire me as a copy editor for your posts instead? ;-)

    In all seriousness, Tino has been with Tom's German office for a long time. I've asked the staff responsible for testing there to drop in and provide some feedback on the products and benchmarks used here.

    Best,
    Chris
    0
  • linaaslt
    How lame this article is... I was always wondering why they don't use the full potential of the GPU. If ATI is capable of using DX10.1 (and the game uses that technology), why not use it? It might not be fair to Nvidia, but ffs, I believe this kind of review should show the full potential of the products.
    Shame on TH!
    9