We used an Intel Core i7-920 CPU for testing, overclocked to 3.8 GHz to rule out any potential CPU bottleneck.
This time, our technical tables are more comprehensive, detailing how the clock rates of our retail test cards differ from those of standard reference cards. Three generations of manufacturing technology are represented here as well: 65, 55, and 40 nm all influence a graphics chip's maximum power consumption and operating temperature. The tables also illustrate how the overclocking on these specialty models boosts each chip's fill rates and raw computing power, as the quick calculation below shows.
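Those fill-rate and compute figures follow directly from clock rates and unit counts, so readers can sanity-check any factory-overclocked card themselves. Here is a minimal sketch of that arithmetic in Python; the clocks and unit counts in the example are the GeForce GTX 285's published reference specs, and the helper function names are our own:

```python
# Back-of-the-envelope GPU throughput from clock rates and unit counts.
# The reference numbers below are the GeForce GTX 285's published specs;
# substitute the clocks from our tables to see what a factory overclock buys.

def pixel_fill_rate(core_mhz: float, rops: int) -> float:
    """Pixel fill rate in Gpixels/s: ROPs x core clock."""
    return rops * core_mhz / 1000.0

def texel_fill_rate(core_mhz: float, tmus: int) -> float:
    """Texture fill rate in Gtexels/s: texture units x core clock."""
    return tmus * core_mhz / 1000.0

def shader_gflops(shader_mhz: float, alus: int, ops_per_clock: int = 3) -> float:
    """Single-precision GFLOPS; GT200 marketing figures assume 3 ops/clock (MAD + MUL)."""
    return alus * shader_mhz * ops_per_clock / 1000.0

# GeForce GTX 285 reference: 648 MHz core, 1,476 MHz shaders, 32 ROPs, 80 TMUs, 240 ALUs
print(f"{pixel_fill_rate(648, 32):.1f} Gpixels/s")   # 20.7
print(f"{texel_fill_rate(648, 80):.1f} Gtexels/s")   # 51.8
print(f"{shader_gflops(1476, 240):.0f} GFLOPS")      # 1063
```

A card overclocked 10% on both clocks scales all three figures by the same 10%, which is exactly the pattern the tables capture.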
Only one ATi card? What happened to all those OC'd 4890s?
These are the same boards that were included in the recent charts update; the selection is largely contingent on what vendors submit for evaluation. We do have an upcoming review comparing Sapphire's new 1 GHz Radeon HD 4890 against the stock 4890, though. It'll be up in the next couple of weeks.
Am I the only one who finds this article awkward? Looking at the results for the ATi cards in The Last Remnant makes me wonder what went wrong... I mean, it's the UT3 engine... why such low performance?
I would love that card. I would have to replace my whole system to run it properly, however.
I want $1500 now... an i7 920 (why get anything better? They all seem to be godly overclockers) and an EVGA 295.
How about a test suite with two EVGA GTX 295s in quad-SLI? I know there are driver issues, but it would be fun to see what that configuration could do regardless. Along with seeing how far Tom's can OC the EVGA GTX 295.
Actually... Tom's just needs to do a new system-building recommendation roundup. I find them useful personally, and would have used one myself had my cash source not lost his job...
1) Where are the overclocking results?
2) Bad choice of benchmarks: too many old DX9-based graphics engines (FEAR 2, Fallout 3, Left 4 Dead at >100 FPS) or EndWar, which is capped at 30 FPS. Where is Crysis?
3) 1920x1200 as the highest resolution for high-end cards?
Seems the cumulative benchmark graphs are going to be a bit skewed if The Last Remnant results are included... it's fairly obvious from the numbers that something odd is going on with that game.
Worst article in a long time. Why compare how old games perform on NVIDIA's high-end graphics cards? Don't get me wrong, I like them, but where's all the Atomic stuff from Sapphire? Asus and XFX had some good ATI stuff too. So what... you just took ATI's reference cards and tested them? :| That is just wrong.