FiringSquad gathered a dozen different GPUs

sirkillalot

Distinguished
Jan 16, 2006
http://www.firingsquad.com/hardware/call_of_duty_4_demo_performance/

Conclusion


The Call of Duty 4 demo gives us a brief glimpse of the gameplay and graphics we’ll be able to see in the full game when it’s released in less than a month. Based on that teaser, we can’t wait for the final version of the game.
The demo also gives us a preview of the kind of performance we can expect from the game, and here we saw the GeForce 8800 cards reign supreme, particularly the GeForce 8800 GTX and Ultra. At 1600x1200 with 0xAA/16xAF the GeForce 8800 GTS 640MB trailed the GTX by 19%, and that gap increases as you crank up the screen resolution. Under the increased demands of 4xAA/16xAF, the GTX pulls further away from the GTS 640MB. Here the GeForce 8800 GTS 640MB also pulls away from its 320MB counterpart: the GeForce 8800 GTS 320MB just doesn’t have enough memory to run the game at the settings we chose once you reach high resolutions like 1920x1200 with 4xAA/16xAF. And what about AMD’s Radeon HD 2900 XT?
At worst, the Radeon HD 2900 XT ran about 9% slower than the GeForce 8800 GTS 320MB, which occurred at 1600x1200 with 0xAA/16xAF. As you increase the screen resolution that gap narrows, and by 2560x1600 the card pulls even with the GeForce 8800 GTS 640MB. With AA enabled, the 2900 XT runs faster overall than the GeForce 8800 GTS 320MB and performs anywhere from 12-15% slower than the GeForce 8800 GTS 640MB, although keep in mind that at 2560x1600 with 4xAA/16xAF the cards are only separated by 2.7 FPS in our testing. That’s close enough to call it even, particularly considering the variability in our benchmark runs.
Since the Radeon HD 2900 XT is priced to compete with the GeForce 8800 GTS 640MB, some may consider this a decent showing for the 2900 XT, but we think the card is being held back a little by its drivers: the X1950 XTX performs awfully close to the Radeon HD 2900 XT. CrossFire scaling needs a little more work as well.
If the rumors are true, the GeForce 8800 GTS 320MB will be replaced shortly by NVIDIA’s upcoming G92 GPU. Some leaked documents suggest this card will be outfitted with more stream processors and memory than the GTS 320MB, as well as higher clock speeds: 600MHz on the graphics core (100MHz higher than the GTS today) and 900MHz memory, also 100MHz higher than the GeForce 8800 GTS. The card is expected to utilize a 256-bit memory interface versus the 320-bit interface of today’s cards, but the extra stream processors, memory, and higher speeds should be enough to offset the difference.
Of course, AMD is rumored to have a new mainstream part of their own based on the RV670 GPU. RV670 is expected to drop the 512-bit memory interface found in R600, but will carry over all of R600’s 320 stream processors and consume less power thanks to its smaller 55-nm manufacturing process.
These upcoming GPUs will make a huge impact on the mainstream segment, and as such we suggest waiting if you plan on spending $200-$300 on your next graphics upgrade. These GPUs are expected to arrive sometime next month, which is just in time for the release of Call of Duty 4 and the plethora of other games coming out this year…


 

randomizer

Champion
Moderator
It just never makes sense to me how they can say something like this:

"Under 4xAA, we also begin to see the limitations of the GeForce 8800 GTS 320MB... The card just doesn%u2019t have enough memory to keep up with the 640MB... We couldn%u2019t even run the card at 2560x1600."

They conclude that they can't run it because of a lack of memory? The X1950 Pro has only 256MB and it manages 4 FPS. They really need to work around the memory bugs with that card; it has happened in several reviews now that a 256MB card can run the game but the 320MB card can't.
 

justinmcg67

Distinguished
Sep 24, 2007
I liked that review. I'm not sure if they were using the Cat 7.10 drivers or not, too early to see much (4:00 AM), but the charts were clear enough.