G80 needs fastest CPU article.

November 29, 2006 8:09:28 PM

http://www.tomshardware.com/2006/11/29/geforce_8800_nee...

On this page, the FX-60 gets better FPS at a certain resolution. What do you guys make of it? How is that possible, and why? Thanks.
Anonymous
November 29, 2006 8:21:18 PM

I don't know why, but see how close the 1600 and 2048 results are with AMD; they're too staggered IMO. But again, who knows.

Funny how well AMD performs outdoors and how that advantage vanishes indoors.

Anyway, it's hard to tell, especially since THG tells us none of the in-game settings...
November 29, 2006 8:23:09 PM

Do you think it's because of the IMC, since it has somewhat better bandwidth?
Anonymous
November 29, 2006 8:31:01 PM

The IMC is more about latency than bandwidth, but yes, that could be a cause.

Usually the games that benefit most from a lot of cache (on Intel's architecture) also benefit a lot from the ODMC (IMC).

Still, I find it weird: if it benefits from some architectural feature of the FX, it should show across the board, not only at this resolution.
November 29, 2006 9:04:54 PM

That's exactly what I was thinking. It's like the system hit a speed bump at that resolution for a nanosecond and screwed things up. I have no idea if they tried, but maybe they should have run that benchmark twice or a few more times. I am not implying that Intel should have won; it's just odd that at higher resolutions AMD takes more ground and wins one, but then at the next higher resolution it loses?
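On the point about rerunning the benchmark: a minimal sketch of how repeated runs would expose a one-off hiccup, flagging any run that lands far from the median. The function name and the FPS numbers here are hypothetical, just to illustrate the idea:

```python
import statistics

def find_outlier_runs(fps_runs, tolerance=0.10):
    """Flag runs more than `tolerance` (fraction) away from the median FPS."""
    median = statistics.median(fps_runs)
    return [f for f in fps_runs if abs(f - median) / median > tolerance]

# Hypothetical FPS results from five repeated runs at the same resolution
runs = [58.1, 57.6, 58.4, 49.9, 57.9]
print(find_outlier_runs(runs))  # -> [49.9]
```

A single outlier like that would explain one resolution bucking the trend without implying anything architectural.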
November 30, 2006 12:43:11 AM

Significant differences between the two systems, other than CPUs, in order of probable significance:

- Memory: DDR @ 400 MHz (CL3-4-4-8) vs DDR2 @ 800 MHz (CL5-5-5-15)

- Chipset: nForce4 vs nForce 680i

- CPU interface: 1.0 GHz HT link vs 1066 MHz FSB
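As a rough check on the memory difference above, first-word CAS latency can be sketched from the listed timings, assuming standard DDR clocking (memory clock is half the transfer rate) and ignoring the other timings:

```python
def cas_latency_ns(data_rate_mt_s, cas_cycles):
    """First-word CAS latency in ns from data rate (MT/s) and CAS cycles."""
    clock_mhz = data_rate_mt_s / 2  # DDR transfers twice per clock
    return cas_cycles / clock_mhz * 1000  # cycles / MHz -> us, *1000 -> ns

ddr_400 = cas_latency_ns(400, 3)   # 15.0 ns for DDR-400 CL3
ddr2_800 = cas_latency_ns(800, 5)  # 12.5 ns for DDR2-800 CL5
```

By this crude measure the DDR2 system is actually slightly ahead on CAS latency, so the raw memory timings alone don't obviously favor the FX-60 platform.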


IMC being more about latency than bandwidth sounds like a very likely cause... but if you compare the results on the 8800 GTS, the tables turn so I suspect it may have more to do with the tricky nature of benchmarking Oblivion or other margin of error factors than anything particularly telling.


IMO, the article actually supports the old adage that the CPU isn't very important for gaming. While the frame rates at 1024x768 shot up on the Conroe, they were all well over 100 anyway... except in the one test where FX-60 was able to keep up. I haven't scanned all of the results thoroughly, but I doubt there's any case where upgrading to the Conroe affected the playability of the game.

This doesn't totally shoot down CPUs though. There is the belief (that I personally agree with) that while CPUs may not be that important for average frame rates, they are critical for "smooth" gameplay. I'd be very curious to see the results with only minimum frame rates displayed. Also, even with averages shown, the FX-60/8800GTX/outdoor Oblivion test was clearly CPU limited at 1024x768 without clearing the 60 fps mark. This suggests that CPU usage isn't that far behind GPU usage and that much of the next generation of games will take advantage of a Conroe.
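On the minimum-frame-rate point: a small sketch of why averages can hide a stutter. The frame-time log below is hypothetical, chosen so one 50 ms hitch barely dents the average:

```python
def fps_stats(frame_times_ms):
    """Average and minimum FPS from per-frame render times in milliseconds."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Hypothetical frame-time log: mostly 10 ms frames, with one 50 ms hitch
times = [10.0, 10.0, 10.0, 50.0, 10.0]
avg, minimum = fps_stats(times)  # avg = 84.0 fps, minimum = 20.0 fps
```

An 84 fps average looks fine, but the 20 fps minimum is exactly the kind of thing a minimum-frame-rate chart would catch and an average would not.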

EDIT: All that said, I'm very happy with my 30% overclocked 4200+ and 8800 GTX combo and feel no need to upgrade my CPU.