I have a problem with my Gigabyte 9800GT (N98TOC-512H) graphics card.
My specs: MSI-P35-NEO2-FR (MS-7345) with a 3.33 GHz Core 2 Duo and 2x2 GB 1066 MHz HyperX DDR2 RAM.
I play games at 1280x1024@125 Hz (21" EIZO CRT), so my specs should be enough for most online FPS games at medium-high detail.
My problem is that if I OC my card at all beyond the default settings, I get lower FPS in in-game benchmarks (RE5, HL2 Lost Coast, etc.). I tested about 7-8 games, and all of them gave me lower FPS (3-4 less), even at 800/2200/1100.
It really doesn't matter whether I OC only the RAM, only the shaders, or everything together, and it also doesn't matter whether I OC by 1% or 20%: I only get better results in synthetic tests like FurMark or 3DMark06, not in games.
I have been working with computers for the last three decades, but I've never seen anything like this before: the higher performance is measurable, just not in games.
If you overclock too far, you will get errors. But newer GPU cards have memory error correction, and that correction takes resources to process.
So you are overclocking but getting errors which negate the overclock, and that actually slows things down overall.
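The trade-off described above can be sketched as a toy model (my own illustration with made-up numbers, not actual GDDR behavior): each memory error forces a costly retry, so past the stable clock range the retries can eat the raw bandwidth gain.

```python
# Toy model: overclocking vs. the cost of correcting/retrying memory
# errors. The error rate and retry penalty are illustrative only.

def effective_throughput(clock_mhz, error_rate, retry_penalty=5.0):
    """Raw transfer rate scaled down by retry overhead: each errored
    transfer is assumed to cost `retry_penalty` extra transfer slots."""
    return clock_mhz / (1.0 + error_rate * retry_penalty)

stock = effective_throughput(900, error_rate=0.0)          # stable at stock
overclocked = effective_throughput(1100, error_rate=0.05)  # errors appear

print(f"stock: {stock:.0f} MT/s-equivalent, overclocked: {overclocked:.0f}")
# A ~22% higher raw clock combined with a 5% error rate and a 5x retry
# penalty ends up *slower* than stock overall (880 < 900).
```

Synthetic benchmarks with regular, prefetch-friendly access patterns might mask this overhead better than the irregular memory traffic of a real game, which would fit the symptoms described.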
The error correction only kicks in while gaming? If you read my post, you know that I get higher results in synthetic benchmarks (just like in the review I linked, I get scores like a factory-default 4870), but not in games. I think I will start testing different driver versions to continue the investigation.