I was thinking about overclocking the 6600GT SLI I've got here, since next-gen games are getting a bit heavy for it. It's a single card; I'm not using the SLI option. There was no price difference between the normal and the SLI version at my hardware store, so I picked the SLI one.
I got CoolBits and it says my card is clocked at 500 MHz core and 2 GHz memory. 2 GHz? Isn't that a bit much? I thought it was clocked around 1 GHz stock? Anyway, I void my warranty and figure I'll let CoolBits determine the optimal frequencies so I have a starting point. It comes up with something like 557 MHz and 2.1 GHz. Great! I press the "test" button and it fails. No problem, I reduce the clocks and... it fails again. Hmm. I set the frequencies to 501/2,001 and... another failure. So much for CoolBits. I disable it (and regain my warranty in the meantime; lmao).
I get RivaTuner and reboot the system to determine the clock frequencies. It comes up with a 500 MHz core and 1 GHz memory. That's a lot more plausible, but why did CoolBits say 2 GHz then? Which one of them is right?
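If the discrepancy is just a single- vs double-data-rate reporting convention, the arithmetic would look like this (a sketch; the 500 MHz real memory clock is an assumption based on the 6600GT's typical reference specs):

```python
# Hypothetical illustration: the same GDDR chip, reported three ways.
real_clock_mhz = 500                  # assumed actual clock of the memory cells

# DDR memory transfers data twice per clock cycle, so the "effective"
# rate is double the real clock. RivaTuner's 1 GHz matches this figure.
ddr_rate_mhz = real_clock_mhz * 2     # 1000 MHz effective

# A tool that takes the already-doubled DDR rate and doubles it again
# would report 2 GHz, which would explain the CoolBits reading.
double_counted_mhz = ddr_rate_mhz * 2 # 2000 MHz

print(real_clock_mhz, ddr_rate_mhz, double_counted_mhz)  # 500 1000 2000
```

In other words, both tools could be reading the same silicon; one of them just applies the ×2 DDR multiplier one time too many.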
Anyway, I start overclocking and set them to 550 MHz and 1.1 GHz. From articles I've read on overclocking the 6600, those frequencies shouldn't be too much of a problem. I fail the RivaTuner tests, so I decide to disable the test and see what happens. I apply the changes and I'm off.
Next, I start a game, and within a minute or two I get purple artifacts flickering everywhere. The FPS increase is only 5-10; I'd be happy with that if only those artifacts weren't there. I check the temperature of the card through the ForceWare menu and it's at a reasonable 50-60 °C. I decide to go back to stock settings and write this post.
So that's my first attempt at overclocking a GPU. I'm on stock cooling right now, but I don't think that should be a problem, since 60 °C isn't that hot for a 6600GT SLI. For the record, I'm not interested in getting additional cooling right now.
Can somebody explain to me what I'm doing wrong here? It's my first time overclocking a GPU and I'm eager to learn. I know I should go up in increments instead of jumping an immediate +50/+100, but lower frequencies aren't interesting if even the +50/+100 only gives a 5-10 FPS increase.
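For what it's worth, the incremental approach can be sketched as a simple loop: step one clock at a time by a small amount, test for artifacts at each step, and back off to the last stable value on failure. The stock/target clocks and step sizes below are hypothetical numbers chosen to match the 6600GT figures discussed above, not a recommendation:

```python
def oc_steps(stock_mhz, target_mhz, step_mhz):
    """Yield candidate clocks from stock up to target in fixed increments.

    At each yielded clock you would run a stress test; the last clock
    that passes without artifacts is your stable overclock.
    """
    clock = stock_mhz
    while clock < target_mhz:
        clock = min(clock + step_mhz, target_mhz)
        yield clock

# Core: 500 -> 550 MHz in 10 MHz steps; memory: 1000 -> 1100 MHz in 25 MHz steps.
core_candidates = list(oc_steps(500, 550, 10))   # [510, 520, 530, 540, 550]
mem_candidates = list(oc_steps(1000, 1100, 25))  # [1025, 1050, 1075, 1100]
print(core_candidates, mem_candidates)
```

The point of stepping is diagnostic, not just safety: if artifacts appear at, say, 1,050 MHz memory but never below it, you know the memory (not the core) is the limiting factor, instead of learning nothing from a single +50/+100 jump that fails.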
No two cards will be the same; some OC more than others, some not at all (7900 GTOs spring to mind). The reason CoolBits reports 2 GHz for the GDDR depends on the driver version you're using: on 84.21 I saw single rate, on 91.31 I see single rate, but on the latest one I see double rate. Who made your card? And could you swap it for a different make?