I've recently upgraded to a Point of View 8800GTX (600/1900). Running 3DMark06, I notice that my score is a little low (10,200-ish). My CPU score is comparable to other setups, as is my SM2.0 score, but my SM3.0 score is low.
SM2.0 = 4821
SM3.0 = 4938
I'm expecting SM3.0 to be about 5500, maybe more (based on comparisons to other 8800GTXs in the ORB). I doubt it's the clock speed of the card, as the SM2.0 scores are fine.
2GB OCZ DDR2 at 400MHz
8800GTX - 600/1900 (no OC)
P5B Deluxe WiFi
Seasonic S12 600W
Looking at the FPS in the SM3.0 tests, I'm seeing a drop from around 75 for others to around 50 for mine.
Temps are all OK, both CPU (50s under load) and GPU (70s in the test, 80s under real load).
You were expecting 5500 and you got 4938. I wouldn't worry at all as long as you're getting good performance in your games. Remember, 3DMark06 is a benchmarking tool and doesn't accurately reflect whether gameplay will be good. It's usually pretty close, but sometimes you can get a good score and still have terrible gameplay. If the FPS only dips that low in 3DMark06, I wouldn't worry.
The only problem with that is that the games I have either run at an FPS too high for more to be of benefit (Half-Life 2 at >150fps), or they're good but could be better. The Crysis demo, for instance, gets 30-40fps with everything on high; an extra 5-10fps would be useful, and 35-50 would be nicer and possibly noticeable.
Hence my turning to something that can actually give a measurable result. I'm trying to understand why one 8800GTX could score around 5500 and another around 4800. What is it about my setup that determines this? Let's just say it's my need to know.
Of course, I could be wrong: what are other 8800GTX users with an E6600 at stock (or similar) getting?