
8800GTS G92 overclocking: Your speed and gains

Last response: in Graphics & Displays
February 6, 2008 11:19:48 AM

So, all of you 8800GTS G92 owners and overclockers: what have you gotten out of your OCs, and what speeds are you running at? I am just curious. So far I have only compared 3DMark05 results, not actual games. I haven't had time yet.

Stock 650MHz GPU I get around 19,100
OC to 725MHz I get about 19,300
OC to 750MHz I get about the same

What gives? Or is 3DMark05 too old to really show the glory of the overclock? Thoughts?
February 6, 2008 4:43:28 PM

I OC'd mine to something around 780/1200 and it was stable, but it gave almost no increase in performance (1-2 fps in Crysis, 5-7 in other games, ~150 3DMark06 points). So I just put it back to EVGA's stock clocks. Wasn't worth it to me.
February 6, 2008 4:58:49 PM

I agree. I'm running at 750/1050 with no real performance increase that I can detect.
February 6, 2008 5:57:45 PM

This is what I am noticing. I have the XFX vanilla card at 650/792. Does this mean the XFX XXX card that costs $50 more and runs 678/792 really isn't performing much better than the stock-clocked cards?

My X1900XTX really showed some performance gains when OCed, and I really noticed the differences. With this one I am not. That is why I asked.
February 6, 2008 6:15:43 PM

I think so. I wouldn't spend extra for an OC edition of this card. I imagine the 678/792 OC would be reliable on any reference card, no matter how poor the batch is.

I am running the EVGA version with stock clocks, btw.

Now, I DID get the card all the way up to 800/1200 and noticed Crysis played better, for about a minute, then it locked up. The temps were fine, but I don't think the memory liked running at an effective 2400MHz.
February 6, 2008 6:49:07 PM

I also got the card to run at 800/972 and got through 3DMark05. The results went from 19,110 stock to 19,550, so a little over 400 points for a clock that isn't even stable without mad cooling. I ran the ATITool artifact test and got tons of dots. It wasn't stable in any games either.

So 440 points in 3DMark05 takes a 150MHz overclock to achieve? I'm not sold on it. I thought there was something wrong with what I was doing, but I guess not.
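For anyone double-checking these results, a quick back-of-the-envelope calculation (a sketch in Python, using the clocks and scores quoted above) shows how lopsided the return is:

```python
# Compare relative core-clock gain to relative 3DMark05 score gain,
# using the numbers reported in this thread.
stock_clock, oc_clock = 650, 800          # core MHz
stock_score, oc_score = 19110, 19550      # 3DMark05 points

clock_gain = (oc_clock - stock_clock) / stock_clock * 100
score_gain = (oc_score - stock_score) / stock_score * 100

print(f"core clock up {clock_gain:.1f}%, score up only {score_gain:.1f}%")
# core clock up 23.1%, score up only 2.3%
```

A roughly 23% core overclock buying about 2% more points is consistent with the bottleneck being somewhere other than the core clock.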
February 6, 2008 7:05:15 PM

Perhaps other limitations in the OS or other hardware are a factor in how much gain we see?

How much performance gain did people see from overclocking their 8800 Ultra?
February 6, 2008 7:08:41 PM

Use 3DMark06.
February 6, 2008 7:20:40 PM

ZOldDude said:
Use 3DMark06.


I only got this yesterday and was messing around. I ran 06 over lunch and got 12,xxx something. I will try the OC and see what it does with that. It sounds like people only get a frame or so more out of games, so is it really a benefit?
February 6, 2008 7:53:29 PM

30fps is better than 29fps.....

so there is a benefit. ;) 
February 6, 2008 9:41:38 PM

rallyimprezive said:
30fps is better than 29fps.....

so there is a benefit. ;) 


But is it worth such a heavy OC to get one frame?

BTW, I tested with 3DMark06:

stock = 12,400
750/1000 = 13,100

There is a slight gain there, but I want to play at stock for now to get a feel for what games run at.
February 7, 2008 1:40:16 AM

Yeah, I only improve slightly, too, in 3dMark. I have noticed a difference in games though.
February 7, 2008 10:52:11 AM

rallyimprezive said:
perhaps other limitations in the OS or other hardware is a factor in how much gain we see?

How much performance gain did people see from overclocking their 8800 Ultra?

Nail head hit on you have.
February 7, 2008 11:06:13 AM

I wonder if I have a PCI-E 2.0 Mobo if the overclock would give me more... I will run stock for a little with the OSD and then overclock later on and see if I notice improvement. I don't care if 3Dmark gains just experience in games.
February 7, 2008 11:29:47 AM

jay2tall said:
I wonder if I have a PCI-E 2.0 Mobo if the overclock would give me more... I will run stock for a little with the OSD and then overclock later on and see if I notice improvement. I don't care if 3Dmark gains just experience in games.

Your 'lowly' (I can't believe I called it that! :o ) e6400 with its 'slow' and 'old' 1066 FSB could also be a fly in this particular ointment. Have you tried running 3DMark with the CPU at standard clocks? And if so, what were the results? [:mousemonkey:2]
February 7, 2008 12:00:48 PM

Mousemonkey said:
Your 'lowly' (I can't believe I called it that! :o ) e6400 with its 'slow' and 'old' 1066 FSB could also be a fly in this particular ointment. Have you tried running 3DMark with the CPU at standard clocks? And if so, what were the results? [:mousemonkey:2]


Dude, don't be mocking the e6400. I blast the pants off all the other dudes I know with their stock e6600s, e6800s and whatnot at the LAN party I go to, haha. If I remember correctly, ALL stock with my old video card I ran low 11,xxx something in 3DMark05. Then when I OCed the e6400 I was hitting high 12,xxx to 13,000. Then with the video card I was hovering just below or right on 14,000. However, in games I definitely noticed a difference in FPS. I tried it at stock because of the power savings at idle, but when I cranked a game on, it just didn't do what it could OCed.

I also know my chipset is memory-limited; the P35 chipset would be better than the 975X. And I have the Allendale core with 2MB cache instead of 4MB.
February 7, 2008 12:01:50 PM

rallyimprezive said:
I agree, Im running at 750/1050 with no real performance increase that I can detect.


The reason why most of you are not seeing a big difference is the lack of ROPs. I own an 8800GTX overclocked to a 650MHz core from a stock of 575MHz (check the rest in my signature), and I notice huge gains in Crysis. Not only did I get 15 FPS more in Crysis, the game just ran much smoother than without the OC. The ROPs at the end of the silicon (G92 with 16 ROPs) are just a chokehold on scaling performance. I know that the G80 with its full 24 ROPs will see a larger gain when overclocking because the extra ROPs aren't holding it back. :)
February 7, 2008 12:09:36 PM

systemlord said:
The reason why most of you are not seeing a big difference is the lack of ROPs. I own an 8800GTX overclocked to a 650MHz core from a stock of 575MHz (check the rest in my signature), and I notice huge gains in Crysis. Not only did I get 15 FPS more in Crysis, the game just ran much smoother than without the OC. The ROPs at the end of the silicon (G92 with 16 ROPs) are just a chokehold on scaling performance. I know that the G80 with its full 24 ROPs will see a larger gain when overclocking because the extra ROPs aren't holding it back. :)


And oddly enough, where on most G80s increasing shader power was more effective, I've found overclocking the ROP area of the G92 to be more effective.

Unfortunately, you are right though. The G92 is a beast of a card being kept in a cage by a lack of ROPs...

All I wanted was a G92-based card with a 512-bit memory interface, 1GB GDDR3, and 24 ROPs. It might be too much to ask...

RV770 anyone?
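To put the ROP argument in rough numbers: theoretical pixel fill rate is ROP count times core clock. A sketch using the stock specs quoted in this thread (16 ROPs at 650MHz for the G92 GTS 512, 24 ROPs at 575MHz for the G80 GTX):

```python
# Theoretical pixel fill rate = ROP count x core clock.
# Clock/ROP figures are the stock specs quoted in this thread; illustrative only.
def fill_rate_gpix(rops, core_mhz):
    """Peak fill rate in gigapixels per second."""
    return rops * core_mhz / 1000.0

g92_gts = fill_rate_gpix(16, 650)   # 8800GTS 512 (G92)
g80_gtx = fill_rate_gpix(24, 575)   # 8800GTX (G80)

print(f"G92 GTS: {g92_gts:.1f} Gpix/s, G80 GTX: {g80_gtx:.1f} Gpix/s")
# G92 GTS: 10.4 Gpix/s, G80 GTX: 13.8 Gpix/s
```

Even at a lower clock, the GTX's extra ROPs give it more raw fill rate, which is why the G92 leans so hard on core clock to keep up.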
February 7, 2008 12:13:20 PM

rallyimprezive said:
Now, I DID get the card all the way up to 800/1200 and noticed Crysis played better, for about 1 min, then it locked up. The temps were fine, but I dont think the memory liked running at an effective 2400mhz.


In my experience you get artifacts from overclocking VGA memory well before lock ups, so try bringing down the processor clock and see what happens.
February 7, 2008 1:14:01 PM

ethel said:
In my experience you get artifacts from overclocking VGA memory well before lock ups, so try bringing down the processor clock and see what happens.

What are the signs that the GPU is overclocked too much? Same for memory? My guess from your comment, to generalize: excessive GPU overclocking pretty much locks up right away, but excessive graphics memory overclocking will display artifacts for multiple seconds or more before it locks up?
February 7, 2008 1:15:22 PM

Thanks for the tip.

OK, noob time: what are ROPs and how do I OC them? :(
February 7, 2008 2:08:33 PM

rallyimprezive said:
Thanks for the tip.

Ok noob time, what are ROPS and how do I OC them? :( 


With RivaTuner, unlink the shader clock from the core clock. The core clock is the ROP domain.
February 7, 2008 2:31:13 PM

I guess since my card only has 16 ROPs and the GTX has 24, as you increase the core speed the GTX pulls further ahead of the 8800GTS. I understand now. Why didn't they put 24 in the new card? It would be a monster then. I guess they are saving room for a new GTX.
February 7, 2008 2:31:18 PM

cnumartyr said:
With RivaTuner, unlink the shader clock from the core clock. The core clock is the ROP domain.



Can you go into more detail on what a ROP actually is? I hate not having a full understanding, far beyond what I actually need.

I searched for articles on Google and couldn't find anything.

If I at least know what it stands for, I can dig deeper.
February 7, 2008 10:44:13 PM



This link really clears things up. Another way of looking at it, for the simple folk: imagine a turbocharged V8 with lots of intake volume but a tiny exhaust pipe, to the point where the motor cannot get air out of the tailpipe fast enough. The G92 has a smaller pipe, while the G80 GTX has a bigger pipe, making more power with bigger gains.
February 8, 2008 10:55:04 AM

systemlord said:
This link really clears things up. Another way of looking at it, for the simple folk: imagine a turbocharged V8 with lots of intake volume but a tiny exhaust pipe, to the point where the motor cannot get air out of the tailpipe fast enough. The G92 has a smaller pipe, while the G80 GTX has a bigger pipe, making more power with bigger gains.


Who in their right mind turbocharges a V8? Haha, OK, besides diesels. Or Lingenfelter Corvettes. I get the point though. It does make sense. I just wish Nvidia hadn't engineered it this way. It doesn't leave much room for OCing and is sort of frustrating. So basically all of those XXX and Extreme OCed GTSes aren't worth beans.

Regardless, it is still a wicked card. I was glad I got it and didn't need to get a new power supply to accommodate it. That was a real selling point to me. I really am learning a lot from this thread, from everyone and the directions you are pointing me. I like to know everything about my PC and its abilities... I feel bad for my buddy: he bought 3 GTSes for 3-way SLI and just learned he needs Vista to do this. He was grumbling, saying he should have just gotten 2 GTSes.
February 9, 2008 5:53:51 AM

jay2tall said:
Who in their right mind turbocharges a V8? Haha, OK, besides diesels. Or Lingenfelter Corvettes. I get the point though. It does make sense. I just wish Nvidia hadn't engineered it this way. It doesn't leave much room for OCing and is sort of frustrating. So basically all of those XXX and Extreme OCed GTSes aren't worth beans.

Regardless, it is still a wicked card. I was glad I got it and didn't need to get a new power supply to accommodate it. That was a real selling point to me. I really am learning a lot from this thread, from everyone and the directions you are pointing me. I like to know everything about my PC and its abilities... I feel bad for my buddy: he bought 3 GTSes for 3-way SLI and just learned he needs Vista to do this. He was grumbling, saying he should have just gotten 2 GTSes.


Ford and Chevy supercharge their full-size trucks; my friend has one (a Ford diesel) and all I see when he peels away is smoke, with lots of rubber on the surface. I haven't been paying attention to triple SLI, but are you telling us we have to have Vista to use triple SLI? If so :ouch: .

Here's something to think about: how long do you guys think it will be before we start seeing the PCI-E 2.0 spec actually being fully utilised, in both wattage and bandwidth?
February 9, 2008 10:11:15 AM

I said except for diesels... YES, you have to have Vista for Tri-SLI, and unless you run it at super high resolution it actually performs worse. It's so powerful that it actually runs badly unless you use the power. That, and you need a 1200W PSU.

I saw a benchmark somewhere showing that the ATI 38xx series cards see an increase when using PCI-E 2.0, so I wonder if Nvidia gets a similar effect.
February 9, 2008 10:21:53 AM

That OCing result makes me want to cry. Hopefully I can crank some good numbers out of the 3850 512MB when OCing it. I increased my 2900 Pro from 9,800 in 3DMark06 to 11k using 850/2000, as opposed to 507/1000 stock. God, I love overclocking.
February 9, 2008 10:33:05 AM

If I remember correctly, the 2900 Pro actually used the same chip as a higher card and would OC well. The 3850 is a great card for the money. I'm not a fan of the single-slot coolers, so be leery of cooling. I didn't go with ATI even though I have a CrossFire board, because it would only be (2) 8x PCI-E slots when in CF. I'll probably go for an SLI mobo next time.
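On the x8-in-CrossFire worry and the PCI-E 2.0 question above, the theoretical per-direction bandwidths work out as follows (a sketch, assuming the usual rounded figures of 250MB/s per lane for PCI-E 1.x and 500MB/s per lane for 2.0):

```python
# Theoretical per-direction PCI Express bandwidth, rounded figures.
def pcie_bw_gb(lanes, gen):
    """Bandwidth in GB/s: 0.25 GB/s per lane for gen 1.x, 0.5 for gen 2.0."""
    per_lane = {1: 0.25, 2: 0.5}[gen]
    return lanes * per_lane

print(pcie_bw_gb(16, 1))  # 4.0  (PCI-E 1.x x16)
print(pcie_bw_gb(16, 2))  # 8.0  (PCI-E 2.0 x16)
print(pcie_bw_gb(8, 1))   # 2.0  (x8 slot when running CrossFire on older boards)
```

An x8 1.x slot still offers 2GB/s each way, which may explain why halved lanes often cost single cards of this era very little in practice.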
February 11, 2008 3:52:11 AM

jay2tall said:
If I remember correctly, the 2900 Pro actually used the same chip as a higher card and would OC well. The 3850 is a great card for the money. I'm not a fan of the single-slot coolers, so be leery of cooling. I didn't go with ATI even though I have a CrossFire board, because it would only be (2) 8x PCI-E slots when in CF. I'll probably go for an SLI mobo next time.


Yeah I remember when the 2900Pro came out and one review site was able to OC to 2900XT levels. I did the same with my 8800GTX now at Ultra levels. My Shader can safely hit 1600MHz and my memory can also hit 2100MHz, but I want to be on the safe side.
February 18, 2008 8:28:02 PM

Wow, you're all having better luck with the 8800GTS 512 than I am. I got a pathetic 8k in 3DMark06 with an 8800GTS 512, Q6700, 8GB of PC6400 RAM, EVGA 780i mobo, and an 825W PC Power & Cooling PSU. I should be able to get 13k with nothing overclocked; dunno what the problem is.
February 18, 2008 8:41:18 PM

an update:

I just started playing Colin McRae: Dirt again.

I saw a marked gain in performance with this particular game. I have not done FPS testing, and I probably won't. But even my wife noticed it, saying, "Wow, that looks way better." And she doesn't give a rip about most of the games I play.

So it's subjective, I know, but I noticed it, and that's what matters, right?
February 18, 2008 8:43:45 PM

flank2 said:
Wow, you're all having better luck with the 8800GTS 512 than I am. I got a pathetic 8k in 3DMark06 with an 8800GTS 512, Q6700, 8GB of PC6400 RAM, EVGA 780i mobo, and an 825W PC Power & Cooling PSU. I should be able to get 13k with nothing overclocked; dunno what the problem is.



That does seem low. Mind if I ask a few questions?

Have you done any troubleshooting?
What version of Windows?
Which driver version?
Did you do a clean install after removing the previous card?
Can you confirm memory clocks, FSB and CPU speed? Maybe something is set wrong in BIOS?

Kinda going OT by asking, but I'd be happy to help. :)
February 18, 2008 10:29:31 PM

I'm on this post as single-card champ, with Vista and my EVGA 8800GTS 512 @ 820/2008/2206 and 4GHz on my Q6600. I watched my score jump up when I touched my memory speed; it seems to be the limiter. I just got a new cooler on it today. I just bought a 780i too :) . I go for the extreme though. I stuck it outside in the cold for that score, lol. I'm half tempted to stick it back outside again and jack it up.

http://futuremark.yougamers.com/forum/showthread.php?t=...
February 18, 2008 10:36:18 PM

My rig before overclocking, a Q6600 at 2.4GHz and an 8800GTX at 576/1350/1800, made 12,600 in 3DMark06 under Vista 32-bit. So there is definitely something wrong with your PC. Try uninstalling the Nvidia drivers, then restart in Safe Mode and run Driver Sweeper from Guru3D. Then restart, install the latest Nvidia drivers, and see what happens.

As for G80 overclocking, it seems to gain from both ROP and shader clocks. G92 cards really need very high ROP clocks. I've had good results with G92 cards when I lowered the shader clock a bit in order to let the card withstand higher core clocks. However, I believe the G80 core really widens the gap over the G92 when paired with quad cores overclocked to 3.5GHz and higher.
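The ROP-vs-shader observations in this thread can be put into a toy bottleneck model (purely illustrative; the `perf` function and the capacity numbers are made up, not measurements):

```python
# Toy bottleneck model: throughput is capped by the slower stage,
# so only overclocking the limiting stage helps.
def perf(rop_capacity, shader_capacity):
    """Normalized throughput; 1.0 = stock. Values are illustrative."""
    return min(rop_capacity, shader_capacity)

# Assume a ROP-limited G92: the shaders have spare capacity at stock.
stock     = perf(1.0, 1.3)          # -> 1.0
shader_oc = perf(1.0, 1.3 * 1.15)   # +15% shader clock: still 1.0
core_oc   = perf(1.15, 1.3)         # +15% ROP/core clock: 1.15
```

Under this assumption, a shader overclock alone buys nothing, while the same percentage on the ROP domain shows up almost fully, which matches what people here are seeing on G92 cards.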
February 18, 2008 11:29:41 PM

I think you must OC your SPs (stream processors); they're the most important factor in performance, and the GPU core OC doesn't make as great an improvement. The SP frequency is about 2.x times the GPU frequency. If you increase the SP frequency, performance increases almost linearly. And there are tools that can unlock the SP clock, so......
February 19, 2008 12:38:35 AM

Show us the tool. And what is SP? OK, I'm going to push my new rig; I'll be back with my score, and we will see if the 2.0 makes a difference or not.