[Help!] With Overclocking 8500GT

January 23, 2008 9:15:27 PM

Well, my computer is now in an Antec Nine Hundred with 5 fans, which brought my GPU down from 80°C to 35°C.

My graphics card shipped with a 500 MHz core clock and a 333 MHz memory clock. So far I've overclocked it from 500 MHz to 555 MHz and from 333 MHz to 520 MHz with no problems. I've been scanning for artifacts with ATITool and it hasn't found any, and I'm getting much faster FPS than usual. My graphics card is currently at 40°C and stable. Could I crank it up more?


January 23, 2008 9:19:20 PM

Yes you can, right up until your PC crashes or freezes during games.

You can have a little artifacting in ATITool; it usually won't show up in games.
January 23, 2008 9:22:21 PM

Well Evilonigiri, I'm getting no artifacts at all. What do you recommend I do right now?
January 23, 2008 9:26:02 PM

trihedral said:
Well Evilonigiri, I'm getting no artifacts at all. What do you recommend I do right now?

OC away! In increments of 10 MHz if you want to be cautious.
January 23, 2008 9:40:04 PM

All right, thanks. So is there a height I should be scared to go past? Like, the max possible is 1000 MHz, which I don't think I can reach. O.o

Oh yeah, I also wanted to ask: can I overclock the core and memory at the same time, or memory first?
January 23, 2008 9:43:19 PM

trihedral said:
All right, thanks. So is there a height I should be scared to go past? Like, the max possible is 1000 MHz, which I don't think I can reach. O.o

Oh yeah, I also wanted to ask: can I overclock the core and memory at the same time, or memory first?

Do one or the other first.

And before that: are the core clock and shader clock synced or not?
January 23, 2008 9:46:57 PM

I don't know what synced means. So far this is what I did: injected Coolbits, opened the NVIDIA Control Panel, and went to Performance. Selected Custom Clock Frequencies and just kept adjusting it up and up, watching for problems in 3DMark01 SE and 3DMark06. Frame rates have increased 30% or more. Temperature is 39-40°C and stable (checked with NVIDIA Monitor).

But what is the synced shader clock?
January 23, 2008 9:49:28 PM

What does syncing it do? BTW, I'm at 560 MHz core and 540 MHz memory now with no issues. One sec, downloading RivaTuner.
January 23, 2008 9:51:05 PM

trihedral said:
I don't know what synced means. So far this is what I did: injected Coolbits, opened the NVIDIA Control Panel, and went to Performance. Selected Custom Clock Frequencies and just kept adjusting it up and up, watching for problems in 3DMark01 SE and 3DMark06. Frame rates have increased 30% or more. Temperature is 39-40°C and stable (checked with NVIDIA Monitor).

But what is the synced shader clock?

I'm gonna help you maximize all the clock speeds, assuming temp isn't a problem.

I should have said link instead of synced. My bad.

When you increase the core clock, the shader clock increases proportionally with it. Usually when you hit the "max" core clock speed, it's actually the shader clock's fault. So by unlinking them, you can OC them independently of each other, allowing you to go higher.
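The linked-clock behavior described here can be sketched in a few lines of Python. This is illustrative only: `LINK_RATIO` is an assumed 2:1 shader-to-core ratio (consistent with the roughly 10 MHz of shader per 5 MHz of core that trihedral measures later in the thread), and the real ratio is set by the card's BIOS, not by any code like this.

```python
# Illustrative sketch of linked vs. unlinked clocks, not a driver API.
# LINK_RATIO is an assumed 2:1 shader:core ratio; the real value is
# determined by the card's BIOS.

LINK_RATIO = 2.0

def linked_shader_clock(core_mhz: float) -> float:
    """With clocks linked, the shader clock tracks the core clock."""
    return core_mhz * LINK_RATIO

# Linked: raising the core from 500 to 555 MHz drags the shader up too,
# so an apparent "core" limit may really be the shader hitting its wall.
print(linked_shader_clock(500))   # 1000.0
print(linked_shader_clock(555))   # 1110.0

# Unlinked (RivaTuner's option): each clock is set on its own, so the
# core can keep climbing after the shader has stopped.
core_mhz, shader_mhz = 575, 1150
```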
January 23, 2008 9:54:30 PM

Oh, I get it. It's like the core clock can go higher but the shader can't, so you split them into two sections and max each one out alone. Correct?


Also, you told me on my old thread that my computer and graphics card were basically too hot to overclock, so I begged my dad to buy me a Nine Hundred case on sale at CompUSA. Haha, really appreciate the help.
January 23, 2008 9:57:05 PM

I have an issue with RivaTuner right now. When it finished loading and updating, a message popped up saying Unsupported Driver Detected.

Warning! RivaTuner has not been tested with the currently installed display driver.
It is strongly recommended to upgrade RivaTuner to a new version; otherwise it may not work properly, because the driver is not officially supported.

[ ] Don't show this warning until a new unsupported driver is installed

[ OK ]


What do i do now?
January 23, 2008 9:57:42 PM

trihedral said:
I have an issue with RivaTuner right now. When it finished loading and updating, a message popped up saying Unsupported Driver Detected.

Warning! RivaTuner has not been tested with the currently installed display driver.
It is strongly recommended to upgrade RivaTuner to a new version; otherwise it may not work properly, because the driver is not officially supported.

[ ] Don't show this warning until a new unsupported driver is installed

[ OK ]


What do i do now?

Go download the newest drivers from NVIDIA's website. ^^
January 23, 2008 9:58:48 PM

trihedral said:
Oh, I get it. It's like the core clock can go higher but the shader can't, so you split them into two sections and max each one out alone. Correct?


Also, you told me on my old thread that my computer and graphics card were basically too hot to overclock, so I begged my dad to buy me a Nine Hundred case on sale at CompUSA. Haha, really appreciate the help.

Indeed you are correct.
EDIT:You're catching on real fine. :) 
January 23, 2008 10:01:02 PM

For the graphics card? I have the latest driver for my graphics card right now, 169.25 for the GeForce 8500 GT, from www.xfxforce.com.

Could there be another problem?
January 23, 2008 10:03:23 PM

trihedral said:
For the graphics card? I have the latest driver for my graphics card right now, 169.25 for the GeForce 8500 GT, from www.xfxforce.com.

Could there be another problem?

Oh then you're fine. There won't be any issues.
January 23, 2008 10:06:52 PM

Alright, I've changed the values to 1, but I can't find where System Tweaks is in order to uncheck linked clocks. By the way, after I do that I can just overclock in RivaTuner instead of going into the NVIDIA Control Panel, right?
January 23, 2008 10:09:52 PM

trihedral said:
Alright, I've changed the values to 1, but I can't find where System Tweaks is in order to uncheck linked clocks. By the way, after I do that I can just overclock in RivaTuner instead of going into the NVIDIA Control Panel, right?

Yes, you'll be able to use RivaTuner to OC your vidcard. Just remember to hit save settings on startup.

So you did every step?
January 23, 2008 10:12:52 PM

When I click 'enable driver-level hardware overclocking', it makes me reboot my computer to detect the frequencies. Do I restart the computer?
January 23, 2008 10:13:23 PM

trihedral said:
When I click 'enable driver-level hardware overclocking', it makes me reboot my computer to detect the frequencies. Do I restart the computer?

Nope, you don't need to.
January 23, 2008 10:17:22 PM

Eww, just restarted my computer for no reason... oh well.
January 23, 2008 10:19:02 PM

Ahh, I'm so mad. When I overclock in the NVIDIA Control Panel and then restart my computer, even though I clicked Apply, it goes back to the factory-shipped defaults of 500 MHz and 333 MHz. UGH!
January 23, 2008 10:20:05 PM

trihedral said:
Ahh, I'm so mad. When I overclock in the NVIDIA Control Panel and then restart my computer, even though I clicked Apply, it goes back to the factory-shipped defaults of 500 MHz and 333 MHz. UGH!

There's no point in being mad. Just bring the clock speeds back up again. Takes 10 seconds.
January 23, 2008 10:20:49 PM

Also, what the heck, the memory clock only goes up to 500 MHz...
January 23, 2008 10:23:02 PM

trihedral said:
Also, what the heck, the memory clock only goes up to 500 MHz...

There is a way around it. You'd have to do a bunch of things, but here's the easy way out:

Use RivaTuner to OC the core and shader.

Use nTune to OC the memory.
January 23, 2008 10:23:03 PM

How come when I set it, it doesn't show up in NVIDIA Monitor?
January 23, 2008 10:23:33 PM

trihedral said:
How come when I set it, it doesn't show up in NVIDIA Monitor?

Did you hit apply?
January 23, 2008 10:25:31 PM

So what I should do right now is overclock the core and shader and leave the memory clock at 333 MHz, then later go into nTune and clock the memory from there?
January 23, 2008 10:28:28 PM

trihedral said:
So what I should do right now is overclock the core and shader and leave the memory clock at 333 MHz, then later go into nTune and clock the memory from there?

Yeah. If that doesn't work, do it the other way around. I'm looking for ways to bypass the limit atm.
January 23, 2008 10:31:55 PM

I don't think RivaTuner works on my graphics card; no matter how much I tune it up and down, it stays at 500 and 333.
January 23, 2008 10:43:58 PM

The download prompt for 2.06 keeps coming up. Plus, the 2.0 version doesn't have a shader + core split.
January 23, 2008 10:45:44 PM

Never mind, it's working now after I transferred the NVIDIA Control Panel settings over to RivaTuner.
January 23, 2008 10:47:06 PM

trihedral said:
Never mind, it's working now after I transferred the NVIDIA Control Panel settings over to RivaTuner.

So everything is working properly now?
January 23, 2008 10:48:33 PM

Can you tell me the difference between the shader clock, the core clock, and the memory clock?
January 23, 2008 10:52:13 PM

trihedral said:
Can you tell me the difference between the shader clock, the core clock, and the memory clock?

Not very well, but here it goes...

Higher core clock helps in textures and stuff.

Higher memory clock increases bandwidth.

Higher shader clock helps in...well shadows. SM 3.0 stuff.

I'm not entirely sure that I am correct, but that's the gist of it.
January 23, 2008 10:54:25 PM

Haha, okay. Well, I can now control the memory clock with nTune and the shader and core clocks with RivaTuner. So basically all I do right now is overclock as much as I want, and when there's a problem, go back to the best setting that worked?
January 23, 2008 10:56:15 PM

trihedral said:
Haha, okay. Well, I can now control the memory clock with nTune and the shader and core clocks with RivaTuner. So basically all I do right now is overclock as much as I want, and when there's a problem, go back to the best setting that worked?

Yes. Use ATITool to scan for artifacts. Just don't use its OCing feature.

I would start out with the core clock, then the shader clock, and finally the memory clock.
January 23, 2008 10:57:06 PM

Oh, and I'd like to add: remember to use RivaTuner's GPU monitor feature.
January 23, 2008 11:02:56 PM

Yeah, ATITool is getting me around 178 FPS, and so far I've set the core to 600 MHz, the shader to 1200 MHz, and the memory to 550 MHz. And it's doing pretty well.
January 23, 2008 11:06:34 PM

trihedral said:
Yeah, ATITool is getting me around 178 FPS, and so far I've set the core to 600 MHz, the shader to 1200 MHz, and the memory to 550 MHz. And it's doing pretty well.

Hit 'view 3D window' and watch your temps climb. If you can watch the window for 2 minutes (my guess) without freezing/locking up, the clocks are fine.

If you want to be cautious, play some games, or use 3DMark and leave it on loop.
January 23, 2008 11:08:57 PM

I don't think my GPU is doing so good right now. The screen blinks for a second, and a message comes up at the bottom left saying the graphics driver has stopped working and has recovered, so I set it back to 500 MHz and 333 MHz. Do I save it at the point just below where it stops working?
January 23, 2008 11:11:28 PM

trihedral said:
I don't think my GPU is doing so good right now. The screen blinks for a second, and a message comes up at the bottom left saying the graphics driver has stopped working and has recovered, so I set it back to 500 MHz and 333 MHz. Do I save it at the point just below where it stops working?

You can say that.

Test it out in some games or 3DMark. If it freezes/artifacts, lower the clocks.

EDIT: If an OC fails, restart your PC. And don't hit save settings on startup.
January 23, 2008 11:39:52 PM

Well, now I get what you meant when you said to clock them one at a time. Clocking them at the same time leaves you unsure which one was the problem. I clocked all of them at the same time, and my screen froze while watching 3DMark. So now I'm taking the core from 570 and going up 10 MHz at a time. When it freezes, I'll go back to the last setting that didn't crash and call that the stable point.

Example:
570, clock to 580 (no problem), clock to 590 (no problem), clock to 600 (freeze). Then I'd restart the computer, save the core clock as 590 (the last point where it didn't freeze), then move on to the shader clock and memory, etc.

Is that a good plan?
January 23, 2008 11:41:37 PM

trihedral said:
Well, now I get what you meant when you said to clock them one at a time. Clocking them at the same time leaves you unsure which one was the problem. I clocked all of them at the same time, and my screen froze while watching 3DMark. So now I'm taking the core from 570 and going up 10 MHz at a time. When it freezes, I'll go back to the last setting that didn't crash and call that the stable point.

Example:
570, clock to 580 (no problem), clock to 590 (no problem), clock to 600 (freeze). Then I'd restart the computer, save the core clock as 590 (the last point where it didn't freeze), then move on to the shader clock and memory, etc.

Is that a good plan?

Excellent! I'm glad you're understanding all the procedures here. :bounce: 

So 590 MHz is the highest? Not bad, considering there's no fan.

EDIT:Nice avatar btw
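The plan described in the quoted post (raise one clock 10 MHz at a time, back off to the last value that didn't freeze, then move on to the next clock) can be written as a small loop. This is a sketch only; `is_stable` is a hypothetical stand-in for a real stress test such as an ATITool artifact scan or a looped 3DMark run.

```python
# Sketch of the one-clock-at-a-time stepping procedure from this thread.
STEP_MHZ = 10  # cautious increment, as suggested earlier in the thread

def find_max_stable(start_mhz, limit_mhz, is_stable):
    """Raise one clock in 10 MHz steps and return the last stable value.

    is_stable is a hypothetical callback standing in for a real test
    (ATITool artifact scan, looped 3DMark, playing games).
    """
    best = start_mhz
    for clock in range(start_mhz + STEP_MHZ, limit_mhz + 1, STEP_MHZ):
        if not is_stable(clock):
            break  # freeze or artifacts: keep the previous value
        best = clock
    return best

# Example from the post: stable at 580 and 590, freezes at 600.
fake_test = lambda mhz: mhz < 600
print(find_max_stable(570, 1000, fake_test))  # 590

# Repeat per clock, one at a time: core first, then shader, then memory.
```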
January 23, 2008 11:45:27 PM

Amazing what Flash can do lol.

Actually, 575 is the max stable, because when I tried going to 580 and hit Apply then OK, it dropped back down to 575. Probably because I also need to increase the shader in order to push the core further.
January 23, 2008 11:50:05 PM

trihedral said:
Amazing what Flash can do lol.

Actually, 575 is the max stable, because when I tried going to 580 and hit Apply then OK, it dropped back down to 575. Probably because I also need to increase the shader in order to push the core further.

There is one more way to increase your clock speed. It may not work with all vidcards, and it has the potential to screw up the SATA ports.

It's called the PCIe frequency. Default is 100 MHz. Increasing it will increase CPU speeds very slightly and increase PCIe bandwidth. Setting it too high will cause issues with the SATA ports (the thing the HDD is connected to).

If you want to risk it, set the PCIe frequency to 115 MHz and perhaps that'll help your OC more.
January 24, 2008 12:00:55 AM

Nah, I'm not going to mess myself up that bad lol.
January 24, 2008 12:02:56 AM

So far I've found that each 5 MHz of core needs 10 MHz of shader. That's the reason my driver stopped working: I brought the shader up 200 MHz without the core.
January 24, 2008 12:04:52 AM

trihedral said:
So far I've found that each 5 MHz of core needs 10 MHz of shader. That's the reason my driver stopped working: I brought the shader up 200 MHz without the core.

That's probably why you start with core first.

I may have scared you too much, but 115 MHz is considered safe.