Nvidia Shader Clock!

kaspro

Distinguished
Jun 1, 2010
Hi, I currently have a Radeon 5770 and I'm happy with it, especially since it outperforms the GeForce GTS 250 and the new GTS 450,
and it's even a close match for the GTX 260 and other cards, even though my card only has a 128-bit memory bus.
But what is the shader clock that's used in all Nvidia cards and doesn't seem to exist in ATI cards?!

4745454b

Titan
Moderator
The shader clock is there, but AMD runs it at the same clock as the rest of the core. That means as you OC the core, you're OCing the shaders as well. Nvidia's 8, 9, and GTX 2xx series allowed you to OC the shaders independently of the core. The new GTX 4xx series has moved so much out of the core and into the shader domain that you don't get to OC the shaders independently anymore. The shaders now run at 2x the speed of what would be the core. Nvidia now calls it the half clock or uncore or something like that. Can't remember right now.
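To put that clock-domain point in numbers, here's a rough back-of-the-envelope sketch (mine, not from the thread). It uses the common simplification of 2 FLOPs (one multiply-add) per shader per clock and ignores Nvidia's extra dual-issue MUL; the shader counts and clocks are the public reference specs for the two cards mentioned.

```python
# Rough sketch: theoretical shader throughput under the two clock schemes.
# Assumes MAD = 2 FLOPs per shader per clock (a simplification).

def gflops(shader_count, shader_clock_mhz, flops_per_clock=2):
    """Theoretical single-precision throughput in GFLOPS."""
    return shader_count * shader_clock_mhz * flops_per_clock / 1000.0

# AMD: shader clock == core clock (one clock domain).
# Radeon HD 5770 reference: 800 shaders @ 850 MHz core.
radeon_5770 = gflops(shader_count=800, shader_clock_mhz=850)

# Nvidia GTX 2xx: shaders run in their own, much faster clock domain.
# GTX 260 reference: 192 shaders @ 1242 MHz shader clock (576 MHz core).
gtx_260 = gflops(shader_count=192, shader_clock_mhz=1242)

print(f"HD 5770 : {radeon_5770:7.1f} GFLOPS")  # ~1360.0 GFLOPS
print(f"GTX 260 : {gtx_260:7.1f} GFLOPS")      # ~476.9 GFLOPS
```

The takeaway: the shader clock is just one input to throughput. AMD compensates for its single clock domain with far more (simpler) shaders, which is why the raw numbers, and the clock schemes behind them, aren't directly comparable across vendors.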

The important thing to remember is that it really doesn't matter. As you pointed out, your 5770 is faster than the GTS 250, 5750, or GTS 450. Who cares if you can't OC the shaders separately from the core? On the same note, who cares if it's a 128-bit bus? As long as it performs faster than X card, I don't care if it's powered by 3 hamsters.

kaspro

Distinguished
Jun 1, 2010
But one more question... which is better, having the shader clock run at the core clock, or at twice the core clock as on GeForce cards? And what exactly is the shader clock responsible for? :D