Difference between... shader

P486

Distinguished
Apr 2, 2010
What's the difference between shader speed and core speed? When does each come into play?
P.S. Can someone recommend a good book with basic material, or an FAQ site covering the nuts and bolts of hardware logic and algorithms?
 
Solution
ATI has more shaders, but they're at a slower (unchangeable) speed.
Not exactly. The ATI shaders are linked to the core clock and increase when the core is increased, whereas the nVidia shaders can be run unlinked and changed independently of the core.

The core clock runs some functions at the multiprocessor level, like the instruction decoder, while the shader clock runs the individual stream processors. The shader clock sets the speed of the arithmetic operations those processors perform.
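If you're curious what clock the driver actually reports for those stream processors, here's a minimal sketch using the standard CUDA runtime API (just an illustration; it assumes the CUDA toolkit is installed, and on cards of that generation the reported clockRate generally corresponds to the shader clock rather than the core clock):

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("No CUDA device found\n");
        return 1;
    }
    // clockRate is reported in kHz
    printf("%s: %d multiprocessors, clock %.0f MHz\n",
           prop.name, prop.multiProcessorCount, prop.clockRate / 1000.0);
    return 0;
}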
^ Not exactly. The shader clock is an unusual thing. For some games, raising the shader clock helps; in other cases it doesn't (I'm talking about nVidia cards here). It also depends on the card itself: for example, the 8800GT benefits from a bit of a shader increase in most games, whereas a shader increase on an 8800GTX doesn't really change much. As for ATI, since the shader and core clocks are linked, we can't separate the effect of a shader increase from a core increase.
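If you want to see the effect outside of games, a quick microbenchmark makes it obvious: a kernel that does nothing but dependent multiply-adds should speed up almost in step with the shader clock, while anything memory-bound barely moves. A rough CUDA sketch, where the kernel name, grid size, and loop count are arbitrary choices for illustration:

#include <cstdio>
#include <cuda_runtime.h>

// Purely arithmetic kernel: no memory traffic inside the loop, so its
// runtime should track the shader clock rather than memory speed.
__global__ void alu_bound(float* out, int iters) {
    float v = threadIdx.x * 0.001f;
    for (int i = 0; i < iters; ++i)
        v = v * 1.000001f + 0.000001f;   // dependent multiply-adds
    out[blockIdx.x * blockDim.x + threadIdx.x] = v;  // keep the compiler from optimizing it away
}

int main() {
    const int blocks = 128, threads = 256;
    float* d_out;
    cudaMalloc((void**)&d_out, blocks * threads * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    alu_bound<<<blocks, threads>>>(d_out, 1 << 20);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("ALU-bound kernel: %.2f ms\n", ms);

    cudaFree(d_out);
    return 0;
}

Run it at stock and again after a shader overclock and compare the times; a game that's limited by texturing or memory bandwidth won't scale the same way, which is why the benefit varies so much from card to card and game to game.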