Solved

Difference between... shader

April 3, 2010 4:56:54 PM

What's the difference between shader speed and core speed, and when does each come into play?
P.S. Can someone recommend a good book with basic material, or a FAQ site covering the nuts and bolts of hardware logic and algorithms?


April 3, 2010 5:26:02 PM

Shader speed is how fast (in MHz) the shaders operate. nVidia cards have fewer shaders, but they operate at a higher frequency, which you can overclock. ATI cards have more shaders, but they run at a slower (unchangeable) speed.
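To put rough numbers on that trade-off, here's a quick back-of-the-envelope sketch (my own illustration, not from the posts above): peak shader throughput is roughly shader count × shader clock × operations per clock, and the card figures below are approximate, era-typical specs used only as an example.

    # Rough sketch: peak shader throughput = shaders * shader clock * ops per clock.
    # The card specs below are approximate, era-typical figures used only to
    # illustrate the "fewer fast shaders vs. more slow shaders" trade-off.

    def peak_gflops(shader_count, shader_clock_mhz, ops_per_clock):
        """Theoretical peak arithmetic throughput in GFLOPS."""
        return shader_count * shader_clock_mhz * ops_per_clock / 1000.0

    # nVidia-style: fewer shaders, high separately-clocked shader frequency
    print(peak_gflops(240, 1296, 3))   # roughly a GTX 280: ~933 GFLOPS

    # ATI-style: many more shaders, running at the (lower) core speed
    print(peak_gflops(800, 750, 2))    # roughly an HD 4870: ~1200 GFLOPS

So the two designs can land in the same ballpark of theoretical throughput, just with different knobs to turn.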

Best solution

April 3, 2010 6:19:04 PM

Quote:
ATI cards have more shaders, but they run at a slower (unchangeable) speed.

Not exactly. The ATI shaders are linked to the core clock and increase when the core is increased, whereas the nVidia shaders can be run unlinked and changed independently of the core.

The core clock runs some functions on the multiprocessor level, like the instruction decoder, and the shader clock runs the individual processors. The shader clock sets the speed of arithmetic operations by the processor.
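As a minimal sketch of what "linked" versus "unlinked" means here (purely illustrative, not how any overclocking tool actually exposes it), the ATI side effectively has one knob, while the nVidia side has two:

    # Minimal sketch of the linked vs. unlinked clock domains described above.
    # Purely illustrative; real overclocking tools expose this very differently.

    class ATICard:
        # Shaders share the core clock domain, so there is only one knob:
        # raising the core clock raises the shaders with it.
        def __init__(self, core_mhz):
            self.core_mhz = core_mhz

        @property
        def shader_mhz(self):
            return self.core_mhz          # always identical to the core clock

    class NvidiaCard:
        # Core and shader domains are separate; they can be overclocked
        # independently, or kept at a fixed ratio ("linked" mode).
        def __init__(self, core_mhz, shader_mhz):
            self.core_mhz = core_mhz
            self.shader_mhz = shader_mhz

    ati = ATICard(core_mhz=750)
    ati.core_mhz += 50                    # shader_mhz rises to 800 automatically

    nv = NvidiaCard(core_mhz=600, shader_mhz=1500)
    nv.shader_mhz += 100                  # core stays at 600; domains are unlinked
    print(ati.shader_mhz, nv.core_mhz, nv.shader_mhz)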
April 3, 2010 6:50:03 PM

Oh, so increasing the ATI core speed has a more pronounced effect than increasing the nVidia core speed?
April 3, 2010 8:09:08 PM

^ Not exactly. The shader clock is an unusual thing. For some games, increasing the shader clock helps; in other cases it doesn't (I'm talking about nVidia cards here). It also depends on the card itself: for example, the 8800GT benefits from a modest shader increase in most games, whereas a shader increase on an 8800GTX doesn't really change much. As for ATI, since the shader and core clocks are linked, we can't separate the effect of a shader increase from that of a core increase.
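One hedged way to picture that (my own toy model with made-up fractions, not measurements from these cards): if only part of the frame time is limited by shader arithmetic, then raising the shader clock only shrinks that part, which is why one card can gain noticeably while another barely moves.

    # Back-of-the-envelope model: only the shader-limited portion of frame time
    # scales with the shader clock; the rest (texturing, ROPs, memory, CPU) does not.
    # The fractions below are made-up numbers for illustration.

    def fps_after_shader_oc(base_fps, shader_bound_fraction, clock_scale):
        """Estimate new FPS when the shader clock is scaled by clock_scale."""
        frame_time = 1.0 / base_fps
        shader_part = frame_time * shader_bound_fraction / clock_scale
        other_part = frame_time * (1.0 - shader_bound_fraction)
        return 1.0 / (shader_part + other_part)

    # A heavily shader-bound game sees most of a 10% shader overclock...
    print(fps_after_shader_oc(60, shader_bound_fraction=0.8, clock_scale=1.10))
    # ...while a game limited elsewhere barely moves.
    print(fps_after_shader_oc(60, shader_bound_fraction=0.2, clock_scale=1.10))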
April 4, 2010 9:31:18 AM

Best answer selected by P486.