
Difference between... shader

What's the difference between shader speed and core speed? When does each come into play?
P.S. Can someone recommend a good book covering the basics, or an FAQ site with the nuts and bolts of hardware logic and algorithms?
  1. Shader speed is how fast (in MHz) the shaders operate. nVidia has fewer shaders, but they operate at a higher frequency, which one can overclock. ATI has more shaders, but they're at a slower (unchangeable) speed.
  2. Best answer
    Quote:
    ATI has more shaders, but they're at a slower (unchangeable) speed.

    Not exactly: the ATI shaders are linked to the core clock and speed up when the core is increased, whereas the nVidia shaders can run unlinked and be changed independently of the core.

    The core clock drives some functions at the multiprocessor level, like the instruction decoder, while the shader clock drives the individual processors. The shader clock sets the speed of arithmetic operations on those processors.
  3. Oh, so increasing the ATI core speed has a more pronounced effect than increasing the nVidia core speed?
  4. ^ Not exactly. The shader clock is an unusual thing. In some games increasing it helps, in others it doesn't (I'm talking about nVidia cards here), and it also depends on the card itself. For example, the 8800GT benefits from a bit of a shader increase in most games, whereas a shader increase on an 8800GTX doesn't really change much. As for ATI, since the shader and core clocks are linked, we can't separate the effect of a shader increase from a core increase.
  5. Best answer selected by P486.
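
The linked vs. unlinked behavior described in the answers above can be sketched as a tiny model. This is purely illustrative (the function name and clock values are made up, not vendor specifications): in the ATI-style design the shader domain simply tracks the core clock, while in the nVidia-style design the two clocks can be set independently.

```python
def effective_clocks(core_mhz, shader_mhz=None, linked=True):
    """Return (core, shader) clocks in MHz for a hypothetical card.

    linked=True  models the ATI-style design: the shader domain follows
                 the core clock, so raising the core raises the shaders too.
    linked=False models the nVidia-style design: the shader clock can be
                 changed independently of the core clock.
    """
    if linked:
        # Shaders run in the same clock domain as the core.
        return core_mhz, core_mhz
    # Separate domains: shader clock set on its own.
    return core_mhz, shader_mhz

# "Overclocking" the core on a linked (ATI-style) card raises both:
print(effective_clocks(750))                      # (750, 750)
# On an unlinked (nVidia-style) card the shader clock is independent:
print(effective_clocks(600, 1500, linked=False))  # (600, 1500)
```

This also shows why, as answer 4 notes, you can't isolate a shader-only speedup on an ATI-style card: there is no way to move the shader clock without moving the core clock.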