My EVGA NVIDIA GTX 275 Superclocked (stock core clock: 648 MHz, memory: 1188 MHz, shader: 1458 MHz) is starting to show its age, so I have decided to overclock it. I have read a lot of articles over the past couple of weeks and have picked up several free tools such as GPU-Z, FurMark, 3DMark, etc.
I am currently using EVGA Precision to manage my clocks and fan speeds, and I have raised my clocks a small amount (core: 655 MHz, memory: 1200 MHz, shader: 1474 MHz). GPU-Z confirms the overclock on its Graphics Card tab. I then run a burn-in test in FurMark to check stability. While the test runs, I watch the GPU-Z sensors for the current core clock, and it still shows the stock 648 MHz. The two readings during the test disagree:
GPU-Z sensors: core: 648 MHz, memory: 1188 MHz
FurMark (GPU1): core: 655 MHz, memory: 1200 MHz
My question: is the current core clock measurement in GPU-Z incorrect, or is my card not actually being overclocked? Is there a way to fix the GPU core clock measurement?
I think I have answered my own question. It looks like my overclock was applied correctly and the sensor that GPU-Z uses just has limited sensitivity. I clocked up to a core of 660 MHz and ran another FurMark burn-in test. This time both Precision and GPU-Z measured a core clock of 663 MHz, so the sensor appears to have a differential sensitivity of roughly 6 MHz.
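One way to read these numbers is that the card can only generate core clocks in discrete steps, and the reported clock snaps to the nearest achievable step. The sketch below models this; the 648 MHz base comes from the question, but the ~15 MHz step size is my own assumption inferred from the two measurements (655 read back as 648, 660 read back as 663), not a documented spec for this card.

```python
# Hypothetical model: requested clocks snap to the nearest step the
# hardware can actually generate. Base and step are assumptions
# inferred from the measurements in this post, not documented values.
def effective_clock(requested_mhz, base_mhz=648, step_mhz=15):
    """Return the nearest achievable clock to the requested one."""
    steps = round((requested_mhz - base_mhz) / step_mhz)
    return base_mhz + steps * step_mhz

print(effective_clock(655))  # -> 648: too small a bump to reach the next step
print(effective_clock(660))  # -> 663: close enough to snap up to the next step
```

Under this model, a 7 MHz bump (648 to 655) is closer to the current step than to the next one, so the measured clock does not move at all, which matches what GPU-Z showed during the first test.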