Overclocking vs. GPU-Z current core clock measurement

My EVGA NVIDIA GTX 275 Superclocked (stock core clock: 648 MHz, memory: 1188 MHz, shader: 1458 MHz) is starting to show its age, so I have started overclocking it. I have read a lot of articles over the past couple of weeks and have picked up a lot of free software such as GPU-Z, FurMark, 3DMark, etc.

I am currently using EVGA Precision to manage my clocks and fan speeds and have boosted the clocks a small amount (core: 655 MHz, memory: 1200 MHz, shader: 1474 MHz). GPU-Z confirms the overclock on its Graphics Card tab. I then run a burn-in test in FurMark to check stability. While the test is running I watch the GPU-Z Sensors tab for the current core clock, and it still shows the stock 648 MHz. The readouts in FurMark disagree with each other as well:
GPU-Z: core clock: 648 MHz, memory: 1188 MHz
GPU1: core: 655 MHz, memory: 1200 MHz

My question is: is the current core clock measurement in GPU-Z incorrect, or is my card not actually being overclocked? Is there a way to fix the GPU core clock reading?
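
For a third opinion independent of both GPU-Z and Precision, the clocks the driver itself reports can be queried through NVML. This is only a sketch: it assumes the nvidia-ml-py Python bindings (imported as pynvml) and a driver new enough to expose NVML, which GT200-era driver stacks may not be.

    # Sketch: ask the driver directly for the current GPU clocks via NVML.
    # Assumes the nvidia-ml-py bindings and an NVML-capable driver; the
    # GTX 275's original drivers may not support this.
    import pynvml

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
        print(f"core: {core} MHz, memory: {mem} MHz")
    finally:
        pynvml.nvmlShutdown()

If this agrees with GPU-Z's Sensors tab while under load, the sensor is probably telling the truth and it is the overclock that is not sticking.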
  1. I think I have answered my own question. It looks like my overclock was correct and the sensor GPU-Z reads just has limited resolution. I clocked up to a 660 MHz core and ran another FurMark burn-in test. This time both Precision and GPU-Z gave a measured core clock of 663 MHz, so the effective granularity looks to be about 6 MHz.
  2. That's weird, mine is usually right on. As long as it's close and working well, sounds good to me.
  3. Yeah, I now think it is just limited granularity in my overclocking software or the graphics card itself, and that GPU-Z is reporting the correct measurement (see the sketch below).
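
To make the granularity theory concrete: the card's clock generator can only produce certain discrete frequencies, so a requested clock gets rounded to the nearest one it can actually hit. Below is a minimal sketch of that idea; the table of achievable clocks is hypothetical, seeded with only the two values observed in this thread.

    # Sketch: why a requested clock "snaps" to a nearby value. The clock
    # generator only hits certain discrete frequencies, and the nearest
    # one wins. The table is hypothetical, seeded with the two clocks
    # seen in this thread -- not a real PLL frequency table.
    ACHIEVABLE_CORE_MHZ = [648, 663]

    def applied_clock(requested_mhz: int) -> int:
        """Return the achievable clock nearest the requested one."""
        return min(ACHIEVABLE_CORE_MHZ, key=lambda f: abs(f - requested_mhz))

    print(applied_clock(655))  # -> 648: a small bump rounds back to stock
    print(applied_clock(660))  # -> 663: matches the second burn-in test

Under this model the 648 to 655 MHz bump simply rounds back to the stock clock, which would explain why the GPU-Z sensor never moved on the first attempt.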