
Overclocking vs. GPU-Z current core clock measurement

March 6, 2011 3:07:38 AM

My EVGA NVIDIA GTX 275 Superclocked (stock core clock: 648 MHz, memory: 1188 MHz, shader: 1458 MHz) is starting to show its age, so I am starting to overclock it. I have read a lot of articles over the past couple of weeks and have picked up a lot of free software such as GPU-Z, FurMark, 3DMark, etc.

I am currently using EVGA Precision to manage my clocks and fan speeds and have boosted my clocks a small amount (core: 655 MHz, memory: 1200 MHz, shader: 1474 MHz). GPU-Z confirms the overclock on its Graphics Card tab. I then run a burn-in test in FurMark to test stability. While the test is running, I watch the GPU-Z sensors for the current core clock measurement, and it still shows the stock clock of 648 MHz. The FurMark data also disagrees with itself:
GPU-Z: core clock: 648 MHz, memory: 1188 MHz
GPU1: core: 655 MHz, memory: 1200 MHz

My question is whether the current core clock measurement in GPU-Z is incorrect, or whether my card is not actually being overclocked. Is there a way to fix the core clock measurement?
March 6, 2011 3:27:39 AM

I think I have answered my own question. It looks like my overclock was applied correctly and the sensor GPU-Z reads just has limited sensitivity. I clocked up to a 660 MHz core and ran another FurMark burn-in test. This time both Precision and GPU-Z gave a measured core clock of 663 MHz, so it looks like the sensor has a differential sensitivity of ~6 MHz.
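For anyone who hits the same thing: the behavior is consistent with the driver snapping the requested clock to the nearest frequency the card's PLL can actually generate, rather than the sensor being inaccurate. Here is a minimal Python sketch of that rounding. The 648 MHz base and the 15 MHz step are assumptions chosen only because they reproduce the two readings in this thread (655 reading back as 648, 660 reading back as 663); the GTX 275's real clock granularity may differ.

# Hypothetical illustration of clock quantization: the driver snaps a
# requested core clock to the nearest frequency its PLL can generate.
# Base and step values are assumptions for illustration only.

def snap_clock(requested_mhz, base_mhz=648.0, step_mhz=15.0):
    """Round requested_mhz to the nearest point on the grid
    base_mhz + k * step_mhz (k an integer)."""
    k = round((requested_mhz - base_mhz) / step_mhz)
    return base_mhz + k * step_mhz

for req in (655, 660):
    print(f"requested {req} MHz -> sensor reads {snap_clock(req):.0f} MHz")

# Output:
# requested 655 MHz -> sensor reads 648 MHz  (closer to the stock step)
# requested 660 MHz -> sensor reads 663 MHz  (snaps up to the next step)

If that model is right, small overclock requests that land closer to the stock step than the next one simply do nothing, which would explain why the 655 MHz setting never showed up in the sensors.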
March 8, 2011 8:19:06 PM

That's weird, mine is usually right on. As long as it's close and working well, sounds good to me.
April 6, 2011 9:41:13 PM

Yeah, I now think it is actually just limited granularity in my overclocking software or the graphics card itself, and that GPU-Z has the correct measurement.