GPU-Z and EVGA Precision give different values for GPU clock speed

jamesa82

Reputable
Nov 17, 2014
66
0
4,630
Hello everyone, I ran into an interesting situation and was hoping I could get some answers.
I was messing around with the fan speeds on my EVGA GTX 970 FTW, hoping to decrease the temperature slightly (it was reaching the high 70s). Once I was happy with the fan speed, I looked at the GPU clock offset but didn't change it, because I'm happy with the performance at the moment. I did notice, however, that the current clock speed, even after resetting to default, reads 1215 MHz rather than 1216 MHz. I know it's only 1 MHz, but it still confused me slightly. I checked in GPU-Z, and it shows the value I expected to see and the value listed on EVGA's website for the clock speed: GPU-Z says I'm running at 1216 MHz, i.e. the correct value.
So why are there two different readings, which one is right, and is there anything to worry about?
If anyone has experience with EVGA Precision and has seen something similar, I'd really appreciate their thoughts. I know it's minor, but it's quite an expensive card and I want to make sure everything is okay.
Thanks in advance for any help you can give.
 
Solution
Nothing to worry about. It comes down to how each program's developers set up the frequency-monitoring algorithm: some decimal value in the code is probably slightly different between the two programs. Think of it as a rounding-up/rounding-down variation.

If you have CPU-Z, you'll note that it displays your CPU speed to three decimal places (3.904 GHz, for example, on a CPU rated at 3.9 GHz).
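As a minimal sketch of the idea: if the driver reports a fractional clock value (the 1215.6 MHz figure below is purely an assumed example, not a measured reading from either tool), one program truncating to whole MHz and another rounding to the nearest MHz would show 1215 and 1216 respectively.

```python
import math

# Assumed raw clock value from the driver, in MHz (hypothetical example).
measured_mhz = 1215.6

# One tool might truncate (drop the fraction) when displaying whole MHz...
truncated = math.floor(measured_mhz)

# ...while another rounds to the nearest whole MHz.
rounded = round(measured_mhz)

print(truncated)  # 1215
print(rounded)    # 1216
```

The same underlying value, displayed two slightly different ways, accounts for the 1 MHz disagreement without either tool being "wrong."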