HELP, GTX 970 is underclocking with K-Boost on; EVGA Precision X 16 showing incorrect values?

BobCharlie

Distinguished
Sep 2, 2011
221
1
18,710
Card is a GTX 970 (MSI Gaming 4G) on Win7 64 Ultimate. I'm running K-Boost with the current EVGA Precision X 16 build and have the most recent Nvidia driver, but something isn't right.


When you look at EVGA Precision X 16, the value right in the upper center shows what the core speed is supposedly at, and it's listing the same 1328 MHz it has always listed with K-Boost on.

What I noticed, however, is that when you mouse over the core increment slider, the lower portion of the circle changes to show extra info on what you're adjusting. If adjusting core, for example, it shows "current value" and "adjusted value" so you can see where the core speed is going. The problem is that it lists the core as being 214 MHz SLOWER (1114 MHz vs. the 1328 MHz reported above, directly under where it says "GPU CLOCK").

GPU-Z is also listing 1114 MHz next to GPU Clock under the main Graphics tab, but under the Sensors tab it shows the higher 1328 MHz value. Under the "Boost" category it lists 1253 MHz. To make matters more confusing, EVGA's white and red arrows that indicate core/boost seem to mirror what GPU-Z is listing.


Any idea what's happening? I've been playing Fallout 4 and noticed some textures are slow to fill in at times, plus some heavy FPS drops where there really shouldn't be any. The game auto-detects everything as Ultra with God Rays on High at 1920 x 1080. Temps are 34 C idle, high 40s to low 50s under load (I have a big fan blowing into the case).
 
What you are seeing looks normal.

The boost clock is set before the card even leaves the factory. Companies bin cards to a boost clock of at least what the package says, but many times the card will boost even higher.

This is why people sometimes call it a lottery. I had two identical cards and one would boost higher. No two cores are the same, so you are just getting a higher boost than average for the card.

The actual boost clock being higher than what GPU-Z shows should not be an issue, because this was set at the factory (on a per-chip basis).

Fallout 4 is rather new and may take a few driver revisions, as well as game patches, to get things all smoothed out.
 

BobCharlie

Distinguished
Sep 2, 2011
221
1
18,710
I understand the hard-coded static values in the card; what I don't understand is why it's showing 200+ MHz less core in real time. If I raise the core, the 1114 increases as well as the 1328. Example: I add +10 core, the 1328 changes to 1338 (which is expected) and the 1114 increases to 1124 (even though it's ~200 less, it increments at a 1:1 ratio and isn't remaining static).
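The 1:1 behavior described above is consistent with how a single core offset works: the offset is added to every point on the card's clock table, so the base clock readout and the boosted clock shift together. A minimal sketch of that arithmetic, using the values from this thread (the `apply_offset` helper is illustrative, not part of Precision X or GPU-Z):

```python
# Sketch: a single core offset shifts every reported clock 1:1.
# Clock values are the ones reported in this thread; apply_offset
# is a hypothetical helper, not any tool's actual API.

BASE_CLOCK = 1114    # MHz, GPU-Z "GPU Clock" on the Graphics tab
RATED_BOOST = 1253   # MHz, GPU-Z "Boost" value
ACTUAL_BOOST = 1328  # MHz, real-time clock with K-Boost on

def apply_offset(clocks, offset_mhz):
    """Add the same core offset to every clock in the table."""
    return {name: mhz + offset_mhz for name, mhz in clocks.items()}

clocks = {"base": BASE_CLOCK, "rated_boost": RATED_BOOST,
          "actual_boost": ACTUAL_BOOST}
shifted = apply_offset(clocks, 10)  # +10 core, as in the post

print(shifted["base"])          # 1124, matching the slider readout
print(shifted["actual_boost"])  # 1338, matching the GPU CLOCK readout
```

So the slider's "current value" tracking the base clock while the top readout tracks the boosted clock would both move by exactly the offset applied, which matches what's described.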
 

BobCharlie

Distinguished
Sep 2, 2011
221
1
18,710
I googled some GPU-Z validation submissions, and apparently 1114 is what others with the same card have. I just don't remember this value changing while an OC was being applied.

To be clear: in my case, without an actual OC, the 1328 MHz shown with K-Boost on is the correct, current speed?
 

BobCharlie

Distinguished
Sep 2, 2011
221
1
18,710
Yeah, it was throwing me for a loop. I thought it listed the "default" speeds as 1114 and 1253 (boost), and was then supposed to show the current speeds in the real-time boxes. As it sits now, it lists default speeds as real-time speeds for some reason and reflects changes as if the default clocks were being adjusted directly, then shows the true speed elsewhere. Makes no sense.
 