Card is a GTX 970 (MSI Gaming 4G) on Windows 7 64-bit Ultimate. I'm running K-Boost with the current EVGA Precision X 16 build and the most recent NVIDIA driver, but something isn't right.
When you look at EVGA Precision X 16, the value in the upper center shows what the core speed is supposedly at, and it's listing the same 1328 MHz it has always listed with K-Boost on.
What I noticed, however, is that when you mouse over the core increment slider, the lower portion of the circle changes to show extra info about what you're adjusting. If you're adjusting the core, for example, it shows a "current value" and an "adjusted value" so you can see where the core speed is going. The problem is that it lists the core as being 214 MHz SLOWER (1114 MHz vs. the 1328 MHz reported above, directly under where it says "GPU CLOCK").
GPU-Z is also listing 1114 MHz next to "GPU Clock" on the main Graphics Card tab, but on the Sensors tab it shows the higher 1328 MHz value. Under the "Boost" field, it lists 1253 MHz. To make matters more confusing, Precision X's white and red arrows that indicate core/boost seem to mirror what GPU-Z is listing.
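To take both utilities out of the equation, I figured I could read the clock straight from the driver via NVML. Here's a minimal sketch of what I had in mind (assuming Python with the pynvml bindings installed via "pip install pynvml"; device index 0 is my assumption for a single-card system):

```python
# Minimal sketch: query the current and max graphics clock directly from NVML,
# bypassing Precision X and GPU-Z entirely.
# Assumes pynvml is installed and the 970 is device index 0.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):  # older pynvml versions return bytes
        name = name.decode()
    current = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    maximum = pynvml.nvmlDeviceGetMaxClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    print(f"{name}: current core clock {current} MHz, max {maximum} MHz")
finally:
    pynvml.nvmlShutdown()
```

If NVML reports 1328 MHz while GPU-Z's main tab says 1114 MHz, that would point to the main tab simply showing the stock/default clock rather than the live one.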
Any idea what's happening? I've been playing Fallout 4 and noticed some textures are slow to fill in at times, and I'm also getting some heavy FPS drops where there really shouldn't be any. The game auto-detects everything at Ultra with God Rays on High, running 1920x1080. Temps are 34 C idle and high 40s to low 50s under load (I've got a big fan blowing into the case).
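Since the FPS drops are intermittent, my other thought was to log the clock, temperature, and load once a second while the game runs, so I can line the drops up against what the card is actually doing. A rough polling sketch along the same lines (same assumptions as above: pynvml installed, GPU at device index 0; the one-second interval is arbitrary):

```python
# Rough sketch: poll core clock, temperature, and GPU load once a second
# while the game runs, to check whether the card downclocks when FPS drops.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        load = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu
        print(f"{time.strftime('%H:%M:%S')}  core {clock} MHz  {temp} C  load {load}%")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```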