I know there are many threads around the web asking the same or a similar question, but none of them give a definitive answer, so I thought I'd ask you guys: is Afterburner reporting my core clock wrong, or is it GPU-Z? Also, why does my voltage drop when I'm running a game? I'm on a laptop with a GTX 670MX 3GB, overclocked with Afterburner by +135 MHz on the core and +900 MHz on the memory. I want to know if my OC settings are doing more harm than good. Here are some screenshots, using Battlefield 4 as an example.
I apologize in advance if I sound noobish when asking/describing things, as I'm only somewhat knowledgeable but not at the level that many of you guys are.
The first 2 are just so you can see the graphics settings (I set everything to Ultra to make sure my GPU was at 100% load) and Afterburner's RivaTuner OSD readout:
[screenshots: BF4 graphics settings and the Afterburner/RivaTuner OSD]
The next 2 are a comparison between Afterburner and GPU-Z, plus GPU-Z's info page for my GPU. BF4 is windowed so you can see all three readouts at once; ignore the framerate, as it isn't representative of real performance (the game is running at a lower, windowed resolution):
[screenshots: Afterburner vs. GPU-Z side by side, and GPU-Z's GPU info page]
This final one shows my voltage at its maximum; compare it to the reading while BF4 is running (3rd image):
I'm not sure what to make of this. Is my GPU underclocking, is it getting too hot and pulling back on power, or is it something else? By the way, although my temps read 77-78 °C here, they're usually at 81-83 °C when I'm playing multiplayer.
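In case it helps, here's a minimal logging sketch I could run alongside BF4 (assuming the NVIDIA driver on this laptop exposes NVML and the `pynvml` Python package works with the 670MX) to sample the core/memory clocks and temperature once a second, so I can see whether the core clock actually dips as temps climb instead of just comparing two tools that disagree:

```python
# Minimal NVML logging sketch (assumes NVML is available on this driver and
# that `pip install nvidia-ml-py` provides the pynvml module). Samples core
# and memory clocks plus GPU temperature once a second.
import time
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetClockInfo,
    nvmlDeviceGetTemperature,
    NVML_CLOCK_GRAPHICS,
    NVML_CLOCK_MEM,
    NVML_TEMPERATURE_GPU,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first (only) GPU in this laptop
    for _ in range(300):                 # log for roughly 5 minutes of gameplay
        core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)     # MHz
        mem = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_MEM)           # MHz
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # °C
        print(f"core={core} MHz  mem={mem} MHz  temp={temp} C")
        time.sleep(1)
finally:
    nvmlShutdown()
```

If the core clock in that log drops at the same moments the temperature climbs into the low 80s, that would point at thermal/power throttling rather than one of the tools simply reporting the clock wrong.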