Run game resolution higher

KingFaris10

Reputable
Nov 23, 2014
Hi,
My computer can run all games at 1080p/60FPS. I play Black Ops 3 often, and I noticed that even at render resolutions higher than 1080p I'm still getting 60FPS. However, I'm not sure whether I should actually use these higher render resolutions: I don't know if it will make my graphics card run hotter than it would at 1080p, or shorten the card's lifespan more quickly. I usually stay at 1080p because of these concerns, but I'm not sure they're even valid... Can anyone give advice on whether I should raise the render resolution or not?

Specs:
Intel i7 4790K (Overclocked to 4.6GHz)
4GB NVIDIA GTX 970
 
Solution
What is the native resolution of your monitor? If you are using a resolution that is higher than the resolution of your monitor, then you are using DSR (nVidia's name for it). If your monitor is capable of a resolution higher than 1080p, then you should almost always run at the native resolution (interpolation sucks) unless your graphics card is incapable of rendering high enough framerates at that resolution.

As to running hotter: it will. If you're hitting 60Hz at 1080p versus 60Hz at a higher resolution, your GPU usage will be higher at the higher resolution, and as GPU usage goes up, so does the heat output. However, if your temperatures are manageable and you don't find the added fan noise annoying, it won't hurt anything. The GPU will throttle itself if temps climb too high. Obviously having it throttle continuously isn't good for the longevity of your graphics card, but it's not going to damage it the first time it happens. So if you run the higher resolution, monitor your temps and make sure they aren't getting out of hand.
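
If you want to log that rather than eyeball an OSD, here's a minimal sketch in Python (assuming an NVIDIA card with nvidia-smi on the PATH, which the GeForce driver installs) that polls temperature and GPU usage once a second:

# poll_gpu.py - log GPU temperature and utilization once per second.
# Assumes nvidia-smi is on the PATH (it ships with the NVIDIA driver).
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu",
         "--format=csv,noheader,nounits"]

while True:
    # Output looks like "71, 62" -> temperature in C, utilization in %.
    temp_c, util_pct = subprocess.check_output(QUERY, text=True).strip().split(", ")
    print(f"GPU temp: {temp_c} C | GPU usage: {util_pct} %")
    time.sleep(1)

Run it windowed while the game is going; if the temperature plateaus somewhere sane, you're fine.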
 
Solution
Yeah, so you should expect a small amount of additional heat due to the increased load you're placing on the GPU. If you use Afterburner with the OSD and monitor GPU usage, you'll see a given usage at 1080p, and under the same conditions with DSR you'll see higher usage. However, assuming the fans on your graphics card aren't already running at 100% at 1080p, they should compensate and ramp up under the higher load when you're running your DSR resolution. So you may not in fact be running any hotter, just noisier.

Personally, I think DSR is OK. nVidia has done a pretty decent job with it and its associated filtering. It works particularly well in games that have limited or no AA settings but support DSR.
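
For anyone unsure what DSR actually renders at: the factor applies to the total pixel count, so each axis scales by the square root. A quick sketch (the factor list is the standard set NVIDIA exposes; the arithmetic is the only point here):

# Render resolutions for NVIDIA's standard DSR factors from a 1920x1080 panel.
# The factor multiplies the total pixel count, so each axis scales by sqrt(factor).
from math import sqrt

NATIVE_W, NATIVE_H = 1920, 1080
for factor in (1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00):
    w = round(NATIVE_W * sqrt(factor))
    h = round(NATIVE_H * sqrt(factor))
    print(f"DSR {factor:.2f}x -> roughly {w}x{h}")
# 4.00x from 1080p is 3840x2160 (4K) rendered, then filtered back down to 1080p.
# NVIDIA rounds to clean modes, e.g. 1.78x shows up as 2560x1440.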
 
1) Your GPU fan will simply ramp up to keep temp down, or worst-case scenario throttle down the frequency. It's a non-issue.

2) In general I didn't find DSR useful, but then I have a 2560x1440 monitor so I just run that.

3) More heat?
Not that it matters much, but a higher resolution does NOT necessarily mean more heat. It may even mean less. If you run a lower resolution and don't have a cap set (like VSYNC), the GPU will simply render MORE FRAMES per second.

The GPU will simply keep running as fast as it's capable of if you don't bottleneck it in some way. Admittedly, it's POSSIBLE that running a higher resolution will be slightly hotter, but that would most likely be because the bottleneck shifts from the CPU to the GPU (if the CPU is bottlenecking things, the GPU is waiting for commands and thus runs cooler). The rough numbers sketched below illustrate the point.

You have an i7-4790K so CPU bottlenecking is going to be pretty rare.
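
To put rough numbers on point 3 (the pixel counts are exact; the framerates are made up purely to illustrate, not GTX 970 benchmarks):

# Uncapped, the GPU pushes roughly the same pixels per second either way;
# it just splits them into more frames at the lower resolution.
scenarios = {
    "1920x1080 uncapped": (1920 * 1080, 120),  # more frames, fewer pixels each
    "2560x1440 uncapped": (2560 * 1440, 67),   # fewer frames, more pixels each
}
for name, (pixels_per_frame, fps) in scenarios.items():
    print(f"{name}: {pixels_per_frame * fps / 1e6:.0f} megapixels/second")
# Both land near ~250 Mpx/s: the GPU runs flat out in both cases, so similar heat.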

Other:
I see you overclocked to 4.6GHz, but if you run Prime95 and monitor Task Manager, what does your CPU frequency actually show? (under "Speed", next to "Utilization")

I ask because on my i7-3770K, even with a light overclock applied, the core multipliers still drop by 100MHz steps as load increases (the defaults are 39x with 1 core loaded, 38x with 2, 37x with 3, and 36x with all 4).

*I did a light overclock to 4.2GHz, but also manually changed the MULTIPLIERS so that I never drop below 4.1GHz (shows as 4.06GHz in Task Manager).

So..
If that's the case for YOU, then the default setup (4.4GHz max Turbo) may be giving you similar performance anyway. Maybe try a light overclock with no voltage adjustment (to keep heat and noise minimal), raise the multipliers manually, and test stability.

If you can hold 4.4GHz under full load, you may even end up slightly FASTER than you are now (if your 4.6GHz setting is drooping under load) with reduced temps, but I don't know your exact setup (it's not really a huge deal though).
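
If you'd rather log that Prime95 run than stare at Task Manager, psutil can sample the reported clock. A sketch (psutil is a third-party package, and on some systems it reports the base clock rather than the live turbo clock, so sanity-check it against Task Manager):

# log_cpu_clock.py - sample CPU frequency and load while Prime95 runs.
# Requires: pip install psutil
import psutil

for _ in range(30):
    load = psutil.cpu_percent(interval=1.0)  # averaged over the 1 s sample
    freq = psutil.cpu_freq()                 # current/min/max in MHz
    print(f"{freq.current:7.0f} MHz at {load:5.1f}% load")
# If the MHz figure sags as load climbs, your multipliers are dropping under load.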
 


If you're running noisier, then you ARE running hotter. The fans are based on GPU temperature.
 


If he's at 60Hz at both resolutions, he is hitting VSYNC, which means he isn't running at 100% GPU usage at either resolution, though his GPU usage will be higher at the DSR resolution. If the GPU usage is higher, it's only logical that it's drawing more power. So either it's going to run hotter or the fan is going to run faster to keep up.

It's the same principle as turning off VSYNC and running at a higher FPS than the refresh rate of the monitor, except that in this case both resolutions are capped at 60FPS / 60Hz, but the higher resolution is pushing the GPU harder. Any time you run a higher resolution at the same FPS (assuming the same detail settings for both), the GPU has to work harder to render it. It won't render those extra pixels for free.
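
The arithmetic behind that, if it helps: at a fixed 60FPS cap, the per-second pixel work scales directly with resolution (detail settings assumed identical):

# Same 60 FPS cap, different render resolutions.
FPS_CAP = 60
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p DSR": (2560, 1440),
                     "4K DSR": (3840, 2160)}.items():
    print(f"{name}: {w * h * FPS_CAP / 1e6:.0f} megapixels/second")
# ~124, ~221, and ~498 Mpx/s respectively: FPS is identical, but the 4K frames
# aren't free, so GPU usage and power draw climb with the resolution.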
 


Fine, but I'm speaking in general and not assuming he uses VSYNC in every game. I also discussed VSYNC as a cap, so there's really no need to point that out.
 
I made the assumption (correctly, I believe) that he has VSYNC enabled, since he is hitting 60FPS at both resolutions (and there is no way he's CPU bottlenecked with a 4790K @ 4.6GHz). Even if he were using RTSS to cap at 60FPS, the situation would be the same. As long as the GPU usage isn't at 100% at both resolutions, the GPU is going to have to work harder at the higher resolution. And how do we know the GPU usage isn't at 100%? Because the framerate holds the 60FPS cap at both resolutions; if the GPU were maxed out, it would drop below the cap.