GTX 1080 for 1440p @ 60Hz

G

Guest

Guest
Do you think a GTX 1080 would be a waste for a 60Hz 1440p display? The GTX 1070 seems to handle it more than fine, but I'd like it to last quite long without upgrading... Thoughts?
Another question: If the framerate is limited to 60 by V-Sync and the card's not running at 100%, it would lower the temps, right?
 
Solution
Unless you plan to upgrade every couple of years, I think the 1080 is a good fit for a 1440p monitor. As for V-Sync: yes, you should see a decrease in temps as long as the card's load isn't maxed out. And if you later switch your monitor to 4K, you'd still be good to go.

CBender

Reputable
Dec 30, 2015
1,018
1
5,960
Unless you plan to upgrade every couple of years, I think the 1080 is a good fit for a 1440p monitor. As for V-Sync: yes, you should see a decrease in temps as long as the card's load isn't maxed out. And if you later switch your monitor to 4K, you'd still be good to go.
 
Solution
G

Guest

Guest


Yeah, that's exactly what I was thinking. I really need it to last long term, and it should hold up better as games get more demanding.
About the V-Sync: how exactly does it keep the framerate at 60? Does it lower the clock speed (underclock the card), or something else?
 

CBender

Reputable
Dec 30, 2015
1,018
1
5,960
I don't think it works that way. I think a framerate threshold is 'programmed' at the software/game level. That results in a lighter load, so in the end the clock won't reach the same boost speed as it would with an 'open' (uncapped) framerate. Could be wrong, though.
 
V-Sync syncs your GPU up to your monitor so every frame is sent at the right time and there is no screen tearing. The number of synced frames depends on your monitor's refresh rate (example: 60 Hz), though if the GPU fails to consistently deliver that number (say you drop to 45 fps) you will run into stutter and a not-very-smooth experience.

A non-V-Synced game lets the GPU throw its maximum load of frames at the screen, but in most cases this is a waste: if you had a 60 Hz monitor and the GPU was producing 100+ frames per second, then 40+ of those frames are wasted, since the monitor can only display 60 of the 100+ being given.
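To make the "lighter load means lower temps" idea concrete: a simple software frame limiter can be sketched in a few lines. This is only an illustration of the capping principle, not how V-Sync itself is implemented (real V-Sync lives in the driver/display pipeline), and `render_frame` is a hypothetical stand-in for actual rendering work.

```python
import time

TARGET_FPS = 60                      # matches a 60 Hz monitor
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame

def render_frame():
    """Hypothetical stand-in for real rendering work."""
    time.sleep(0.005)                # pretend this frame takes ~5 ms

frame_times = []
for _ in range(10):                  # render a few capped frames
    frame_start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the 16.7 ms budget. This idle
    # time is exactly why a capped GPU does less work and runs
    # cooler than one pushing 100+ uncapped frames per second.
    leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
    if leftover > 0:
        time.sleep(leftover)
    frame_times.append(time.perf_counter() - frame_start)

avg = sum(frame_times) / len(frame_times)
print(f"average frame time: {avg * 1000:.1f} ms (~{1 / avg:.0f} fps)")
```

With the cap in place, the GPU only works for ~5 ms of each ~16.7 ms frame and idles the rest, which is the reduced load the posts above describe.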