What does 1080p gaming mean?

idolize

Honorable
Jul 23, 2013
I know 1080p means 1920x1080 resolution on one monitor.
But people always say "that really expensive high end card is really good... but this medium card is fine for 1080p"
Does this mean that if you're not running more than one monitor, a mid-range graphics card (GTX 760) can typically max out the graphics settings at 60 fps?
 
Solution
1080p just means 1080p, nothing special or mystical about it. Typically a card like a 760 is enough for what most people want at 1080p, though it won't max everything. Some games like Crysis 3 and Metro LL will only run maxed out on SLI setups or a 780/Titan. Usually you should only go faster than a 760 or 770 if you need it for 1440p or 3D or something.
 

adamsunderwood

Honorable
Jul 18, 2012
Basically, the higher the resolution your game is being rendered at, the more pixels there are and the harder each frame is for the graphics card to render. Of course a vast number of other factors also affect how much difficulty a graphics card has rendering a game, but basically 1080p, or 1920x1080 resolution, is going to be significantly more stressful for the GPU than 720p, or 1280x720 resolution, because you have ((1920 × 1080) − (1280 × 720)) × 24 × 60 more bits of data being handled every second, given the exact same settings for textures, shadows, etc.
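
If you want to plug those numbers in yourself, here's a quick back-of-the-envelope sketch (assuming 24-bit color and a steady 60 fps, as in the post above):

```python
# Back-of-the-envelope: extra raw bits per second at 1080p vs 720p,
# assuming 24-bit color and 60 frames per second.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_720p = 1280 * 720     #   921,600 pixels

bits_per_pixel = 24          # 8 bits each for R, G, B
fps = 60

extra_bits_per_second = (pixels_1080p - pixels_720p) * bits_per_pixel * fps
print(f"{extra_bits_per_second:,} extra bits per second")  # 1,658,880,000
```

That works out to roughly 1.66 gigabits of extra raw pixel data per second, before any anti-aliasing or post-processing is counted.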
 

mbreslin1954

Distinguished
Yeah, 1080p makes your graphics card generate 1920x1080 = 2,073,600 pixels (roughly 2 megapixels) per frame. Multiply that by four bytes (32 bits of color per pixel) and you have about 8 MB of data for every screen refresh (60 times a second).

Compare that to 720p, which is 1280x720 = 921,600 pixels, roughly 0.9 megapixels. That makes about 3.7 MB 60 times a second.

So 1080p more than doubles (2.25x) the raw data the video card has to generate compared to 720p, not counting other effects such as anti-aliasing, etc. That's why increasing the screen resolution increases the load on the video card.
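
If you want to check that math, here's a minimal sketch (assuming 4 bytes per pixel and a 60 Hz refresh, as above):

```python
# Rough framebuffer math: 32 bits (4 bytes) per pixel, 60 screen refreshes per second.
def frame_bytes(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel

for name, (w, h) in {"1080p": (1920, 1080), "720p": (1280, 720)}.items():
    per_frame = frame_bytes(w, h)
    per_second = per_frame * 60
    print(f"{name}: {per_frame / 1e6:.1f} MB per frame, {per_second / 1e6:.0f} MB per second")

# 1080p: 8.3 MB per frame, 498 MB per second
# 720p:  3.7 MB per frame, 221 MB per second  (1080p is 2.25x the raw data)
```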
 
Solution