4K UHDTV (2160p) has a resolution of 3840 × 2160 (8.3 megapixels), 4 times the pixels of 1080p.
8K UHDTV (4320p) has a resolution of 7680 × 4320 (33.2 megapixels), 16 times the pixels of current 1080p HDTV, which brings it to roughly the detail level of 15/70mm IMAX. NHK advocates the 8K UHDTV format with 22.2 surround sound as Super Hi-Vision.
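The pixel counts and multipliers above can be checked with a few lines of arithmetic (a minimal sketch; the `megapixels` helper is just for illustration):

```python
# Sanity check of the quoted pixel counts for 1080p, 4K and 8K UHDTV.
def megapixels(width, height):
    return width * height / 1e6

hd    = megapixels(1920, 1080)   # 1080p HDTV
uhd4k = megapixels(3840, 2160)   # 4K UHDTV (2160p)
uhd8k = megapixels(7680, 4320)   # 8K UHDTV (4320p)

print(f"4K: {uhd4k:.1f} MP ({uhd4k / hd:.0f}x 1080p)")  # 8.3 MP, 4x
print(f"8K: {uhd8k:.1f} MP ({uhd8k / hd:.0f}x 1080p)")  # 33.2 MP, 16x
```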
I can't see GPUs powering all those pixels for at least 4 years, just for 2160p; I'm guessing it's going to need at least 10 GB of VRAM (32 GB for 4320p). Also, stating the obvious, it will probably take game developers 6-8 years to catch on, unless the PS5 and whatever the next Xbox is called support those resolutions.
Just curious what everyone else thinks. I'm looking forward to OLED and Ultra HD.
Yes, I am pretty sure there are GPUs out there that can already handle 4K. I read an article about the new 4K TVs, and I'm pretty sure they specifically mentioned that either a single GPU or a not-too-crazy SLI setup was driving the image.
Supporting output of 3840x2160 is different from being able to render a game at playable framerates at that resolution. However, at those resolutions on desktop monitor sizes (32-inch diagonal or smaller), anti-aliasing might seem unnecessary. For me, anti-aliasing already seems unnecessary on a 2560x1440 27-inch monitor, and 3840x2160 is even denser. Handling 3840x2160 with no AA might not be that far off.
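The pixel-density comparison behind that point is easy to work out with the standard PPI formula (pixels along the diagonal divided by the diagonal in inches); the `ppi` helper here is just an illustration:

```python
# Pixel density: a 32" 4K panel is noticeably denser than a 27" 1440p one,
# which is why aliasing artifacts get harder to see without AA.
import math

def ppi(width, height, diagonal_inches):
    return math.hypot(width, height) / diagonal_inches

print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')
print(f'32" 3840x2160: {ppi(3840, 2160, 32):.0f} PPI')
```

So a 32-inch 4K screen packs in roughly a quarter more pixels per inch than the 27-inch 1440p monitor mentioned above.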
Edit: after looking up some benchmarks for 5760x1080 (triple-screen), I think it might already be possible to get decent framerates at 3840x2160 with four or more latest-generation GPUs, like two GTX 690s tied together (if that's possible).
"Supporting output of 3840x2160 is different from being able to render a game at playable framerates with that resolution."
If my 6850 CF can barely play BF3 at 1080p on max settings, I have huge doubts anything can play BF3 on max settings at 4 times the resolution. The 690 would probably run into a VRAM bottleneck (speculating; the 690 has 4x the VRAM of the 6850). Surely the extra resolution means more work for the GPU; it's technically 4x the pixels that need to be processed.
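To put rough numbers on the "4x the pixels" point: the color framebuffer itself is small even at 8K; the real cost is that every pixel gets shaded, and render targets, depth buffers and MSAA samples all scale with resolution too. A back-of-the-envelope sketch (assuming a 32-bit color buffer, i.e. 4 bytes per pixel, and ignoring textures and G-buffers, which dominate actual VRAM use):

```python
# Raw color-buffer size at each resolution. The 4x/16x scaling applies to
# every per-pixel cost (shading, depth, MSAA), not just this one buffer.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "2160p": (3840, 2160),
                     "4320p": (7680, 4320)}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per color buffer")
```

The buffer at 2160p is exactly 4x the 1080p one, matching the pixel-count ratio; whether a given card bottlenecks on VRAM depends far more on texture resolution and the engine's render-target count than on the swapchain itself.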
BTW, the discussion is about gaming on the Ultra HD TVs.
Also, I wonder if the higher resolution would make games look better, or if the engine they're built on would be the visual limitation.
Well, I too think it will take some time before single GPUs can run the latest games at such a resolution. But not because GPUs will lack the performance; it's because that kind of performance isn't needed in most cases. Most people won't be able to afford a 4K display for several years yet, so I believe Full HD will remain the standard resolution for some years to come.
As long as Full HD is the standard resolution, GPUs won't be made powerful enough to drive such displays at the highest settings with an acceptable framerate in the latest games.