
How Has Nvidia Managed to Push 60Hz 4K Over HDMI 1.4?

By Niels Broekhuijsen - Source: Nvidia Forum | 26 comments

Nvidia has used an old trick to compress video images in order to get a 60 Hz 4K signal over the limited HDMI 1.4 interfaces on Kepler cards.

Normally, in order to run a 4K signal over HDMI at 60 Hz, you’d need an HDMI 2.0 connection. 'Normally' isn’t really the right word, though, as you cannot yet buy any graphics cards with an HDMI 2.0 output. However, it seems that the GeForce 340.43 Beta driver release has made it possible to push 4K at 60 Hz out of Kepler-based cards, despite the lack of an HDMI 2.0 interface (the Kepler cards are HDMI 1.4).

So, what’s the trick? The answer is chroma subsampling. Rather than using the standard YCbCr 4:4:4 or RGB sampling, Nvidia set the system to use 4:2:0 sampling. The image retains all of its brightness information, but the colour information of each 2x2 block of four pixels is merged into a single sample. With only a quarter of the colour information, the signal is a lot smaller. As a result, Nvidia has managed to squeeze an entire 4K 60 Hz signal over the 8.2 Gb/s of bandwidth available to HDMI 1.4 interfaces.
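As a rough sanity check (these are back-of-the-envelope figures, not Nvidia's own numbers), the sketch below estimates the raw pixel data rate for 3840x2160 at 60 Hz with full 4:4:4 sampling versus 4:2:0, assuming 8 bits per sample and ignoring blanking intervals and link encoding overhead:

    # Back-of-the-envelope data rate for 3840x2160 at 60 Hz.
    # Assumes 8 bits per sample; ignores blanking intervals and TMDS encoding overhead.

    WIDTH, HEIGHT, REFRESH = 3840, 2160, 60
    BITS_PER_SAMPLE = 8

    def data_rate_gbps(samples_per_pixel):
        """Raw video data rate in Gb/s for a given average number of samples per pixel."""
        bits_per_frame = WIDTH * HEIGHT * samples_per_pixel * BITS_PER_SAMPLE
        return bits_per_frame * REFRESH / 1e9

    # 4:4:4 (or RGB): one Y, one Cb and one Cr sample per pixel -> 3 samples/pixel.
    # 4:2:0: one Y per pixel, Cb and Cr shared by each 2x2 block -> 1 + 2/4 = 1.5 samples/pixel.
    print(f"4:4:4: {data_rate_gbps(3.0):.1f} Gb/s")  # ~11.9 Gb/s, more than HDMI 1.4 can carry
    print(f"4:2:0: {data_rate_gbps(1.5):.1f} Gb/s")  # ~6.0 Gb/s, fits within the 8.2 Gb/s limit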


Of course, this does come at a cost. Take a look at the image above. On the left, we've got a 4:4:4 sample with all of the colour and brightness information. On the right, we have the 4:2:0 sample with only a quarter of the colour information. Because all of the brightness information is still present, each pixel in the resulting image can still be distinguished from its neighbours. However, because the colour resolution is cut to a quarter, there will still be visible distortions. This isn't very noticeable with moving pictures, but too much information is lost when it comes to something like text: text won't appear clearly, so using this mode for a normal desktop environment will be awful, as a lot of colours will appear very distorted.

YCbCr is a way of transmitting images where the luminance (Y) and the red-difference and blue-difference chromas (Cb and Cr) are sent as separate components, as you can see in the image below. Separating the signal this way makes it possible to reduce the colour information independently of the brightness, and because the human eye is far more sensitive to changes in brightness than to changes in colour, a full 4:4:4 sample looks no different from RGB. When a 4:2:0 sample is used, however, significantly less colour information is transmitted. Using the two images provided (and maybe some imagination), you can see why that's not that big a deal for video (a lot of video files are even encoded with the 4:2:0 preset in order to reduce the file size simply because you'll hardly notice a difference), but it will be for text.
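To make the idea concrete, here is a minimal sketch (an illustration, not Nvidia's driver code) of what 4:2:0 subsampling does: the luma plane keeps one sample per pixel, while each chroma plane is averaged over 2x2 blocks; the BT.709 conversion coefficients are assumed.

    import numpy as np

    def rgb_to_ycbcr(rgb):
        """Split an HxWx3 RGB image (values 0..1) into Y, Cb, Cr planes (BT.709)."""
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.2126 * r + 0.7152 * g + 0.0722 * b
        cb = (b - y) / 1.8556
        cr = (r - y) / 1.5748
        return y, cb, cr

    def subsample_420(plane):
        """Average each 2x2 block into a single chroma sample (4:2:0)."""
        h, w = plane.shape
        return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    def upsample(plane):
        """Stretch the reduced chroma plane back to full resolution for display."""
        return plane.repeat(2, axis=0).repeat(2, axis=1)

    # Toy 4x4 image: the luma plane keeps all 16 samples, each chroma plane drops to 4.
    rgb = np.random.rand(4, 4, 3)
    y, cb, cr = rgb_to_ycbcr(rgb)
    cb420, cr420 = subsample_420(cb), subsample_420(cr)
    print(y.shape, cb420.shape)  # (4, 4) (2, 2)

    # On the display side the chroma is stretched back out, so single-pixel colour
    # detail (coloured text edges, for example) ends up smeared across 2x2 blocks.
    cb_display, cr_display = upsample(cb420), upsample(cr420)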

Despite the loss in quality, this is a fairly effective solution for those in need. If you run the desktop at 30 Hz you’ll keep the real estate and colour accuracy, while running your games and videos at 60 Hz will mean you’ll keep the smoothness and gain 4K sharpness. It won't be perfect, but it'll be a step in the right direction.

Alternatively, you can use DisplayPort, as DisplayPort 1.2 supports a 3840x2160 resolution at 60 Hz. Nvidia's solution really should only be considered if the display you’re using does not have DisplayPort support, which is still the case for many 4K TVs.


Top Comments
  • 20 Hide
    derekullo , June 23, 2014 9:36 AM
    And so the moral of the story is don't buy a 4k TV / Monitor without display port.
Other Comments
  • 0 Hide
    Quarkzquarkz , June 23, 2014 8:25 AM
    @WiseCracker, no not 75%. Keep in mind compression loss and resolution distortion are two completely different things. In this case, the sample packets are down from 4:4 to about half so you're essentially looking at a very poor 1440p. And not even 2k at that, you would expect a range between 1k and 2k and no one respects downsampling at any rate.
  • 0 Hide
    elbert , June 23, 2014 8:36 AM
    Its more a 1k color with 4k shadows so its a bit better than 1k total.
  • -8 Hide
    Wisecracker , June 23, 2014 10:11 AM

    :lol:  36 pixels 'compressed' to 9 pixels = 4:1 = 75% reduction

    Feel free to rationalize that any way you wish in your defense of pseudo "Voodoo 4k"


  • 7 Hide
    DarkSable , June 23, 2014 11:09 AM
    Quote:

    :lol:  36 pixels 'compressed' to 9 pixels = 4:1 = 75% reduction

    Feel free to rationalize that any way you wish in your defense of pseudo "Voodoo 4k"


    If you had read his post you would have seen he was simply correcting your misinformation, not trying to defend anything.
  • 2 Hide
    Ikepuska , June 23, 2014 11:16 AM
    I object to the use of "a lot of video files are even encoded with the 4:2:0 preset in order to reduce the file size". In fact the use of chroma subsampling is a STANDARD and MOST video files including the ones on your commercially produced Blu-ray movies were encoded using it.

    This is actually a really useful corner case for things like HTPCs or if you have a 50ft HDMI to your TV or projector, because there really is no loss of fidelity. But for desktop use it's just a gimmick.
  • 1 Hide
    matt_b , June 23, 2014 11:23 AM
Well, at least on the surface Nvidia has a superior marketing claim here; no doubt that's all they care about anyway. Just let display port and HDMI v2.0 take over and do it right, no sense in milking the old standard that can't.
  • 9 Hide
    Keyrock42 , June 23, 2014 11:36 AM
    Anyone hooking up a 4K TV or monitor really should do their research and make sure it has a display port input (why some 4k TVs or monitors are even manufactured without a display port input is beyond me). That said, it's nice that there is at least a dirty hack like this available for those that need to connect to a 4K TV/Monitor via HDMI. It's far from ideal, but better than nothing, I guess.
  • -3 Hide
    thundervore , June 23, 2014 11:52 AM
    Quote:
    And so the moral of the story is don't buy a 4k TV / Monitor without display port.



    If using a computer monitor then yes, Display Port wins here at 4K but if connecting to a 4K TV then using Display Port is not an option as i have yet to see a TV with Display port.

    edit:
    Looks like they are making TVs with DisplayPort after all. Didn't think this was going to happen any time soon.
    http://www.panasonic.com/au/consumer/tvs-projectors/led-lcd-tvs/th-l65wt600a.html
  • 0 Hide
    sha7bot , June 23, 2014 1:37 PM
    4:20 compression, eh? No wonder there's a loss in visual quality.
  • 0 Hide
    hannibal , June 23, 2014 1:56 PM
    This is good for video films, not so good for text-based material. It is a useable alternative for some material and does not require new hardware. But yeah, DisplayPort is much better!
  • 0 Hide
    CaptainTom , June 23, 2014 4:27 PM
    Idk I would try to find a different compression method. Color is very important, and I personally don't think 4K is enough of an upgrade over my color-accurate IPS 1080p monitor. It actually felt like a decent upgrade over my old LED TV due to the vibrant colors and black levels alone...
  • 0 Hide
    Blazer1985 , June 23, 2014 6:15 PM
    It is not about color precision, it is about color compression as someone noted. Since human eye is capable of noticing luma changes between 4 pixels but not so much for the color information. Proof is that you'll get 4:4:4 10bit only (almost) on cinema screens while the tv - documentaries are filmed in 4:2:2 8bit at most and anyway you see them (even movies) after a 4:2:0 compression. I could post some links about it but they would be so boring :-D
  • 2 Hide
    TechyInAZ , June 23, 2014 8:30 PM
    If that's the case, then is there a way to push 120hz at 1080P? I would much rather have a higher refresh rate than a higher res.
  • 3 Hide
    hannibal , June 23, 2014 10:59 PM
    One extreme example of compression...
    http://i.imgur.com/RY3YrFn.png
  • 0 Hide
    wuzelwazel , June 23, 2014 11:40 PM
    I think most of this has been covered already but it's important enough to mention again.

    The chroma in a video signal is far less important than the luma. Human vision is much much more sensitive to changes in brightness than changes in color. In addition there is no loss of color depth; only a loss in the resolution of the least important part of the signal. Also, the effective resolution of the chroma at 4:2:0 sampling on a 4K display is 1920x1080 which is by no means low resolution.

    Of course 4:4:4 would be the best option but I'd call 4:2:0 a no-brainer to allow double the refresh rate for some users.
  • 0 Hide
    Draven35 , June 24, 2014 1:05 AM
    if you have black text on a white background, or vice versa, it will look fine in 4:2:0 since there is a luma sample for every pixel. It is when either the foreground or background are not black and white that reading text becomes a problem.
  • 0 Hide
    SteelCity1981 , June 24, 2014 2:56 AM
    it's all just a marketing ploy by NVidia to say we were able to push 4k at 60Hz out first on our gpu even though it's really a half ass way of doing it on 1.4. until hdmi 2.0 comes out on gpus then i'll pass.
  • -1 Hide
    nottorp , June 24, 2014 3:55 AM
    I think they can do better! Let's go back to the 16 EGA colors! Then we'll be able to have 16K displays over HDMI 1.0!