How Has Nvidia Managed to Push 60Hz 4K Over HDMI 1.4?

Normally, running a 4K signal over HDMI at 60 Hz requires an HDMI 2.0 connection. "Normally" isn't quite the right word, though, as you cannot yet buy any graphics card with an HDMI 2.0 output. However, it seems that the GeForce 340.43 beta driver release has made it possible to push 4K at 60 Hz out of Kepler-based cards, despite their lack of an HDMI 2.0 interface (the Kepler cards are HDMI 1.4).

So, what’s the trick? The answer is chroma subsampling. Rather than using the standard YCbCr 4:4:4 or RGB sampling, Nvidia sets the system to use 4:2:0 sampling. The image retains all of its brightness information, but the colour information is reduced by merging the values of each group of four pixels into one. A quarter of the colour information means a much smaller signal. As a result, Nvidia has managed to squeeze an entire 4K 60 Hz signal into the roughly 8.2 Gb/s of video bandwidth available to HDMI 1.4 interfaces.
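The arithmetic works out neatly. The sketch below is a back-of-the-envelope estimate counting active pixels only (real links also carry blanking intervals, so actual figures run somewhat higher), but it shows why 4:4:4 overflows the HDMI 1.4 budget while 4:2:0 fits:

```python
# Back-of-the-envelope video data rates for 4K at 60 Hz, 8 bits per sample.
# Active pixels only; blanking and protocol overhead are ignored.
WIDTH, HEIGHT, FPS = 3840, 2160, 60

def gbps(bits_per_pixel):
    """Raw video data rate in Gb/s for a given average bits-per-pixel."""
    return WIDTH * HEIGHT * FPS * bits_per_pixel / 1e9

# 4:4:4 (or RGB): Y, Cb and Cr for every pixel -> 24 bits per pixel.
full = gbps(24)
# 4:2:0: per-pixel Y (8 bits) + one Cb and one Cr shared by four pixels
# (8 + 8 bits spread over 4 pixels) -> 12 bits per pixel on average.
subsampled = gbps(12)

print(f"4:4:4 : {full:.1f} Gb/s")        # ~11.9 Gb/s -- over the 8.2 Gb/s limit
print(f"4:2:0 : {subsampled:.1f} Gb/s")  # ~6.0 Gb/s  -- fits comfortably
```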


Of course, this does come at a cost. Take a look at the image above. On the left, we've got a 4:4:4 sample with all of the colour and brightness information. On the right, we have the 4:2:0 sample with only a quarter of the colour information. Because all of the brightness information is still present, each pixel in the resulting image can still differ from its neighbours. However, because the colour resolution is cut to a quarter, there will be visible distortions. This isn't very noticeable with moving pictures, but too much information is lost when it comes to something like text: it won't render sharply, so using this mode for a normal desktop environment will be unpleasant, with heavily distorted colour along fine edges.
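The merging described above can be sketched in a few lines. This is an illustrative toy rather than Nvidia's actual pipeline: it averages the chroma of one 2x2 block, which is one common way 4:2:0 encoders derive the shared value. The sample values are made up.

```python
# Toy 4:2:0 subsampling of a single 2x2 pixel block: every pixel keeps
# its own luma (Y), but the four (Cb, Cr) pairs collapse into one average.
def subsample_420(block):
    """block: 2x2 list of (Y, Cb, Cr) tuples -> same shape, shared chroma."""
    pixels = [p for row in block for p in row]
    avg_cb = sum(p[1] for p in pixels) / 4
    avg_cr = sum(p[2] for p in pixels) / 4
    return [[(p[0], avg_cb, avg_cr) for p in row] for row in block]

block = [[(235, 128, 200), (180,  90,  60)],
         [( 60, 200, 110), ( 16, 110, 140)]]
out = subsample_420(block)

# Brightness survives per pixel; colour is now identical across the block.
print(out[0][0])  # (235, 132.0, 127.5)
```

Note that the four Y values (235, 180, 60, 16) are untouched, which is why edges defined mainly by brightness still look sharp.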

YCbCr is a way of transmitting images where the luminance (Y) and the blue-difference and red-difference chroma components (Cb, Cr) are sent separately, as you can see in the image below. Because the human eye is far more sensitive to brightness than to colour, this separation lets the colour channels be thinned out without you noticing much; with 4:4:4 sampling there is no difference at all. When a 4:2:0 sample is used, however, significantly less colour information is transmitted. Using the two images provided (and maybe some imagination), you can see why that's not a big deal for video (many video files are encoded as 4:2:0 to reduce file size, precisely because you'll hardly notice the difference), but it is for text.
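For the curious, converting an RGB pixel to YCbCr is just a weighted sum per component. The sketch below uses the full-range BT.601 coefficients as an illustration; HDMI video typically uses BT.709 coefficients and limited range, but only the constants differ.

```python
# RGB (0-255) to full-range YCbCr using the common BT.601 coefficients.
# This is an illustrative choice; HDMI commonly uses BT.709/limited range.
def rgb_to_ycbcr(r, g, b):
    y  =           0.299    * r + 0.587    * g + 0.114 * b  # luminance
    cb = 128 -     0.168736 * r - 0.331264 * g + 0.5   * b  # blue-difference
    cr = 128 +     0.5      * r - 0.418688 * g - 0.081312 * b  # red-difference
    return y, cb, cr

# Pure white carries no colour: the chroma channels sit at their midpoint.
print(tuple(round(v) for v in rgb_to_ycbcr(255, 255, 255)))  # (255, 128, 128)
```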

Despite the loss in quality, this is a fairly effective solution for those who need it. Run the desktop at 30 Hz and you keep the screen real estate and colour accuracy; run your games and videos at 60 Hz and you keep the smoothness while gaining 4K sharpness. It isn't perfect, but it's a step in the right direction.

Alternatively, you can use DisplayPort, as DisplayPort 1.2 supports a 3840x2160 resolution at 60 Hz with full colour information. Nvidia's workaround should really only be considered if the display you're using lacks DisplayPort support, as is the case for some TVs.
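A quick sanity check shows why DisplayPort is the cleaner option. The capacities below are commonly quoted effective data rates and are assumptions for illustration; real links also carry blanking and protocol overhead.

```python
# Can each link carry uncompressed 4K60 4:4:4 (~24 bits per pixel)?
# Capacities are commonly quoted effective rates, used here as assumptions.
NEEDED_444 = 3840 * 2160 * 60 * 24 / 1e9  # ~11.9 Gb/s

links = {"HDMI 1.4": 8.2, "DisplayPort 1.2": 17.28}
for name, capacity in links.items():
    verdict = "fits" if capacity >= NEEDED_444 else "needs subsampling"
    print(f"{name:16s}: {verdict}")
# HDMI 1.4        : needs subsampling
# DisplayPort 1.2 : fits
```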

Follow Niels Broekhuijsen @NBroekhuijsen. Follow us @tomshardware, on Facebook and on Google+.
