HDMI 2.0 or 1.4 cable for an Nvidia GPU?

Tensai30
I'm using a GTX 1070 and a 43-inch Sharp UHDTV with HDMI 2.0 inputs. I purchased an HDMI cable specified as 1.4. I'm able to reach 3840x2160 at 60 Hz without any issues, except I can't enable full color and RGB like I can at 30 Hz. I've read lots of conflicting information online: some say 1.4 and 2.0 cables are exactly the same and buying 2.0 would be a waste of money, while others say 2.0-specified cables can reach higher bandwidths of 18 Gbps where 1.4 can only reach 10 Gbps. I'm not sure which one to believe. Would upgrading to an HDMI 2.0 cable allow me to reach full color, or is it a limitation of my display device?
 
There's no such thing as an HDMI 2.0 cable. Cables are rated Standard or High Speed: High Speed is certified to 10.2 Gbps, and Premium Certified High Speed cables are tested to the full 18 Gbps. Some high-speed cables were re-labelled "2.0" to pander to consumers' obsession with 4K. All you need is a relatively good, short High Speed cable (which you probably had with your old 1080p monitor anyway), and any other issues will be related to the GPU/monitor.

If you are already getting 4K at 60 Hz, another cable will not necessarily change anything. It may be that your monitor doesn't support full color at 4K 60 Hz, or that your GPU doesn't output it.
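If you want to sanity-check the bandwidth numbers being thrown around, here is a rough back-of-the-envelope calculation. It's just a sketch: it assumes the standard CTA-861 4K60 timing of 4400x2250 total pixels including blanking (a 594 MHz pixel clock) and TMDS 8b/10b encoding overhead, and it ignores audio and auxiliary data.

```python
# Rough TMDS bandwidth estimate for 4K60 full-color RGB (a sketch, not a spec tool).
# Assumes the standard CTA-861 4K60 timing: 4400 x 2250 total pixels including
# blanking, i.e. a 594 MHz pixel clock, and TMDS 8b/10b encoding overhead.

pixel_clock = 4400 * 2250 * 60        # 594,000,000 pixels per second
bits_per_pixel = 3 * 8                # RGB / 4:4:4 at 8 bits per channel
data_rate = pixel_clock * bits_per_pixel
tmds_rate = data_rate * 10 / 8        # 8b/10b adds 25% on the wire

print(f"4K60 8-bit RGB needs ~{tmds_rate / 1e9:.2f} Gbps")   # ~17.82 Gbps
```

That ~17.8 Gbps is above the 10.2 Gbps a High Speed cable is certified for but just under the 18 Gbps link limit, which is exactly why full-color 4K60 is the case where the cable starts to matter.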
 

Tensai30
Seems like I might have answered my own question once again; this has been happening to me a lot on here recently. Let me try to explain it for anyone else who might be wondering. There is no such thing as an HDMI 2.0 cable, as 2.0 only describes the signalling, not the cable, and a lot of people making a big fuss about it are confusing everyone. The difference is that a lot of these "HDMI 2.0" cables are Premium Certified, which means they can handle 18 Gbps and support HDR along with other features, which separates them from 1.4-era cables. Also, the reason I could hit 60 Hz at 4K is a trick Nvidia uses to enable 4K 60 Hz over HDMI 1.4 by reducing the color quality.
Of course, anyone is welcome to correct me if I am wrong.
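To see why that trick fits through a 1.4-era link, here is the same back-of-the-envelope math (again a sketch, assuming the 594 MHz CTA 4K60 timing): 4:2:0 keeps full-resolution luma but shares one pair of chroma samples across each 2x2 pixel block, so at 8-bit depth it averages 12 bits per pixel instead of 24.

```python
# Why the 4:2:0 trick squeezes 4K60 into an HDMI 1.4 link (rough sketch).
# 4:2:0 carries one Y sample per pixel but only one Cb and one Cr sample
# per 2x2 block, so 8-bit depth averages 8 + (8 * 2) / 4 = 12 bits/pixel.

pixel_clock = 4400 * 2250 * 60                  # 594 MHz CTA 4K60 timing

for name, bits_per_pixel in [("4:4:4", 24), ("4:2:0", 12)]:
    tmds_gbps = pixel_clock * bits_per_pixel * 1.25 / 1e9   # +25% for 8b/10b
    verdict = "fits" if tmds_gbps <= 10.2 else "exceeds"
    print(f"{name}: ~{tmds_gbps:.2f} Gbps -> {verdict} the 10.2 Gbps link")
```

4:4:4 needs ~17.82 Gbps and doesn't fit; 4:2:0 needs only ~8.91 Gbps, which is how the driver gets 60 Hz out of a 10.2 Gbps link at the cost of color resolution.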
 

Tensai30


The reason I am getting 60 Hz is a trick that Nvidia does:
http://www.tomshardware.com/news/nvidia-kepler-4k-hdmi-1.4,27117.html
 

Tensai30


Thanks. I'm going to buy an 18 Gbps cable today; I suspect my previous cable can only do 10.2.
 

atomicWAR


Not true. I am running that exact setup right now over HDMI 2.0b (i.e., an 18 Gbps cable).
 

Tensai30


Even on Pascal? I'm sorry, but I'm finding that hard to believe. Any source to back that up?
 


Yes, because of the workaround. It isn't true 4:4:4; it is upscaled.

 

atomicWAR


No, the workaround is for HDMI 1.4 at 4K 60 Hz (i.e., it drops from 4:4:4 to 4:2:0 to make 60 Hz possible). You are very wrong on this, I am sorry. HDMI 2.0 output has been supported since the GTX 900 series cards.
 

Tensai30

Once again, source please.

 

Tensai30

This is correct. The workaround was designed for Kepler cards.

 
No, the 1.4 workaround is for getting 60 Hz over a connection officially rated for 4K at only 24/30 Hz; it's not about 4:4:4. It is a controversial subject and Nvidia will deny it, but I have been reading about it, and somebody even got a rep to admit it to them on the Nvidia dev forum. I will try to find some sources for you.
 

atomicWAR
So I did some more digging; actually, I have been ever since the 4:4:4 4K 60 Hz debate started. I did mis-state when HDMI 2.0 drops to 4:2:0 for 4K 60 Hz: at 8-bit (non-HDR, save a few faux-HDR panels that fudge it above 8-bit), 4:4:4 is supported at 4K 60 Hz. Previously I said it was 12-bit, when in fact it is at 10-bit that HDMI drops 4K 60 Hz down to 4:2:0, unless you go to 30 Hz, at which point 4:4:4 is supported at 4K. So we were both half right. I hope this clarifies the issue.

Note: my UHDTV is one of those 8-bit fudge-it "HDR" sets. I have seen it side by side with a true 10-bit panel, and the difference is barely visible, unlike against a non-HDR set, where the difference is very noticeable. Regardless, the point of Tom's is to give accurate advice, so I wanted to post this update for the OP and anyone reading this thread. Sorry for the confusion.

http://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx
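To put numbers on that correction, here is a quick sweep of bit depth versus chroma format against HDMI 2.0's 18 Gbps link, which leaves about 14.4 Gbps for actual video data after 8b/10b encoding. This is a sketch using the standard 594 MHz 4K60 timing; it ignores audio/aux overhead and the exact way HDMI packs each format on the wire.

```python
# Which 4K60 formats fit HDMI 2.0? A sketch: 18 Gbps on the wire leaves
# about 14.4 Gbps of video payload after 8b/10b encoding.

pixel_clock = 4400 * 2250 * 60                 # 594 MHz CTA 4K60 timing
payload_limit = 18e9 * 8 / 10                  # 14.4 Gbps usable

chroma = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}   # samples per pixel
for bpc in (8, 10, 12):
    for name, samples in chroma.items():
        rate = pixel_clock * bpc * samples
        verdict = "OK" if rate <= payload_limit else "too much"
        print(f"{bpc}-bit {name}: {rate / 1e9:.2f} Gbps -> {verdict}")
```

8-bit 4:4:4 squeaks in at ~14.26 Gbps, while 10-bit and 12-bit 4:4:4 come out over the limit, which matches the behavior described above: go above 8-bit at 4K 60 Hz and the link has to drop to 4:2:2 or 4:2:0.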



 

atomicWAR
Yeah, you need an 18 Gbps cable for the features you want. Sorry again for the mistake about 10-bit and 12-bit, but as stated, you can do 4:4:4 with 8-bit at 4K 60 Hz. Regardless, we and everyone else are stuck with it as the standard. I hope we were of some help in this thread. Enjoy your 4K PC gaming!