
How Has Nvidia Managed to Push 60Hz 4K Over HDMI 1.4?

Tags:
  • Nvidia
  • Graphics Cards
  • Components
  • Graphics
June 23, 2014 7:31:32 AM


So ...
nVidia is compressing the 4K packets by 75%? Wouldn't that be a 1K signal??

:lol: 

Kinda defeats the purpose of 4K, doesn't it?



Score
-10
June 23, 2014 8:25:36 AM

@WiseCracker, no, not 75%. Keep in mind compression loss and resolution distortion are two completely different things. In this case, the chroma samples are down from 4:4:4 to about half the data, so you're essentially looking at a very poor 1440p. And not even 2K at that; you'd expect something in the range between 1K and 2K, and no one respects downsampling at any rate.
Score
0
June 23, 2014 8:36:56 AM

It's more like 1K color with 4K shadows, so it's a bit better than 1K total.
Score
0
June 23, 2014 9:36:03 AM

And so the moral of the story is: don't buy a 4K TV/monitor without DisplayPort.
Score
20
June 23, 2014 10:11:26 AM


:lol:  36 pixels 'compressed' to 9 pixels = 4:1 = 75% reduction

Feel free to rationalize that any way you wish in your defense of pseudo "Voodoo 4k"


Score
-8
June 23, 2014 11:09:43 AM

Quote:

:lol:  36 pixels 'compressed' to 9 pixels = 4:1 = 75% reduction

Feel free to rationalize that any way you wish in your defense of pseudo "Voodoo 4k"


If you had read his post, you would have seen that he was simply correcting your misinformation, not trying to defend anything.
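
For anyone keeping score, here's the back-of-the-envelope sample counting (just the raw arithmetic, nothing vendor-specific), sketched in Python:

# Samples in one 2x2 block of pixels (bit depth doesn't change the ratio)
block_pixels = 4
samples_444 = block_pixels * 3      # a Y, Cb and Cr sample for every pixel = 12
samples_420 = block_pixels + 2      # 4 Y samples + 1 Cb + 1 Cr for the block = 6

print(1 - samples_420 / samples_444)  # 0.5 -> a 50% data reduction, not 75%
# All 4 luma samples survive, so the spatial detail still sits on the full 4K
# grid; only the color information drops to quarter resolution.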
Score
7
June 23, 2014 11:16:49 AM

I object to the phrase "a lot of video files are even encoded with the 4:2:0 preset in order to reduce the file size". In fact, chroma subsampling is a STANDARD, and MOST video files, including the ones on your commercially produced Blu-ray movies, were encoded using it.

This is actually a really useful corner case for things like HTPCs, or if you have a 50ft HDMI run to your TV or projector, because there really is no loss of fidelity. But for desktop use it's just a gimmick.
Score
2
June 23, 2014 11:23:39 AM

Well, at least on the surface Nvidia has a superior marketing claim here; no doubt that's all they care about anyway. Just let DisplayPort and HDMI 2.0 take over and do it right; no sense in milking an old standard that can't.
Score
1
June 23, 2014 11:36:15 AM

Anyone hooking up a 4K TV or monitor really should do their research and make sure it has a DisplayPort input (why some 4K TVs or monitors are even manufactured without a DisplayPort input is beyond me). That said, it's nice that there is at least a dirty hack like this available for those that need to connect to a 4K TV/monitor via HDMI. It's far from ideal, but better than nothing, I guess.
Score
9
June 23, 2014 11:52:01 AM

Quote:
And so the moral of the story is: don't buy a 4K TV/monitor without DisplayPort.



If you're using a computer monitor then yes, DisplayPort wins here at 4K, but if you're connecting to a 4K TV then DisplayPort is not an option, as I have yet to see a TV with DisplayPort.

edit:
Looks like they are making TVs with DisplayPort after all. Didn't think this was going to happen any time soon.
http://www.panasonic.com/au/consumer/tvs-projectors/led...
Score
-3
June 23, 2014 1:37:01 PM

4:20 compression, eh? No wonder there's a loss in visual quality.
Score
0
June 23, 2014 1:56:11 PM

This is good for video, not so good for text-based material. It's a usable alternative for some material and does not require new hardware. But yeah, DisplayPort is much better!
Score
0
June 23, 2014 4:27:18 PM

Idk, I would try to find a different compression method. Color is very important, and I personally don't think 4K is enough of an upgrade over my color-accurate IPS 1080p monitor. That monitor actually felt like a decent upgrade over my old LED TV due to the vibrant colors and black levels alone...
Score
0
June 23, 2014 6:15:21 PM

It is not about color precision, it is about color compression, as someone noted. The human eye is capable of noticing luma changes between 4 pixels, but not so much the color information. Proof is that you'll get 4:4:4 10-bit (almost) only on cinema screens, while TV documentaries are filmed in 4:2:2 8-bit at most, and anyway you see them (even movies) after 4:2:0 compression. I could post some links about it, but they would be so boring :-D
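
To put rough numbers on those formats (a quick bits-per-pixel sketch, raw samples only, before any codec gets involved):

# Average bits per pixel = bit depth * (1 luma sample + average chroma samples)
def bpp(bit_depth, chroma_per_pixel):
    return bit_depth * (1 + chroma_per_pixel)

print(bpp(10, 2.0))  # 4:4:4 10-bit (cinema masters)     -> 30 bits/pixel
print(bpp(8, 1.0))   # 4:2:2 8-bit  (broadcast/TV work)  -> 16 bits/pixel
print(bpp(8, 0.5))   # 4:2:0 8-bit  (Blu-ray/delivery)   -> 12 bits/pixel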
Score
0
June 23, 2014 8:30:22 PM

If that's the case, then is there a way to push 120Hz at 1080p? I would much rather have a higher refresh rate than a higher res.
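
Rough pixel-clock math (assuming typical blanking and the ~340 MHz TMDS ceiling usually quoted for HDMI 1.4) suggests 1080p at 120Hz already fits without any chroma tricks; whether a given display and driver expose the mode is another question:

HDMI_14_MAX_MHZ = 340  # commonly quoted TMDS clock limit for HDMI 1.4

def pixel_clock_mhz(width, height, hz, blanking=1.2):
    # blanking is a rough fudge factor for horizontal/vertical blanking intervals
    return width * height * hz * blanking / 1e6

print(pixel_clock_mhz(1920, 1080, 120))  # ~299 MHz -> under the 340 MHz limit
print(pixel_clock_mhz(3840, 2160, 60))   # ~597 MHz -> far over it at full 4:4:4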
Score
2
June 23, 2014 11:40:08 PM

I think most of this has been covered already but it's important enough to mention again.

The chroma in a video signal is far less important than the luma. Human vision is much, much more sensitive to changes in brightness than to changes in color. In addition, there is no loss of color depth, only a loss in the resolution of the least important part of the signal. Also, the effective resolution of the chroma at 4:2:0 sampling on a 4K display is 1920x1080, which is by no means low resolution.

Of course 4:4:4 would be the best option but I'd call 4:2:0 a no-brainer to allow double the refresh rate for some users.
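
A quick data-rate sanity check of that trade-off (8-bit samples, active pixels only, blanking ignored, so real figures run a bit higher):

# HDMI 1.4 carries ~10.2 Gbit/s on the wire; roughly 8.16 Gbit/s of that is
# video payload after 8b/10b encoding overhead.
EFFECTIVE_GBPS = 8.16

pixels_per_second = 3840 * 2160 * 60

rate_444 = pixels_per_second * 24 / 1e9  # 8 bits each for Y, Cb, Cr -> ~11.9 Gbit/s
rate_420 = pixels_per_second * 12 / 1e9  # full-res Y, quarter-res chroma -> ~6.0 Gbit/s

print(rate_444 <= EFFECTIVE_GBPS)  # False: 4K60 4:4:4 can't fit
print(rate_420 <= EFFECTIVE_GBPS)  # True: 4K60 4:2:0 can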
Score
0
June 24, 2014 1:05:03 AM

If you have black text on a white background, or vice versa, it will look fine in 4:2:0 since there is a luma sample for every pixel. It is when the foreground or background is not black or white that reading text becomes a problem.
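
A toy example of why that works (hypothetical pixel values, just to show where the averaging lands):

# One 2x2 block: a black text pixel next to white background pixels.
# Values are (Y, Cb, Cr); black and white differ only in luma.
block = [(16, 128, 128), (235, 128, 128),
         (235, 128, 128), (235, 128, 128)]

luma = [p[0] for p in block]        # all four Y values survive 4:2:0 untouched
cb = sum(p[1] for p in block) / 4   # the block shares a single averaged Cb...
cr = sum(p[2] for p in block) / 4   # ...and a single averaged Cr

print(luma, cb, cr)  # [16, 235, 235, 235] 128.0 128.0 -> the edge is intact
# Swap in red text on a blue background and the averaged chroma smears across
# the edge, which is exactly when 4:2:0 text starts to look fringed.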
Score
0
June 24, 2014 2:56:20 AM

It's all just a marketing ploy by Nvidia so they can say "we were able to push 4K at 60Hz first on our GPUs", even though it's really a half-assed way of doing it on 1.4. Until HDMI 2.0 comes out on GPUs, I'll pass.
Score
0
June 24, 2014 3:55:56 AM

I think they can do better! Let's go back to the 16 EGA colors! Then we'll be able to have 16K displays over HDMI 1.0!
Score
-1
June 24, 2014 5:08:17 AM

Gaming at 4:2:0 would lower image quality to below console levels. 4:2:0 is good for video and video only. Also, 4:2:0 breaks subpixel text rendering like ClearType.
Score
0
June 24, 2014 9:56:57 AM

Quote:
I think they can do better! Let's go back to the 16 EGA colors! Then we'll be able to have 16K displays over HDMI 1.0!


They could call it SUVGA: Super Ultra VGA

Score
1
June 25, 2014 1:38:38 AM

Saving money for a nice 1080p gsync monitor when the timing is right.
Score
0
June 25, 2014 2:26:33 AM

Almost all videos use 4:2:0 (including DVD, Blu-ray, online video sites, video cameras, TV/cable, etc.), with the exception of DV videos (think old mini-DV tape cameras), which use 4:1:1. Usually only high-end professional videos use 4:4:4, but they are later exported in 4:2:0.
Score
0
June 25, 2014 5:18:21 AM

PAL DV uses a version of 4:2:0 in which the Cr and Cb samples are offset from the root chroma sample.

Many professional cameras use 4:2:2; only the high-end ones use 4:4:4. A lot of news footage is shot at 4:2:0 (unless it's SD, in which case a lot of it is 4:1:1).
Score
0
July 6, 2014 10:30:01 AM

fuzzion said:
Saving money for a nice 1080p gsync monitor when the timing is right.


Save up even further and get the Asus ROG SWIFT. 1440p, 144Hz, gsync, 8-bit color, the list goes on and on...
Score
0