4K graphics card for 50" 4K UHD TV as a high-resolution monitor

Kyle_Foster

Reputable
Dec 15, 2015
3
0
4,510
I just upgraded from a 50" UHD TV to a 60", and I want to use the 50" 4K UHD TV as my main monitor on my development/programming desktop PC. I currently use three 22" monitors, but I want the TV to replace all three, so I need a card that can drive the highest resolution possible. Quality is more important than price, but I am not a gamer, and NVIDIA's site looks to be aimed mainly at gamers.
 

Kyle_Foster

Reputable
Dec 15, 2015
3
0
4,510
Of course the quality would be nice, but I can handle a little degradation since 4K hasn't fully taken off. What I want is the high resolution so I can see my development environment, SQL Management Studio, SQL Monitor, and my product's main app at the same time for ease of development. If the fps slows down but I can still read the fonts, then all is well. The built-in video adapter will not allow more than 2048 by xxxx; it needs to go 3 times that high.
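
A quick back-of-the-envelope pixel count shows why one UHD panel can replace the three monitors. This is just a minimal sketch; the 1920x1080 resolution for the 22" monitors is an assumption, since the thread only mentions the 2048-wide limit of the built-in adapter.

```python
# Rough pixel-count comparison: three assumed-1080p monitors vs. one 3840x2160 UHD TV.
three_monitors = 3 * 1920 * 1080   # ~6.2 million pixels spread across three screens
uhd_tv = 3840 * 2160               # ~8.3 million pixels on a single 4K UHD panel

print(f"Three 1080p monitors: {three_monitors:,} px")
print(f"One 3840x2160 UHD TV: {uhd_tv:,} px")
print(f"UHD TV vs. three monitors: {uhd_tv / three_monitors:.2f}x the pixels")
```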
 

Dougiefresh181

Honorable
Dec 22, 2013
13
0
10,510
Rebuild with a Haswell i3 mini-ITX. For $250, it'll do 4K out at 30 fps with just integrated graphics. I'm currently using my i5 laptop to stream Plex at 4K and it's fine. Just don't try gaming at all.
 

jdmkira

Reputable
Apr 12, 2015
31
0
4,530


Even if it's not for gaming, 30 Hz is horrible; it will look like a 90s monitor in terms of refresh rate. BTW, you don't need a separate graphics card for that: any 3rd-gen Intel processor can handle 4K @ 30 Hz.
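
The sluggish feel at 30 Hz comes down to the time between screen updates; a tiny sketch of the arithmetic:

```python
# Frame interval at common desktop refresh rates. At 30 Hz the screen updates
# only every ~33 ms, twice the interval of 60 Hz, which is why cursor movement
# and scrolling feel choppy even outside of games.
for hz in (30, 60, 75, 85):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")
```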
 

jdmkira

Reputable
Apr 12, 2015
31
0
4,530
No. Most commercially available monitors were locked at 30 Hz (some had weird refresh rates like 54.something Hz), and many had TONS of flickering. The general rule was: the higher the refresh rate, the higher the chance of flicker and frame-skipping issues.

Anyway, he can get any compatible card that has HDMI, although I still wouldn't recommend 30 Hz even for a non-gaming PC. I played a little with my 4K monitor on DisplayPort 1.1 before switching to 1.2, and it just doesn't feel right.
 

Gillerer

Distinguished
Sep 23, 2013
361
81
18,940
CRT monitors generally used 75 or 85 Hz modes at the usual resolutions of 1024x768 and 1280x1024. 85 Hz was really good; 75 Hz had some flickering.

Only NVIDIA currently has GPUs with HDMI 2.0, which you'll want when you attach a UHD TV: HDMI 1.4 tops out at 4K @ 30 Hz, while HDMI 2.0 can carry 4K @ 60 Hz. That's less important when viewing movies etc., but crucial if you want to use the TV as a monitor. Of course, if your TV doesn't support HDMI 2.0, it doesn't matter.

NVIDIA's site doesn't list the HDMI version on the GTX 750 Ti's page, but it does list "2.0" on the GTX 950's, so I guess the GTX 950 is the cheapest way to get HDMI 2.0.
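
The 30 Hz vs. 60 Hz limit falls out of the link bandwidth. Here is a rough sketch of the numbers; the 4400x2250 total timing (active pixels plus blanking) is the standard CTA-861 4K timing, and the usable rates account for 8b/10b encoding overhead on HDMI 1.4/2.0.

```python
# Rough link-bandwidth check for 3840x2160 over HDMI at 24 bits per pixel.

def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    """Video data rate in Gbit/s for a given total timing and refresh rate."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

hdmi_1_4_usable = 8.16   # Gbit/s usable (10.2 Gbit/s raw minus 8b/10b overhead)
hdmi_2_0_usable = 14.4   # Gbit/s usable (18 Gbit/s raw minus 8b/10b overhead)

for hz in (30, 60):
    need = required_gbps(4400, 2250, hz)
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbit/s -> "
          f"HDMI 1.4 {'OK' if need <= hdmi_1_4_usable else 'not enough'}, "
          f"HDMI 2.0 {'OK' if need <= hdmi_2_0_usable else 'not enough'}")
```

4K @ 30 Hz comes in around 7 Gbit/s and fits HDMI 1.4, while 4K @ 60 Hz needs roughly 14 Gbit/s and only fits HDMI 2.0, which is why an HDMI 2.0 card matters for desktop use on a UHD TV.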
 

jdmkira

Reputable
Apr 12, 2015
31
0
4,530
Yeah, but those were high-end monitors, and maybe only the most recent ones didn't have any issues. The IBM 8514 ones were pretty bad. I was poor, so I had a VGA CRT @ 30 Hz.