
hooking up 1080p tv to video card

March 26, 2008 12:31:05 PM

My friend has a Zotac AMP! 8800GTS and he was wondering if his video card supports 1080p. His package did not come with a DVI->HDMI adaptor like my 3870x2 did; his came with a composite cable instead. I am assuming his only option is to connect his HDTV with the composite cable? Is 1080p even possible over composite? I cannot actually find any specs saying the video card supports 1080p TV-out. Can anyone confirm this?

My 3870x2 came with a DVI->HDMI adaptor, so I just plug that into a 1080p TV and I'm good to go? I am 100% sure that the video card supports 1080p.

I think they should call 1080p SuperHighDefinitionTV or something. I find this very confusing.
March 26, 2008 1:05:19 PM

A DVI->HDMI converter should work. They are electrically identical signals on a different connector. I'm using a DVI->HDMI adapter on my computer (Athlon XP 2400+, 1GB, FX5200) to display 1920x1080 on my Panasonic TH-42PZ700U (42" 1080p plasma).

1080p is considered "Full HD", BTW. I know the naming convention sucks, but it's already set in stone pretty much, so we just need to deal with how stupid the naming is.
March 26, 2008 1:54:14 PM

His probably came with "component" cables, which are red, green, and blue (composite is yellow, with white/red for audio), and that can run 1080i and possibly 1080p depending on the card and TV capabilities. If he wants to use DVI or HDMI because of HDCP, then do what ^^^^^^ (the person above) mentioned: get a $5 DVI->HDMI converter or a DVI->HDMI cable.

Something like this:
http://www.newegg.com/Product/Product.aspx?Item=N82E168...


OR

http://www.newegg.com/Product/Product.aspx?Item=N82E168...


For the adapter: Make sure the DVI=male and HDMI=female
For the cable: Make sure the DVI and HDMI connectors are male

Good luck
March 26, 2008 2:28:31 PM

Component is analog, and therefore does not look nearly as clear as an HDMI input (from personal experience).
March 26, 2008 2:36:39 PM

DVI to HDMI is a pure digital interface; the only difference is that HDMI to HDMI also includes digital audio.
DVI = Digital Visual Interface.

DVI to HDMI is how I connect my video card to my HDTV and it looks fantastic.
March 26, 2008 2:48:48 PM

The TV might have a VGA or DVI input, which would be a great way to hook the PC up to the TV. This is the way I do it, and I really like using the PC to watch video files because it upscales them very well, much better than my roommate's Xbox 360 or a regular DVD player.
March 26, 2008 3:37:04 PM

Only the ATI HD 3800 series has a true "HDMI" converter (integrated HD audio controller with multi-channel (5.1) AC3 support and 1080p display). Nvidia can probably only do the 1080p display without the 5.1 audio.

However, if you use a VGA to DVI converter or vice versa, you will get a resolution around 1360x768, not 1920x1080. I haven't tried composite cables yet, but it worked for my friend's plasma TV at 1080p.

P.S. I have an ATI HD 3850 video card with HDMI hooked up to a 52-inch Sharp Aquos widescreen LCD TV.
March 26, 2008 3:41:56 PM

stoner133 said:
DVI to HDMI is a pure digital interface, the only difference is that HDMI to HDMI also includes digital audio.


DVI->HDMI can include audio too, as demoed by the ATi HD series, just not on most GF8/9 series cards.

March 26, 2008 3:47:23 PM

aylafan said:

However, if you use a VGA to DVI converter or vice versa, you will have a resolution around 1436x736, but not 1080p. I haven't tried composite cables yet, but it worked for my friend's plasma TV at 1080p.


Neither DVI nor VGA is limited to that resolution (I don't even know where that weird resolution comes from; most are either 1280x720 or 1366x768, and 1436x736 is off-ratio: neither 16:10 nor 16:9, and not even 1.85:1).

It's probably a default setting of his monitor or card if anything.

HDCP requirement wouldn't down-convert to that either.
March 26, 2008 3:56:19 PM

TheGreatGrapeApe said:
Neither DVI nor VGA is limited to that resolution (I don't even know where that weird resolution comes from; most are either 1280x720 or 1366x768, and 1436x736 is off-ratio: neither 16:10 nor 16:9, and not even 1.85:1).

It's probably a default setting of his monitor or card if anything.

HDCP requirement wouldn't down-convert to that either.


Okay, to make you happy: I tested the DVI to VGA converter and I got 1360x768, and that is the highest resolution I can achieve. I was just too lazy to list the exact resolution and test out the converter before. So I do believe you are limited by DVI and VGA.

I have the hardware to test this out. Do you?
March 26, 2008 4:09:13 PM

Sounds like something to do with, or wrong with, the interface/converter, because I can output from both to 1080p on various displays; either that or an artificial driver limitation.
It's not the output itself. It may, however, be the drivers: as detailed by Cleeve in his review, the drivers on the HD2400P defaulted to a 1440x? resolution, but it's not a limitation of the interfaces;
http://www.tomshardware.com/2007/10/26/avivo_hd_vs_pure...
March 26, 2008 4:14:30 PM

TheGreatGrapeApe said:
Sounds like something to do with, or wrong with, the interface/converter, because I can output from both to 1080p on various displays; either that or an artificial driver limitation.
It's not the output itself. It may, however, be the drivers: as detailed by Cleeve in his review, the drivers on the HD2400P defaulted to a 1440x? resolution, but it's not a limitation of the interfaces;
http://www.tomshardware.com/2007/10/26/avivo_hd_vs_pure...


I don't think it's the drivers. I'm using the latest ATI Catalyst 8.3 drivers and I have an ATI HD 3850, not a low-end budget card like the ATI HD 2400.

It could be my TV's limitation for VGA input, but HDMI input can display a resolution of 1920x1080 for me. However, there is no DVI limitation, just VGA I believe.
March 26, 2008 4:15:11 PM

OK, you completely changed your post, so here's a second reply; the reply above was to your original post.

aylafan said:
Okay, to make you happy: I tested the DVI to VGA converter and I got 1360x768, and that is the highest resolution I can achieve. I was just too lazy to list the exact resolution and test out the converter before. So I do believe you are limited by DVI and VGA.

No, VGA and DVI do not limit this on their own, PERIOD.
You may be limited by your drivers, by your panel's features, or by whatever 'converter' you're using, but DVI and VGA can both output full-resolution 1080p, especially @ 30 frames per second (reduced blanking is required for full-colour DVI @ 60 frames), and with dual-link DVI both can also display at 60fps full colour too.

It's not the interface itself; it's your setup that's limiting the resolution.
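
If you want to sanity-check that with numbers, here's a rough Python sketch. The blanking totals are the usual CEA-861/CVT figures (approximate), not something measured off any card or TV in this thread; whether 1080p60 squeezes under the single-link ceiling depends on which blanking the driver picks, which is part of why different setups behave differently:

# Back-of-the-envelope pixel-clock check against the single-link DVI limit.
SINGLE_LINK_DVI_MHZ = 165  # single-link DVI TMDS clock ceiling

# (total horizontal pixels, total vertical lines, refresh Hz), blanking included
timings = {
    "1080p30, CEA-861 blanking (2200 x 1125 total)": (2200, 1125, 30),
    "1080p60, CEA-861 blanking (2200 x 1125 total)": (2200, 1125, 60),
    "1080p60, CVT standard blanking (2576 x 1120 total)": (2576, 1120, 60),
    "1080p60, CVT reduced blanking (2080 x 1111 total)": (2080, 1111, 60),
}

for name, (h_total, v_total, hz) in timings.items():
    mhz = h_total * v_total * hz / 1e6
    verdict = "fits" if mhz <= SINGLE_LINK_DVI_MHZ else "needs reduced blanking or dual-link"
    print(f"{name}: ~{mhz:.1f} MHz -> {verdict}")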
March 26, 2008 4:22:33 PM

aylafan said:

It could be my TV's limitations for DVI and VGA input, but HDMI input can display a resolution of 1920x1080 for me.


Yeah, and that could be the case too, as it's not the output that's the limiting factor, but the TV may treat all PC inputs the same based on how they achieve their '1080P' image.

If it's displaying at 1366x768 it's probably only a 720P panel like the Aquos 43U, otherwise they're messing with your image quality twice, once by limiting your resolution, and then again by interpolating from the down-converted resolution to the full 1080P.

Either way that would suck. Sounds like a non 1920x1080 monitor though if it's pushing you to 720P.
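
To put rough numbers on that double hit (just arithmetic, assuming a 1366x768 cap on a 1920x1080 source; nothing specific to this particular Aquos):

# What a 1366x768 PC-input cap costs a 1920x1080 source, roughly.
full_hd = 1920 * 1080   # pixels the PC actually renders
capped  = 1366 * 768    # pixels left after the TV limits the input

print(f"Pixels surviving the downscale: {capped / full_hd:.0%}")     # ~51%
print(f"Upscale back to the panel: {1920 / 1366:.3f}x horizontally")  # ~1.406, non-integer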
March 26, 2008 4:23:18 PM

TheGreatGrapeApe said:
OK, you completely changed your post, so here's a second reply; the reply above was to your original post.

aylafan said:
Okay, to make you happy: I tested the DVI to VGA converter and I got 1360x768, and that is the highest resolution I can achieve. I was just too lazy to list the exact resolution and test out the converter before. So I do believe you are limited by DVI and VGA.

No, VGA and DVI do not limit this on their own, PERIOD.
You may be limited by your drivers, by your panel's features, or by whatever 'converter' you're using, but DVI and VGA can both output full-resolution 1080p, especially @ 30 frames per second (reduced blanking is required for full-colour DVI @ 60 frames), and with dual-link DVI both can also display at 60fps full colour too.

It's not the interface itself; it's your setup that's limiting the resolution.


Okay, here are my specifications:
52-inch Sharp Aquos widescreen LCD TV
Sapphire ATI HD 3850 + latest ATI Catalyst 8.3 drivers
DVI to VGA converter + VGA to VGA cable --------> 1360x768 highest
HDMI cable ----------> 1920x1080 highest

I'm testing this right now, man... I meant to say there is a VGA limitation, not DVI.


March 26, 2008 4:26:06 PM

TheGreatGrapeApe said:
Yeah, and that could be the case too, as it's not the output that's the limiting factor, but the TV may treat all PC inputs the same based on how they achieve their '1080P' image.

If it's displaying at 1366x768 it's probably only a 720P panel, otherwise they're messing with your image quality twice, once by limiting your resolution, and then again by interpolating from the down-converted resolution to the full 1080P.

Either way that would suck. Sounds like a non 1920x1080 monitor though if it's pushing you to 720P.


Ummm... no. It only pushes me to 1360x768 when I use the DVI to VGA converter.

It works perfectly fine with an HDMI cable at 1920x1080.

I have one of the newest Sharp Aquos LCD TV models; you can search for it online: LC-52D64U http://www.sharpusa.com/products/ModelLanding/0,1058,1920,00.html
March 26, 2008 4:26:40 PM

I know my TV (Panasonic TH-42PZ700U, 1080p) is perfect if you use DVI at 1920x1080. The limitation of the screen is 1280x1024 if you use the 15-pin VGA connector. This is a limitation of the SCREEN, not any other hardware. That being said, I'm going to side with GGA, not only because of personal experience, but because I believe he knows his stuff.
Edit:
aylafan said:
I meant to say there is a VGA limitation, not DVI.

Only if you mean the VGA input of the screen, but not the VGA interface itself. Its limitation (the VGA specification's) is 2048x1536.
March 26, 2008 4:29:43 PM

Still no VGA limit though; VGA goes HIGHER than single-link DVI, almost as high as dual-link DVI. VGA running off the 400MHz RAMDACs found on pretty much all cards since the Radeon 8500 and GF6 series (some FX, not all) will output 2048x1536 @ 75 Hz, 32-bit colour, or 2560x1600 @ 60fps, 32-bit.

I have a feeling it's taking a higher input on HDMI because it's set to downconvert automatically (like my projector, which prefers 1080p rather than the native 720p) because it prefers more info when converting to 736p.
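
Rough numbers behind that claim, for anyone who wants to check them (the ~35% blanking overhead is a typical GTF/CVT figure, not an exact timing for any particular monitor):

# Approximate analog pixel clock needed vs. a 400 MHz RAMDAC.
RAMDAC_MHZ = 400
BLANKING_OVERHEAD = 1.35  # total pixels per frame vs. active pixels, typical GTF/CVT

for w, h, hz in [(2048, 1536, 75), (2560, 1600, 60), (1920, 1080, 60)]:
    needed_mhz = w * h * hz * BLANKING_OVERHEAD / 1e6
    print(f"{w}x{h} @ {hz} Hz: ~{needed_mhz:.0f} MHz needed, {RAMDAC_MHZ} MHz available")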
March 26, 2008 4:36:04 PM

TheGreatGrapeApe said:
Still no VGA limit though; VGA goes HIGHER than single-link DVI, almost as high as dual-link DVI. VGA running off the 400MHz RAMDACs found on pretty much all cards since the Radeon 8500 and GF6 series (some FX, not all) will output 2048x1536 @ 75 Hz, 32-bit colour, or 2560x1600 @ 60fps, 32-bit.

I have a feeling it's taking a higher input on HDMI because it's set to downconvert automatically (like my projector, which prefers 1080p rather than the native 720p) because it prefers more info when converting to 736p.


Okay, you guys are probably right. Maybe my hardware components just don't like each other, but I'm telling you guys this because I don't want the person who posted this topic to feel like he got cheated if he used a DVI to VGA output instead of HDMI output and ended up with a lower resolution.

But back to the main topic...

I think component cables should work fine at 1920x1080.
GeForce cards can probably display 1080p, but with no audio through HDMI.
ATI 3800 series can display 1080p and output audio through HDMI.

Like the guys mentioned before, it all depends on the setup.

Thanks for making this clear for me. I never did understand why the DVI to VGA output had a lower resolution than HDMI output for me.
March 26, 2008 4:48:32 PM

aylafan said:
Okay, you guys are probably right. Maybe my hardware components just don't like each other, but I'm telling you guys this because I don't want the person who posted this topic to feel like he got cheated if he used a DVI to VGA output instead of HDMI output and ended up with a lower resolution.

But I think component cables should work fine at 1920x1080.
GeForce cards can probably display 1080p, but with no audio through HDMI.
ATI 3800 series can display 1080p and output audio through HDMI.


I don't think component has the bandwidth for 1080p, only 1080i ?

I haven't gotten component to do 1080p from a PC or a PS3. Then again, it could be a limitation of whatever chip they use for component output rather than a cable bandwidth issue.

I am using VGA to run 1080p to my LCD TV because the video drops to 1360x768 if I use a DVI to HDMI adaptor. No idea if it's my TV or the video driver limiting it.
March 26, 2008 4:50:05 PM

Looking at their specs, it is a full 1080 panel (no mention of where/why/how for PC support though), so it's screwed up that they're limiting the VGA input to a non-native resolution. It makes no sense other than to avoid signal noise over longer cables, but for such a panel I would hope the manufacturer wouldn't dumb things down to the lowest common denominator just to help the dumb users.

Oh well, weird property of the panel, but it's not the VGA, it's the panel that's limiting it in that case. I'm pushing 1920x1440 @ 75Hz right now on my P260 at work, over both DVI-A and VGA/DB-15.
March 26, 2008 4:50:11 PM

I use a DVI to HDMI converter that came with my Radeon 3870 on my 8800GTX and it works perfectly for 1080p.
March 26, 2008 4:54:05 PM

I guess it all depends on the TV screen, the video card, and the type of input you use. Seems like everyone has different experiences with different setups: HDMI, VGA, DVI, and component cables.

I paid $2200 for this LCD TV and it won't do VGA at 1080p, but will do HDMI at 1080p just fine. Sighs.
March 26, 2008 5:06:09 PM

vic20 said:
I don't think component has the bandwidth for 1080p, only 1080i ?


That's correct depending on how it's used, as most component out is YPbPr, and the norm for those TVs is 1080i due to signal quality over longer lengths.
You could push 1080p over YPbPr component, but it will require a TV that supports it (similar to the issue aylafan has with VGA) and also good cable quality, because even at 6ft it is very susceptible to line noise.
The most commonly used way to push that kind of resolution/bandwidth over component is to use the RGB format, which has higher tolerances and doesn't rely on timing differences like YPbPr does; it essentially mimics BNC connections, just using RCA jacks. There really isn't a limit to these setups, and they can theoretically, under spec, carry more info than HDMI 1.3 deep colour (edit - which is a known spec; the cable is limited around that, not sure about the HDMI wire spec though), but you need the support from the panel (and the output of course).

That's the biggest problem with these things: the physical and spec limitations aren't always the limiting factor, it's the artificially imposed stuff, like TV support, driver support, app support, crappy DRM, etc., that usually limits this.

For HD though, the biggest limiting factor (other than the artificial ones) is single-link DVI, and even that usually isn't an issue since most of the TVs that accept DVI input allow for reduced blanking as well.

Anywhooo, the main thing for most people is to check the specs and support for your TV; you are less likely to be limited by your PC than your TV, although annoying things like the overlay changes (mainly for windowed/dual screens), the pixel scaling issue, and overscan are often a driver/PC limit, so if those come into play (more for off-resolution TVs) then you should look into that too.
March 26, 2008 5:13:18 PM

Um, I use a DVI to VGA adapter from a 6800GT to a 42" Samsung LCD and I have no problem displaying 1920x1080. I don't know why you would say that's not possible.
March 26, 2008 7:56:10 PM

Honestly, even if you can do 1080p through component it won't look that great. If you spent the money for a good mid/high end computer and a 1080p panel, it only makes sense to get the most out of it (HDMI, regardless of audio).
March 26, 2008 8:27:47 PM

Yeah, for the most part HDMI/DVI is the way to go, but as long as you're not scrimping on parts, even the component output should be very close to the same.

A lot of it really comes down to the weakest link, and if you send HDMI or DVI over more than 15ft you start getting into issues as well; personally, I wouldn't mind the occasional analogue noise versus the starfield noise of a digital connection. However, under 15ft even a poor quality HDMI cable will usually give you a clean image, something that can be a problem with anything over 10ft with mediocre analogue cables.
March 26, 2008 9:35:47 PM

Does the Zotac AMP! 8800GTS support 1080p through HDMI/DVI if I buy the adapter? I can't seem to find an answer to that on the internet.
March 26, 2008 9:58:38 PM

It should. And this is the problem, with the number of TV setups out there it may still give you trouble, but it's not the interface that's the issue.

I have seen people complain about 1080P and 1080i TVs not working on every card since the R9xxx GFFX series. There's no reason why it shouldn't work, but as I mentioned before you may have the niggling issues everyone gets like overscan etc.

Funny thing is my HD2600 works like a charm 99.44% of the time, and then I plug it in at one location (work, GF's, etc.) and then return home and plug it into the 1080i TV and voila, can't get it to work right away, trouble detecting, etc. Reboot, doesn't work. Then change resolution, still doesn't work, then reboot, and voila, back to normal.

PC to TV is just annoying mainly because they don't follow the same standards and there are so many variables. But that being said, sometimes on my TV my HDMI connection or component connection won't properly recognize the HD/BluRay player or even the progressive DVD player and I get a Blue screen or weak signal until I turn on/off the TV, and those are supposed to be dumb industry standard parts. So it's not just PCs / Graphics Cards.

Anywhoo, it should have no trouble supporting that resolution through a DVI->HDMI dongle, but you may need to tinker and tweak to get it to work the way you want with your TV. Based on most people's feedback you may find yourself tweaking the advanced TV settings once you get it setup.
March 27, 2008 2:10:27 AM

aylafan said:
Only the ATI HD 3800 series has a true "HDMI" converter (integrated HD audio controller with multi-channel (5.1) AC3 support and 1080p display). Nvidia can probably only do the 1080p display without the 5.1 audio.


I believe all HD series cards have the AC3 sound. I'm using a 2600xt in my HTPC and it has the HD audio chip.

I am also using a Sharp Aquos 46" (LC46D64U) with the supplied DVI-HDMI adaptor and 1080P works great. I had the same card on a 57" rear-projection, 1080i set and because I had to use a DVI-component adaptor I was limited to 1366x768. I assumed it was the adaptor.
March 27, 2008 2:45:07 AM

firebird said:
aylafan said:
Only the ATI HD 3800 series has a true "HDMI" converter (integrated HD audio controller with multi-channel (5.1) AC3 support and 1080p display). Nvidia can probably only do the 1080p display without the 5.1 audio.


I believe all HD series cards have the AC3 sound. I'm using a 2600xt in my HTPC and it has the HD audio chip.

I am also using a Sharp Aquos 46" (LC46D64U) with the supplied DVI-HDMI adaptor and 1080P works great. I had the same card on a 57" rear-projection, 1080i set and because I had to use a DVI-component adaptor I was limited to 1366x768. I assumed it was the adaptor.


Yeah, sorry, that's what I meant: the ATI HD series, not just the 3800 series. It's just that I read somewhere on Tom's Hardware that one of the ATI HD 2xxx series cards wasn't displaying 1080p properly. Not sure which video card it was.
June 4, 2009 9:24:43 AM

aylafan,

I want to build an HTPC with an ATI Radeon HD 4870. You mention that you use a Sharp LCD. When you set your video card to 1920x1080, does it show a full-screen picture without a black frame at the sides (underscan) and without any picture crop at the sides (overscan)? I will use the HTPC with my Sharp 46-inch LCD (1080p).
July 6, 2009 2:07:31 AM

aylafan said:
Okay, here are my specifications:
52-inch Sharp Aquos widescreen LCD TV
Sapphire ATI HD 3850 + latest ATI Catalyst 8.3 drivers
DVI to VGA converter + VGA to VGA cable --------> 1360x768 highest
HDMI cable ----------> 1920x1080 highest

I'm testing this right now, man... I meant to say there is a VGA limitation, not DVI.



Your limitations are based on the amount of video memory your display card has. It has to be able to match the scan rate the display device is using at any given resolution. Modern HD TVs' scan rates are much higher than what most display cards put out unless they have specific inputs designed for VGA; these have internal scan converters to make the input display properly (ever notice how few TVs have more than one VGA input?). Super high-end display cards with decent HDMI outputs are designed to feed the TV's circuitry properly. You could say it's the drivers, but the bottom line is the drivers only operate the needed hardware, and without sufficient video memory and high-speed video processors the drivers can't do a thing.