DVI to HDMI picture cut off

Status
Not open for further replies.

thatonekid

Distinguished
Apr 1, 2007
8
0
18,510
I am having trouble getting the picture on my new LCD to look right. If I connect it via VGA, everything looks fine. But if I connect the DVI out on my card to the HDMI in on my LCD, the edges of the picture are cut off. I can just barely see the top of my Windows taskbar, and about the same amount is cut off from the top and sides as well. Is this a driver issue or an LCD issue?
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
Change the resolution to one that fits your LCD. Also be aware that even if your LCD says it's 1080i, it doesn't necessarily run at 1920x1080.
 

thatonekid

Distinguished
Apr 1, 2007
8
0
18,510
I have tried every resolution setting available. The resolution that looks best (the same resolution stated in the LCD manual) still has the same issue.
 

billdcat4

Distinguished
Nov 14, 2006
1,178
0
19,280
I have tried every resolution setting available. The resolution that looks best (the same resolution stated in the LCD manual) still has the same issue.

There are special programs to set resolutions for TVs. I didn't have one when trying to hook up to my Panasonic AE700U projector, and as a result it only sort of worked. Luckily for me, though, ATI Catalyst supports 720p. I've still got to find a good program for that projector; let me know if you find anything.
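If you're curious what those programs are actually asking the driver for, here's a minimal sketch that just lists the modes Windows reports for the primary display (this assumes the pywin32 package is installed; the mode list will obviously differ from system to system):

```python
import win32api

# Walk the mode list the graphics driver reports for the primary display.
# pywin32 raises an error once the mode index runs past the last entry.
i = 0
while True:
    try:
        mode = win32api.EnumDisplaySettings(None, i)
    except win32api.error:
        break
    print(f"{mode.PelsWidth}x{mode.PelsHeight} @ {mode.DisplayFrequency} Hz, "
          f"{mode.BitsPerPel}-bit")
    i += 1
```

If the resolution the TV wants never shows up in that list, no amount of clicking in the driver panel will produce it.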
 

IcY18

Distinguished
May 1, 2006
1,277
0
19,280
What drivers and graphics card are you running? Like the guy above said, his supports 720p. I also have the Catalyst drivers and have mine hooked up to an LCD, and I have to pick the high-definition resolutions like 720p or 1080i rather than specifically picking 1280x720 or 1920x1080. If you have an Nvidia card, do you see any options like this?
 

thatonekid

Distinguished
Apr 1, 2007
8
0
18,510
I am using an XFX GeForce 7300 GT video card and an ASUS M2N4-SLI motherboard. I have the most recent driver updates for the motherboard from their web site. Is there a driver I should be using for the video card other than what came with it? I do not see any options for "720p" or "1080i" in the resolution settings. The resolution that the TV recommends is 1366 x 768.
 

flabbergasted

Distinguished
Mar 1, 2006
113
0
18,690
Your LCD monitor's specs are what determine which resolutions you can use. I have a BenQ 19" LCD and my max resolution is 1280 x 1024. To find your max resolution, right-click on your desktop, go to Settings, and move the slider all the way to the right. I have a similar problem to yours: when I connect the monitor using the DVI cable, I can't adjust the picture geometry using the controls on the monitor. They're grayed out, and the taskbar never completely gets hidden. Another strange thing is that fonts don't look that crisp either. When I connect the monitor using a DVI to VGA converter, I get geometry control and my fonts look crisper. Unless you're going to watch DRM-copyrighted material that necessitates the use of an HDMI connection, I'd just forget about it. When watching non-DRM HD H.264 movies, who needs HDMI and garbage DRM?
 

mackerjack

Distinguished
Apr 23, 2007
2
0
18,510
I'm using a Samsung HDTV myself with a native resolution of 1366 x 768, running on an ATI X1950 XTX. The only time I've experienced what you describe was when my refresh rate was set to 50Hz as opposed to 60Hz. Might be worth a try on yours :wink: .
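If anyone wants to try forcing a mode and refresh rate without digging through the driver panel, here's a minimal sketch using the plain Windows display API (it assumes the pywin32 package; the 1366x768 @ 60Hz values are just the numbers from this thread, not a recommendation):

```python
import win32api
import win32con

# Start from the current mode and only change the fields we care about.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = 1366          # example values from this thread
devmode.PelsHeight = 768
devmode.DisplayFrequency = 60
devmode.Fields = (win32con.DM_PELSWIDTH |
                  win32con.DM_PELSHEIGHT |
                  win32con.DM_DISPLAYFREQUENCY)

# DISP_CHANGE_SUCCESSFUL (0) means the driver accepted the mode.
if win32api.ChangeDisplaySettings(devmode, 0) != win32con.DISP_CHANGE_SUCCESSFUL:
    print("Driver rejected that mode; pick one it actually advertises.")
```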
 

MCMONOPOLY

Distinguished
Jul 4, 2006
268
0
18,780
Like I've told many people here, you first have to check your TV's user manual to see if you can use a DVI->HDMI cable to connect your PC to your TV, since almost all new LCD TVs don't support that kind of connection. I know I was sorely disappointed when I found out that my brand new TV wouldn't display anything coming from any connection other than the VGA connector. (I suspect this is tied to the HDCP protocols being introduced by almost every LCD TV vendor out there.) So if your manual states that you indeed CAN use a DVI->HDMI cable, then it's only a matter of enabling "720p" or "1080i" in your card's control panel. ATI's CCC adds this function directly in the panel without having to fuss too much.

What kind of TV do you have?

Cheers.
 

MCMONOPOLY

Distinguished
Jul 4, 2006
268
0
18,780
Just so you know, since I own a 32" Samsung myself (and can't in any way on earth connect any of my PCs to my TV using a DVI->HDMI cable): do you connect using the VGA cable or a DVI->HDMI cable? Because 1366x768 is not an "HDTV" resolution; it's a widescreen resolution used for PC desktop monitors. The equivalents of 720p and 1080p in PC monitor resolutions would be 1280x720 and 1920x1080. The only TV set I've seen that was able to use that kind of connection from PC to TV was the 37" Westinghouse model, which came with a native resolution of 1920x1080 and support for 1080p HD, but it has no NTSC or ATSC tuner integrated into it.

BTW, I tried connecting with a DVI->HDMI cable on a 9600XT, an X850XT PE and an X1950Pro without any success in all cases. (But I'm not surprised about that, since the manual clearly states that such connections aren't supported.)
 

Vash-HT

Distinguished
Jun 30, 2006
104
0
18,680
I'm gonna have to disagree with a lot of the stuff I've read in this thread. The only difference between an LCD monitor and an LCD TV is that a TV has a tuner in it, that's it. I have a Westinghouse 37" LCD monitor, and it still does 1080p and such. My friend has a 37" LCD 1080p Sceptre TV. Guess what, when I hook my PC up to it, it works exactly the same; I set my resolution on it to 1920x1080. LCD TVs run resolutions just like LCD monitors, there's no difference.

In response to the OP, it's an ATI driver problem with using a DVI-HDMI converter. I saw the same problem on my friend's TV because it doesn't have a DVI input; if I tried connecting with DVI-HDMI I got the black border around it. However, I found that Catalyst 5.6 and earlier doesn't have this problem, nor did I get this problem connecting with VGA. My solution was to buy a TV with a DVI input on it, but I suppose you could switch to VGA if your TV allows it. Otherwise you could try going back to the 5.6 drivers.
 

paulpod

Distinguished
Jun 13, 2006
42
0
18,530
It may have been mentioned before, but HDMI is a video input. As such, some insanely idiotic manufacturers apply overscan scaling to the signal.

This is an area that all reviewers need to cover since the specs for a display will never mention it. Not only do you lose information with overscan, you also suffer scaling artifacts on a signal that everyone expects to be pixel accurate (in the 1080p case).

There should always be a choice and the default should always be non-overscan. Defaults for any processing should be set to do the LEAST possible modification to a signal.

For example, the default setting for DVD players is to letterbox. So 95% of all people hooking a DVD player up to a 16:9 display end up with a "double letterbox" picture and a loss of 25% of the scan lines.
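For anyone wondering where that 25% figure comes from, it's just the aspect-ratio geometry:

```python
# A 16:9 picture letterboxed inside a 4:3 frame keeps the full width,
# so its height is (9/16) / (3/4) = 0.75 of the frame's scan lines.
frame_lines = 480                                # NTSC DVD active lines
active_lines = frame_lines * (9 / 16) / (3 / 4)
print(active_lines, frame_lines - active_lines)  # 360.0 used, 120.0 (25%) lost to bars
```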
 

MCMONOPOLY

Distinguished
Jul 4, 2006
268
0
18,780
Not to burst your bubble, but both statements you posted are not quite exact:

-The only TV I mentioned as not having this problem was the 37" Westinghouse you own, so the argument doesn't stand. Also, if you did any research on LCD panel makers, you would've noticed that only two or three companies come up, since there are only that many panel makers in the world; all LCD TV/monitor makers buy their panels from them. In light of that, you can consider your 37" Westinghouse and your friend's 37" Sceptre TV to be the same "TV" apart from the brand sticker and exterior finish. (Can't really consider it a TV, more of a monitor, since there's no tuner included whatsoever, unless Sceptre included one, but I'm not familiar with that brand, and as for the Westinghouse model I'm pretty sure there's none included.)

-Secondly: the OP said he owns a 7300-series Nvidia-based GFX card, so going back to Catalyst 5.6 will not help any, will it?

So please read the entire post before posting back with BS.

Also, to make things clear, the Westinghouse model I'm referring to (Westinghouse LVM-37w1) doesn't even have any HDMI connectors on it, and if you own the only other model they make with those specs (Westinghouse LVM-37w3), you're out of luck there too, since the only HDMI connector available is HDCP protected, which won't let you use a DVI->HDMI cable in the first place. So what's your point?
 

mackerjack

Distinguished
Apr 23, 2007
2
0
18,510
I'm actually using a VGA connection with my TV at the moment because it gives the clearest picture. I do have a DVI - HDMI cable and had the same problem as the OP, but I found that at certain refresh rates the picture would stretch so that I could no longer see the taskbar.
So basically, at a resolution of 1920 x 1080 @ 30Hz the picture was fine, but changing the refresh rate to 25Hz stretched the screen so that I could no longer see the taskbar. Likewise, 1280 x 720 @ 60Hz is fine, but changing the refresh rate to 50Hz made the taskbar disappear again.
 

thatonekid

Distinguished
Apr 1, 2007
8
0
18,510
Wow. Lots of information, some of it over my head. I am currently connected using VGA and am unsure of the resolution. If I switch my Samsung TV over to the HDMI input instead of the PC input (VGA), the picture looks poopy. That is, when I switched over to the TV-recommended resolution of 1366 x 768, it looks great but is still cut off. Just to clarify, the picture looks like it is zoomed in. All other resolution settings act the same way; they just are not as crisp. I have also tried the 50 and 60Hz settings with no luck.

As MCMONOPOLY said, he has tried and failed on his Samsung model as well.
 

MrCommunistGen

Distinguished
Jun 7, 2005
1,042
0
19,310
I recently bought a DVI/HDMI cable for my Sony 32" HD TV (it's a CRT) and it mostly works fine for me with my 7800GT. The 720p setting comes up in the "TV" section of the old-style Nvidia control panel. It might be called the nView panel, but if I remember correctly it renames itself if you only have a TV hooked up. I don't know exactly where it is since I'm at work, and I'm not sure where it is in their new CP (assuming it's there at all) because I don't use it. It should be a rather long pulldown menu which sets itself to NTSC by default if you are in America, or PAL in most other places. From there you should be able to select 720p. I'm pretty sure there wasn't a 768p setting, since that isn't really a standard high-def resolution. Also, there should be a way to set the output to "underscan" the picture, which essentially zooms it out a bit so that the borders aren't cut off. This should also be adjustable.
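If the underscan option isn't there, one common workaround is a custom mode slightly smaller than native, so the TV's overscan doesn't eat the edges. The arithmetic is only a scaling factor (a sketch, and the 5% per edge is just an assumed example, not a measured value):

```python
# Shrink the native mode by the TV's overscan so the desktop edges stay visible.
native_w, native_h = 1366, 768   # the OP's panel
overscan = 0.05                  # assumed 5% per edge; measure your own set
w = int(native_w * (1 - 2 * overscan))
h = int(native_h * (1 - 2 * overscan))
print(w, h)                      # 1229 691 -> round to whatever the driver accepts
```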

I too have been having problems getting good usage out of my TV, although that is because it's a 4:3 aspect ratio. If I run 480p the picture is pretty good (better than S-Video for sure). However, when I try to run 720p or 1080i the output has black bars across the top and bottom of the screen. This is fine for watching widescreen content... in fact the difference between that and 480p is amazing even for DVD content. BUT if I try to watch anything fullscreen while in an HD resolution, the picture becomes a small square in the middle of the TV. It's noticeably sharper than at 480p, but it's TINY. Sadly I haven't been able to adjust the settings enough to get any higher resolution to display in a 4:3 aspect ratio.

-mcg
 

MCMONOPOLY

Distinguished
Jul 4, 2006
268
0
18,780
Well then, my friend, you're a lucky guy, because as you pointed out earlier in one of your posts, when your TV/monitor natively supports either 1280x720 or 1920x1080, you're basically using the same resolution as 720p or 1080p. Also, since nobody mentioned it, HDMI is exactly the same (video-signal-wise) as DVI, but the HDMI cable can also be used to transfer HD audio, which is supposed to make connections easier between A/V components. Just keep in mind that you do have a DVI input that supports 1920x1080, so I'd suggest using that instead of the VGA connector.
 

killermedic

Distinguished
Dec 26, 2006
12
0
18,510
I had the same problem on my Westinghouse 27" LCD TV and X1900XTX w/ Catalyst 7.2. I fixed it by UPPING the refresh rate from 60 to 75. At 60Hz, I lost the outer 5% on all four sides. I upped it to 75Hz and all was fine. Don't know why this works... it just does. Hope this helps!

Chris
 

thatonekid

Distinguished
Apr 1, 2007
8
0
18,510
I found the 720p and 1080i settings and tried both... no luck; closer, but not perfect. I also tried the available refresh rates of 59 and 60Hz with no change to the picture. I was not able to find the underscan feature; can anyone direct me to that?
 

MrCommunistGen

Distinguished
Jun 7, 2005
1,042
0
19,310
I'll see about getting some screenshots... I can't do it right now because the menus only show up when I'm hooked up to my TV and I'm not at the moment. If not tonight I hope to get it by tomorrow afternoon.

-mcg
 