The image does not fill the entire screen and looks scaled (blurry).
Is there any way to force the ATI/AMD drivers to output exactly the resolution/mode that is set during the Windows loading screen (and not some HDTV junk)?
Catalyst classifies resolutions into two groups, basic and HDTV. The basic resolutions display correctly (e.g. 1680x1050@60Hz); however, there is no basic 1920x1080@60Hz listed. I suspect that mode would match the one used during Windows 7 startup.
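For what it's worth, here is a small Python sketch to see which modes the driver actually exposes to Windows, and whether a plain 1920x1080@60 would even be accepted. This is my own rough attempt using the Win32 EnumDisplaySettings/ChangeDisplaySettings calls; the trimmed DEVMODEW layout is my shorthand for the documented structure, and it only validates the mode with CDS_TEST rather than switching to it:

```python
import ctypes

# Trimmed DEVMODEW: the padding preserves the documented field offsets
# (full layout is on MSDN); only the fields we read/write are named.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("_head", ctypes.c_byte * 68),        # dmDeviceName + version words
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_uint32),
        ("_mid", ctypes.c_byte * 92),         # printer/display union, form name
        ("dmBitsPerPel", ctypes.c_uint32),
        ("dmPelsWidth", ctypes.c_uint32),
        ("dmPelsHeight", ctypes.c_uint32),
        ("dmDisplayFlags", ctypes.c_uint32),
        ("dmDisplayFrequency", ctypes.c_uint32),
        ("_tail", ctypes.c_byte * 32),        # ICM/panning fields
    ]

user32 = ctypes.windll.user32
DM_PELSWIDTH, DM_PELSHEIGHT = 0x00080000, 0x00100000
DM_DISPLAYFREQUENCY = 0x00400000
CDS_TEST = 0x00000002            # validate the mode without switching to it
DISP_CHANGE_SUCCESSFUL = 0

# List every mode the driver exposes to Windows.
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency} Hz")
    i += 1

# Ask whether a plain 1920x1080@60 would be accepted.
dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency = 1920, 1080, 60
dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
ok = user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST)
print("1920x1080@60 accepted:", ok == DISP_CHANGE_SUCCESSFUL)
```

If the mode shows up here but not in Catalyst's basic list, the restriction would seem to be the driver's own grouping rather than anything Windows enforces.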
I don't want to switch to a VGA cable (I haven't tried it, but I suspect it would solve the issue).
Edit: I apologize, I probably picked the wrong forum (I remember choosing AMD Radeon). Could someone move this there or into the graphics cards section? Thanks.
I tried multiple versions of the drivers (including the latest, downloaded yesterday).
The problem is, I don't want to use any of the HDTV modes/resolutions. I want a plain 1920x1080@60Hz, the same mode that would be used if the display were connected via DVI (only there's no DVI output on the laptop).
The display should be a regular PC monitor with TV functionality added on top, and it should handle standard PC signals well (it does). It is the odd HDTV (non-PC) mode/signal that ATI chooses for 1920x1080 which causes the problems.
I would need to trick the drivers into thinking it is connected via DVI, not HDMI. I think that would do what I want.
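A commonly cited explanation, which I can't confirm for this particular driver, so treat it as an assumption, is that the HDTV handling kicks in when the monitor's EDID carries a CEA-861 extension block, which is what HDMI TVs report. Here is a sketch that dumps the EDID Windows has cached in the registry and checks for that block (registry path and byte offsets are per the EDID/CEA-861 specs):

```python
import winreg

# Walk HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY and yield the EDID
# blob Windows cached for each monitor instance.
def find_edids():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as disp:
        for i in range(winreg.QueryInfoKey(disp)[0]):
            model = winreg.EnumKey(disp, i)
            with winreg.OpenKey(disp, model) as mkey:
                for j in range(winreg.QueryInfoKey(mkey)[0]):
                    inst = winreg.EnumKey(mkey, j)
                    try:
                        sub = inst + r"\Device Parameters"
                        with winreg.OpenKey(mkey, sub) as pkey:
                            edid, _ = winreg.QueryValueEx(pkey, "EDID")
                            yield model, bytes(edid)
                    except OSError:
                        continue

for model, edid in find_edids():
    if len(edid) < 128:          # base EDID block is 128 bytes
        continue
    ext_count = edid[126]        # number of extension blocks after the base
    has_cea = (ext_count > 0 and len(edid) >= 256
               and edid[128] == 0x02)   # 0x02 = CEA-861 extension tag
    print(model, "extensions:", ext_count, "CEA-861 block:", has_cea)
    if has_cea:
        # Bit 7 of the CEA block's byte 3: display claims underscan support.
        print("  underscan supported:", bool(edid[131] & 0x80))
```

If that really is the trigger, the usual workaround would be an EDID override (presenting the display without the extension block) rather than a different cable, though I haven't tried it myself.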
I noticed there are underscan/overscan adjustments specific to this HDTV/HDMI mode. I tried playing around with both ATI's and the monitor's settings for this but haven't managed to get a clean picture.
It seems the display, once it detects this 1080p (HDTV) mode, applies a bit of overscan. The slider in the ATI driver is too coarse to hit the exact underscan value that would compensate for it. Setting ATI's overscan to the rightmost position results in a picture too big to fit the screen. When I switch the monitor from normal mode to underscan, the picture almost fits but is still blurred (resampled).
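To illustrate why a coarse slider can't land on 1:1 (the numbers are made up; I don't know the TV's real overscan percentage or the slider's actual step size), the arithmetic looks like this:

```python
# Assumed figures for illustration only: the TV overscans by 5% and
# the driver slider moves in 2% underscan steps.
tv_overscan = 0.05

# An underscan of p cancels the overscan exactly when
# (1 - p) * (1 + overscan) == 1, i.e. p = overscan / (1 + overscan).
exact = tv_overscan / (1 + tv_overscan)
print(f"exact compensation: {exact:.4%}")        # ~4.76%, between stops

for step in range(8):                            # slider at 0%, 2%, ..., 14%
    p = step * 0.02
    span = 1920 * (1 - p) * (1 + tv_overscan)    # columns the image covers
    print(f"underscan {p:>4.0%}: image spans {span:7.2f} of 1920 columns")
```

Anything other than an exact hit means the panel resamples the image, which is exactly the blur described above.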
This is the first time I'm using HDMI, and until now I thought we had finally gotten rid of all those analogue controls for centering/zooming/shrinking the picture typical of CRTs. Yet someone had to come up with a similar kind of crap for a digital interface.
The only problem I have is that I can't get a perfect 1:1 pixel mapping (i.e. a sharp picture with no resampling) at 1920x1080. Everything else works as expected (1280x720 is upscaled by the display and the picture is smeared, like any non-native resolution).
Already tried that (as I wrote above). The display's overscan value seems to fall somewhere in the middle, and the driver's slider is too coarse to hit the exact compensating value.
The display itself has the following settings for it: full (moderate overscan), 4:3, zoom 1, zoom 2, and underscan (which seems to differ from "just scan" by a small amount).
I'm having a similar problem myself. I have an HP notebook with an Intel Core i5-4210M and Intel HD Graphics, and I'm trying to plug the HDMI output from my notebook into my Samsung 1080p HDTV/monitor.
When I do this, I get a scaled picture (the taskbar is halfway off the bottom of the screen because the picture is too big), and text and everything else do not look as clear as they should. When I connect to the monitor through the VGA output and input, however, the picture looks exactly as it should.
I have played with some settings on the monitor to get the picture to look acceptable over HDMI (reducing sharpness, and setting the picture size to "Screen Fit" rather than 16:9 to fix the scaling issue), but I still get better results over VGA.
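One quick check that might separate a GPU-side scaling problem from a TV-side one (my suggestion, not something I've verified on this hardware) is to ask Windows what mode is actually being output. If it already reports 1920x1080@60 while the picture is still oversized, it's the TV's scaler doing the damage. A minimal sketch, using the same trimmed DEVMODEW layout as the earlier one:

```python
import ctypes

# Trimmed DEVMODEW: padding keeps the documented offsets so we can
# read just the resolution fields of the current mode.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("_head", ctypes.c_byte * 68),
        ("dmSize", ctypes.c_ushort),
        ("_mid", ctypes.c_byte * 98),
        ("dmBitsPerPel", ctypes.c_uint32),
        ("dmPelsWidth", ctypes.c_uint32),
        ("dmPelsHeight", ctypes.c_uint32),
        ("_flags", ctypes.c_uint32),
        ("dmDisplayFrequency", ctypes.c_uint32),
        ("_tail", ctypes.c_byte * 32),
    ]

ENUM_CURRENT_SETTINGS = -1           # ask for the mode in use right now
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)
ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                          ctypes.byref(dm))
print(f"current output: {dm.dmPelsWidth}x{dm.dmPelsHeight}"
      f" @ {dm.dmDisplayFrequency} Hz")
```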
I use the monitor on my desk to connect to my laptop when I'm at my desk, so the little details that aren't quite right over HDMI are a big issue for me. I want to use HDMI purely for its ability to carry audio and video together, so I don't have to tie up my headphone jack, and it's simpler to run a single cable. Also, my monitor has built-in speakers that sound better than my laptop's.
To recap: the only thing that has made it look semi-acceptable over HDMI is adjusting the settings on the monitor itself, yet output over VGA is perfect, and I don't know why there is a difference.
Strange; there has to be something more obvious going on here. Assuming it's software-related might be going too far. I think it may be something wrong with or incompatible about the laptop, or your TV is not full HD.
I have both an HP Pavilion laptop (Vista) with an ATI 4650 and a PC (Win7) with an ATI 5870, both connected via HDMI to a Panasonic Viera plasma, and both display flawlessly at 1080p after setting the scan option on the TV.
There's no monkeying around with driver settings; it just works. It should work for you too. Like I said, it shouldn't be this difficult; there has to be something else going on here.