Why is 1080p washed out?

January 21, 2012 5:37:32 PM

I have a Samsung LN52A650 HDTV and a new 7970. I am trying to play Saints Row 3, and 720p is really detailed and crisp, yet at any other resolution the screen looks cloudy/foggy, as if the brightness is too high or something. Even my desktop looks all washed out at 1080p.

Has anyone seen this before on their setup? I have my TV input set to PC but can't seem to get the same detail at 1080p that I do at 720p.

Are there any instructions on how best to set up these AMD cards for use with an HDTV at 1080p?

January 21, 2012 6:24:19 PM

I would guess the native resolution of your screen is 720p; most screens running outside their native resolution look odd and hazy, as you describe.

January 21, 2012 6:59:28 PM

Are you connected with an HDMI or DVI cable? One of those is required for 1080p.
January 21, 2012 7:20:26 PM

I don't think so. VGA can support 1080p and even some higher resolutions.
January 21, 2012 8:03:41 PM

I am on HDMI. I disabled the PC setting and the games are no longer washed out, but the desktop looks terrible.

If anyone is playing through an HDTV with an AMD card, how did you set it up?
January 21, 2012 8:05:40 PM

wtd03 said:
I am on HDMI. I disabled the PC setting and the games are no longer washed out, but the desktop looks terrible.

If anyone is playing through an HDTV with an AMD card, how did you set it up?

It's because of the HDMI cable...
January 21, 2012 10:05:42 PM

amuffin said:
It's because of the HDMI cable...


What do you mean? HDMI is newer than DVI-I and offers better image quality and higher bandwidth...
January 21, 2012 10:19:44 PM

When using a PC with an HDTV, you NEED to enable the PC setting to disable post-processing and telecine pulldown... Otherwise, pictures and games will look like garbage...

Hope it helps! :) 
January 21, 2012 10:23:12 PM

bloc97 said:
offers better image quality and higher bandwidth...


you're wrong there ;) 
January 21, 2012 10:35:04 PM

Sunius said:
you're wrong there ;) 


HDMI is from 2003 and offers 10.2 Gbit/s (we are talking single link here, since HDMI only has one link).
DVI is from 1999 and offers 3.96 Gbit/s per link, and only 7.92 Gbit/s with dual link.

HDMI max res = 4096 × 2160
DVI max res = 1920 × 1200 (single link)

Don't forget that 7.1 surround audio over HDMI also eats some of its bandwidth...

But DisplayPort wins over all! :) 
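To put rough numbers on that, here is a back-of-the-envelope check (a sketch in Python; the standard 1080p60 raster timing and 24-bit colour are my assumptions, not figures from this thread):

# Bandwidth needed for 1080p60 versus the link limits quoted above.
pixel_clock_hz = 2200 * 1125 * 60       # standard 1080p60 total raster: 148.5 MHz
pixel_data_bps = pixel_clock_hz * 24    # 24-bit colour: ~3.56 Gbit/s

single_link_dvi_bps = 3.96e9            # 165 MHz x 24 bits, as quoted above
hdmi_1_3_bps = 10.2e9                   # quoted HDMI figure (raw wire rate)

print(f"1080p60 pixel data: {pixel_data_bps / 1e9:.2f} Gbit/s")
print(f"Fits single-link DVI: {pixel_data_bps <= single_link_dvi_bps}")  # True, barely

So 1080p60 fits within single-link DVI with little headroom to spare, which is why either cable can drive this TV fine; bandwidth is not the problem here.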
January 21, 2012 10:35:27 PM

So overall HDMI is faster...
January 21, 2012 11:13:25 PM

Nope, not really. VGA and DVI produce better quality video than HDMI.
January 21, 2012 11:17:39 PM

amuffin said:
Nope, not really. VGA and DVI produce better quality video than HDMI.


Are you kidding?!?!? So now you're trying to say that analogue signals are better than digital ones? Then why doesn't everyone go back to the old composite cable?
There is no way that DVI produces better quality video than HDMI; you don't have any proof of that.
January 21, 2012 11:24:03 PM

bloc97 said:
Are you kidding?!?!? So now you're trying to say that analogue signals are better than digital ones? Then why doesn't everyone go back to the old composite cable?
There is no way that DVI produces better quality video than HDMI; you don't have any proof of that.

Especially with Radeon cards. DVI is digital; with HDMI everything is blurry, and with DVI or VGA everything is crystal clear.
January 21, 2012 11:29:43 PM

amuffin said:
Especially with Radeon cards. DVI is digital; with HDMI everything is blurry, and with DVI or VGA everything is crystal clear.



I am going to have to ask for proof of this. I have seen all three and I can't really tell the difference between them.
January 21, 2012 11:34:07 PM

amuffin said:
Especially with Radeon cards. DVI is digital; with HDMI everything is blurry, and with DVI or VGA everything is crystal clear.


Haha... I get it, you have the same old problem others have with the "Underscan" option... If HDMI puts black bars around your screen and everything looks really blurry, go to CCC in advanced view, then "My Digital Flat-Panels" > "Scaling Options", and slide the bar all the way to the right.

Lol, I can't believe you really think VGA is better than HDMI; if it were, why would VGA need an Auto-Sync feature?
January 21, 2012 11:59:08 PM

Ok, enough of the complaining.

First off:

HDMI looks NO better than DVI, and at first they used the EXACT same signalling. HDMI was the home-theater answer to requiring DVI plus digital or optical audio.

So you have DVI video + digital audio = HDMI.

As newer revisions came out they added things (networking, for one), but in general HDMI is based on the core technology of DVI.

Many HDMI devices had a limit of 1080p resolution. This has been corrected in later revisions.

Now on to why DVI may work better with a TV.

As already mentioned, TVs overscan the HDMI input, while many will not do the same on the DVI input.

DVI will not carry audio, but for some people it works out better.

That is a valid reason to "try" DVI (or VGA) plus separate audio if you have it.

Oh yeah, and VGA will do 1080p just fine :) . Please note that VGA lacks the copy protection needed for Blu-ray playback, so you have to get creative to use it.

Damn, that was all off topic.

OP, is overscan an issue for you? Or is the picture just washed out, without falling off the edge of the screen?
January 22, 2012 12:15:09 AM

^ Thank you. Neither of the two is "better" than the other.

Now that that's settled, nukemaster's advice is definitely what I'd try first. If your TV has a DVI input, go buy a cable (or don't, if you already have one) and try it through that. If not, VGA is your next best bet.
January 22, 2012 12:44:58 AM

Overscan is what I was wondering about as well. Every TV has its menus arranged differently, so I can't tell you exactly how to disable it, but somewhere in the video options there should be an option to change the display format. It's likely listed as aspect ratio or something similar.

On my TV the option is called "Just Scan", but yours is likely named differently. That is the first thing I would try, though.

HDMI or DVI should make zero difference. Electrically the two are 100% compatible, so there isn't any change in the signal whatsoever. You can even get cables with a DVI connector on one end and an HDMI connector on the other (I actually use a couple of these). HDCP for Blu-ray works over DVI-to-HDMI cables just as it does over straight DVI or straight HDMI.
January 22, 2012 12:54:49 AM

I looked all over my Samsung monitor (245T) and never found an option to disable overscan (but now I am going to go look again). The video card options let me see the start menu, but it was just not sharp. It would be fine for watching TV, but not for computer use.

It was a shame, because I also got one of those DVI-to-HDMI cables only for everything to come out fuzzy (not the cable's fault, just my screen) :( . Oh well, I just went back to VGA (I wanted to use HDMI to free up VGA for other stuff, and DVI was already in use).
January 22, 2012 1:29:11 AM

That's a PC monitor, nuke, so it doesn't have overscan at all. His is a TV, so it does.

From the CNET review, the spoiler below explains where to find the option to disable overscan on your television, wtd03:

Spoiler
The LN52A650 has three adjustable picture modes that are each independent per input. That's great, but in addition there are three more picture presets, called "Entertainment Modes," that cannot be adjusted and are accessible via a separate key on the remote and the Setup menu. This arrangement is unnecessarily confusing on a TV with so many settings anyway; we'd prefer to have all of the picture modes, both adjustable and non-adjustable, be accessible together from a single key on the remote and one area of the Picture menu. Also, if you're in Entertainment mode, you're prevented from making picture adjustments, or even selecting one of the adjustable picture modes, until you actively cancel an Entertainment mode by navigating to the Setup menu (which the onscreen instructions suggest) or toggling the mode to "Off" using the remote. That's an awkward hitch in an otherwise smooth menu design.

Not every submenu gets the updated graphics, however, including the important (and still perfectly functional) white balance controls.
Other picture controls include five color temperature presets along with the ability to fine-tune color using the white balance menu; three varieties of noise reduction, including an automatic setting; a film mode to engage 2:3 pulldown (it also works with 1080i sources); a seven-position gamma control that affects the TV's progression from dark to light; a dynamic contrast control that adjusts the picture on the fly; a "black adjust" control that affects shadow detail; and a new color space control that lets you tweak the Samsung's color gamut.
You can choose from four aspect ratio modes for HD sources, two of which allow you to move the whole image across the screen horizontally and/or vertically. As we'd expect from a 1080p TV, one of those modes, called Just Scan, lets the LN52A650 scale 1080i and 1080p sources directly to the panel's pixels with no overscan--the best option unless you see interference along the edge of the screen, as can be the case with some channels or programs. There are also four modes available with standard-def sources.


Overscan is something TVs do by default to cut out the dead area that some channels had around the very edges of their broadcast (much less common now than it was when HDTV was new). With overscan disabled, you'd often see a thin line of gray fuzz along one or more sides of the screen on many channels.

The side effect, however, is that you aren't displaying a true 1080p image. It's more like 1880x1040 instead of 1920x1080, as it cuts roughly 20 pixels off every edge. That center 1880x1040 or so image is then stretched to fill the full 1920x1080 panel, which of course is non-native scaling and can cause blurriness (see the sketch at the end of this post).

Dedicated monitors, however, are made with PCs in mind and display the full 1920x1080 signal, so they don't have this option.
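To make those numbers concrete, here is a small sketch of the scaling math (Python; the 20-pixel crop per edge is the rough figure from this post, and real TVs vary):

# Effective resolution after a typical TV overscan crop.
panel_w, panel_h = 1920, 1080
crop = 20                              # approx. pixels trimmed from each edge

visible_w = panel_w - 2 * crop         # 1880
visible_h = panel_h - 2 * crop         # 1040

# The cropped area is stretched back out to fill the whole panel,
# so every source pixel gets resampled by a non-integer factor:
scale_x = panel_w / visible_w          # ~1.021
scale_y = panel_h / visible_h          # ~1.038

print(f"Visible source area: {visible_w}x{visible_h}")
print(f"Stretch factors: {scale_x:.3f} x {scale_y:.3f}")  # hence the blurry text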
January 22, 2012 1:50:23 AM

Yargnit said:
That's a PC monitor, nuke, so it doesn't have overscan at all. His is a TV, so it does.

It sure does. It is a monitor as well as a TV (S-Video, composite, component and HDMI inputs, and even audio out for HDMI in; everything but the TV tuner).
It also overscans my cable box. I am MORE than aware of overscan (take HDMI -> DVI, and there is no more overscan).
January 24, 2012 1:17:16 AM

The first thing you want to do is get your TV remote and go to:
Menu > Input > Edit Name > edit your HDMI input so it says PC next to it

The second thing you want to do is open Catalyst Control Center and go to:
My Digital Flat-Panels > Scaling Options (Digital Flat-Panel) > set the scaling to 0%

My Digital Flat-Panels > Pixel Format > select RGB 4:4:4 Pixel Format PC Standard (Full RGB)

You may want to try turning on Image Scaling if other resolutions letterbox:
My Digital Flat-Panels > Properties (Digital Flat-Panel)
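For what it's worth, the Full RGB setting matters because PCs output full-range RGB (0-255) while TVs often expect the limited "video" range (16-235); when the two sides disagree, black is shown as dark gray and the whole picture looks washed out, which matches the symptom in the original post. A minimal sketch of the standard range conversion (Python; the function name is mine):

# Map a full-range (0-255) component into the limited video range (16-235).
def full_to_limited(v: int) -> int:
    return round(16 + v * (235 - 16) / 255)

# If the display then treats the limited signal as if it were full range,
# contrast is compressed and the image looks foggy / washed out:
print(full_to_limited(0))    # 16  -> black shows as dark gray
print(full_to_limited(255))  # 235 -> white shows as light gray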
January 24, 2012 2:02:25 AM

amberale4 said:
The first thing you want to do is get your TV remote and go to:
Menu > Input > Edit Name > edit your HDMI input so it says PC next to it

The second thing you want to do is open Catalyst Control Center and go to:
My Digital Flat-Panels > Scaling Options (Digital Flat-Panel) > set the scaling to 0%

My Digital Flat-Panels > Pixel Format > select RGB 4:4:4 Pixel Format PC Standard (Full RGB)

You may want to try turning on Image Scaling if other resolutions letterbox:
My Digital Flat-Panels > Properties (Digital Flat-Panel)

If that was for me, I just tried it.

If I set scaling to 0%, a lot of the screen is missing. I think it is just my screen (not my card's fault). Either way, thanks for the suggestions. For movies it would not even be much of an issue; it is just the way text looks.

For the hell of it, I am going to try the Intel GMA 3000 :)

EDIT

Intel GMA, same thing. It is not too bad for video or even some games (I was playing some Just Cause 2 on it), but text is just awful.
January 24, 2012 11:30:08 PM

OMG!!! Just Cause 2 on a GMA?!?! It is happening, the world is going to end! AHH... (well, only for AMD).
January 25, 2012 12:33:21 AM

bloc97 said:
OMG!!! Just Cause 2 on a GMA?!?! It is happening, the world is going to end! AHH... (well, only for AMD).

Yeah, but only at 720p and with medium/low settings.

Very impressive for such a lightweight card.

Then I switched one virtue :) Just Cause 2 used to not work with it; it seems to now.

I honestly want to bench the GMA 3000 vs the 4350. So far the GMA handles media very well.
January 25, 2012 12:43:52 AM

Try running Battlefield: Bad Company 2 on a GMA; if it works even at the lowest settings @ 400x300 with somewhat stable fps (~20), I will be impressed...
Even my HD 4250 can only play BFBC2 at the lowest settings @ 1024x768.
January 25, 2012 10:31:51 PM

Ahhh. You meant HD 3000? I thought it was the good old GMA X3000...
January 26, 2012 12:17:40 AM

bloc97 said:
Ahhh. You meant HD 3000? I thought it was the good old GMA X3000...

I don't think I would ever be impressed with the old 3000 :p

Guess I should be calling it HD 3000.

I will edit.
January 26, 2012 1:43:36 AM

nukemaster said:
I don't think I would ever be impressed with the old 3000 :p

Guess I should be calling it HD 3000.

I will edit.


Hahah, I thought you meant the GMA 3000 also XD

I was starting to wonder what was going on :3
October 17, 2013 5:23:20 PM

bloc97 said:
amuffin said:
Especially with Radeon cards. DVI is digital; with HDMI everything is blurry, and with DVI or VGA everything is crystal clear.


Haha... I get it, you have the same old problem others have with the "Underscan" option... If HDMI puts black bars around your screen and everything looks really blurry, go to CCC in advanced view, then "My Digital Flat-Panels" > "Scaling Options", and slide the bar all the way to the right.

Lol, I can't believe you really think VGA is better than HDMI; if it were, why would VGA need an Auto-Sync feature?


So awesome! I just got an R9 780x and had this problem with a brand new ASUS 23.8" 1080p IPS monitor (which worked perfectly with an old 8800GTS 512) and was about to pull my hair out! I tried straight HDMI and DVI-to-HDMI, with the same issue. Thanks for the solution! Now both of my monitors have impeccable desktops and always look the same. {for future Google searches: Radeon, text, blurry, AMD, suddenly, one of my monitors is blurry}