Resolution Auto Switching

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
Hello,

I have a Gigabyte ATI Radeon HD 5750 and I'm having some resolution problems when connecting it via HDMI to my 40-inch Sony Bravia TV. My desktop resolution is at 1280x720, and looking at my display property settings, that is the highest available resolution. However, I can use Catalyst Control Center and set my resolution all the way up to 1920x1080.

The problem is that when I open a game, let's say Bad Company 2 for example (but it happens with every game I have), the screen automatically changes back to 1280x720, and in the game's settings that is the highest resolution I can use. When I exit the game, my desktop has switched back to 1280x720 as well, and I have to use CCC again to bring it back up. When I change it in CCC, I can look in display property settings and see the current resolution (i.e. 1920x1080), but after it auto-switches back when a game opens, the highest option available is again only 1280x720.

If I switch over to my VGA cable, I do not have this problem at all: I can play all my games in 1920x1080 without it ever auto-switching my desktop. Why won't it work properly over HDMI? Thanks for your help.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
The problem is that it's NOT my TV's max resolution, since I can change it all the way up to 1920x1080. But if I right-click the desktop and go to display properties and then settings, it does list 1280x720 as the max. I can change it to 1920x1080 through Catalyst Control Center, and when I do that, 1920x1080 is now the highest option in display properties, which is what I want... BUT then when I start a game (some other programs make this happen too), it automatically changes to 1280x720 in the game, and upon exiting the game my desktop has reverted back to 1280x720 as well, and under display properties settings 1920x1080 is no longer available; the highest one is BACK to 1280x720. I have to go back into CCC to redo everything.

Using VGA, though, I can choose 1920x1080 through display properties and don't have to use CCC at all. And it never auto-switches back to a lower resolution when running a game; I can play the game in 1920x1080 and my desktop stays that way too.
 
So you're saying you can play 1920x1080 on your 1280x720 monitor? I'm not sure how that's even possible.

My monitor's max resolution is 1680x1050. There is no way I can change it to 1920x1080 under CCC or display properties.

Maybe I'm still not understanding your issue.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
Yeah, I mean I don't see how 1280x720 could be my max resolution. It's a 1080p Sony Bravia TV; 1080p is 1920x1080 and 720p is 1280x720. When I do switch it to 1920x1080 through CCC, my TV says it's in 1080p. And like I said, when I use a VGA cable there are many resolutions to choose from even in display properties, going all the way up to 1920x1080. My problem is that I want my HDMI cable to do 1920x1080, which it does, but only through CCC, and then opening games or other applications auto-switches it back to 1280x720. I do appreciate you helping me.
 

nordlead

Distinguished
Aug 3, 2011
692
0
19,060
@geekapproved

To sum up in bullets.

* His TV is 1920x1080
* His computer (Control Panel -> Display) thinks the max resolution is 1280x720 via HDMI.
* CCC will let him set the computer to 1920x1080
* The games go back to what the computer thinks is the correct resolution (1280x720)
* Using VGA he doesn't have that problem.

I have AMD HD 4200 integrated graphics in my HTPC and I have problems with my HDTV. It insists on compensating for overscan (which adds black bars around the screen), and it changes the desktop resolution whenever I turn off the TV. Luckily it changes back when I turn the TV back on, and auto-starting CCC fixes the overscan issue. I almost think it has more to do with the TV, but I could be wrong. My theory is that the TV manufacturers aren't sending the correct display information (the EDID) over HDMI to the PC, so it doesn't quite work right. I could be totally wrong and there could be an easy fix :-D
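If you want to check what the TV is actually telling the PC, Windows caches the display's EDID block in the registry. Here's a rough Python sketch that dumps the native mode out of each cached EDID; this is just my own diagnostic idea, and it assumes a standard Windows install where the cached EDIDs live under the usual Enum\DISPLAY key:

```python
import winreg

# Windows caches each monitor's EDID under this key (standard location).
ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def dump_native_modes():
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display:
        for i in range(winreg.QueryInfoKey(display)[0]):
            monitor_id = winreg.EnumKey(display, i)
            with winreg.OpenKey(display, monitor_id) as mon:
                for j in range(winreg.QueryInfoKey(mon)[0]):
                    instance = winreg.EnumKey(mon, j)
                    try:
                        with winreg.OpenKey(mon, instance + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except OSError:
                        continue  # no cached EDID for this entry
                    if len(edid) < 128:
                        continue
                    # Bytes 54-71 are the first detailed timing descriptor,
                    # which holds the panel's preferred (native) mode.
                    width = edid[56] | ((edid[58] & 0xF0) << 4)
                    height = edid[59] | ((edid[61] & 0xF0) << 4)
                    print(f"{monitor_id}\\{instance}: native mode {width}x{height}")

if __name__ == "__main__":
    dump_native_modes()
```

If the entry for the Sony doesn't report 1920x1080, then the TV (or that particular HDMI port) really is handing the PC bad info; if it does report 1920x1080, the problem is more likely on the driver side.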
 
I'm aware it's a SONY Bravia at 1920x1080... I'm talking about the computer monitor.

"upon exiting the game my desktop has reverted back to 1280x720 as well, and under display properties settings 1920x1080 is no longer available"

What I'm asking is: if the computer monitor's resolution is 1280x720, how can it run at 1920x1080 at all?... Oh brother, never mind, I'll let someone else decipher this.
 
Hi, I'm a tech. I think I understand your problem.

(don't flame me for the length. It's a very technical problem and I've covered laptops in here as well.)

First of all, the VGA input uses a completely different video format than HDMI. It treats your HDTV as a monitor, which lets you choose ALL resolutions from 640x480 (or 800x600) up to 1920x1080. HDMI-Video (a normal HDMI input) can only use the resolution it is set to, so all games would have to be 1920x1080. There is also an HDMI-PC input, which uses an internal adapter to treat your HDTV exactly like a monitor and is meant specifically for gamers with computers who need to change resolutions. Changing resolutions is very important for many gamers, especially if you don't have a high-end graphics card.

The VGA input will also be paired with an audio input, likely a 3.5mm STEREO jack, so you can run a separate 3.5mm male-to-male cable from your onboard audio or sound card and have audio as well.

HDMI is tricky with audio, and often the graphics card only outputs movie audio, not Windows sounds or game audio.

LAPTOPS with HDMI will work just fine, but my dad's laptop required that I change the audio output in software from the laptop speakers to the HDMI output, or else I got video on the HDTV but audio from the laptop speakers. This is done by: right-click the audio icon in the System Tray (lower right) -> "Playback devices" -> change from "Speakers" to something like "Realtek HDMI output" or whatever. You'll need to REPEAT this process and toggle back to "Speakers" to get the laptop's own speakers working again. (Couldn't they just have it automatically switch to HDMI audio when you insert the cable?)

HDMI:
When you choose your output resolution for HDMI on an HDTV, you want to choose an HDTV-specific format, and it should be 1080p@60 (1920x1080 progressive at 60Hz). The overseas PAL format is the same but at 50Hz instead of 60. CCC has an option to ADD this format to your normal Windows resolution options (there are no HDTV formats there normally, so choosing a plain 1920x1080 there would be incorrect and screw things up for the HDTV).

I hooked up my dad's HDTV (via HDMI) successfully with the HDTV as the "extended" monitor. However, if you set up both screens as CLONED screens, you'll have issues if the resolutions of the two screens don't match.

If you use an HDTV it might be best to temporarily set the HDTV as the MAIN monitor and disable your computer monitor completely as a second screen, but that's something you'll need to experiment with.

Summary:
If you continue to have problems, use VGA + 3.5mm audio instead. You SHOULD be able to get HDMI working, though. You may see performance drops in games with two screens running, so use just the HDTV as the main screen and disable the computer screen output.

(If you have audio working, you either have a TV that supports a PC-HDMI input, onboard graphics, or a LAPTOP. Laptops with HDMI outputs, unlike desktops, just work properly with sound, whereas desktop cards only have dedicated audio decoders for movie audio like AC3. AFAIK, none of the add-on graphics cards allow game sounds through their HDMI outputs. So if you have a DESKTOP with an ADD-ON GRAPHICS CARD and wish to game on an HDTV, you'll either need to use the PC input if it has one, or use HDMI for video but run audio through a RECEIVER from your onboard audio or add-on sound card.)

*Sorry to get long-winded but this problem is actually pretty technical.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
I appreciate the long response, but I'm not sure what that has to do with my problem. Sorry if I'm misunderstanding, but nordlead put it correctly:

To sum up in bullets.

* His TV is 1920x1080
* His computer (Control Panel -> Display) thinks the max resolution is 1280x720 via HDMI.
* CCC will let him set the computer to 1920x1080
* The games go back to what the computer thinks is the correct resolution (1280x720)
* Using VGA he doesn't have that problem.


That ^ is my problem...
 
Let me have you try one simple thing:

1) CCC-> "My Digital Flat Panels" -> "HDTV Support"

2) check "add 1080p60 format... NTSC"

3) go to "Display Properties" and enable ONLY your HDTV as the main screen (your computer monitor should be off completely)

4) right-click the DESKTOP-> "screen resolutions" and choose the one you added ("1080p60 NTSC")

This should work. You MAY require some scaling but I doubt it.

*You don't need to go to your desktop; you can change the resolution properly in CCC as well. (I don't have an HDTV hooked up at the moment, so my own HDTV options aren't accessible.) When the HDTV is the only screen, OR it's the one highlighted if you have both screens selected, your Display Properties should show "1080p60 NTSC" as an option, along with "720p60 NTSC" and several others ("25" and "50" are PAL formats); a North American HDTV is usually 1080p60 NTSC. Always choose the highest format your TV supports (1080p60 NTSC). If the screen resolutions aren't in Display Properties for your TV, then they should be under the "HDTV Support" section.

If this doesn't work I'll help more.

I also suggest you print out my earlier (long) response as some of it may make more sense to you later on.
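One more way to double-check things: the list a full-screen game falls back to is whatever Windows itself has in its mode table for the display. If you happen to have Python installed, here's a rough diagnostic sketch (my own idea, not an official AMD tool) that uses the standard Win32 EnumDisplaySettings call to print every mode Windows thinks the primary display supports. If 1920x1080 isn't in that list, that's exactly why games keep dropping you back to 1280x720.

```python
import ctypes
from ctypes import wintypes

# DEVMODEW layout (display-device variant of the union), enough for mode enumeration.
class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32

def list_modes():
    """Print every mode Windows offers to full-screen applications (primary display)."""
    mode = DEVMODEW()
    mode.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    seen = set()
    # EnumDisplaySettingsW(NULL, i, ...) walks the primary display's mode table.
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
        key = (mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency)
        if key not in seen:
            seen.add(key)
            print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} @ {mode.dmDisplayFrequency}Hz")
        i += 1

if __name__ == "__main__":
    list_modes()
```

If 1920x1080 does show up in that list, the games should be able to use it, and the problem is somewhere else (driver or CCC profile).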
 
720p60 NTSC:

This is another option. It will output 1280x720 instead of 1920x1080. Your graphics card may not be powerful enough to keep frame rates high at 1920x1080.
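(For rough numbers: 1920x1080 is about 2.07 million pixels per frame, while 1280x720 is about 0.92 million, so at 1080p the card is pushing roughly 2.25x as many pixels every frame.)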

However, for Blu-ray you'd want to be using 1920x1080. Compressed Blu-ray would probably look about the same at either resolution.

VGA + 3.5mm:
Also, I believe you said you could use the VGA input? This might be your best option. It would work EXACTLY like a monitor. Again, disable your second monitor or games won't run as well.

You can get a 3.5mm M->M audio cable from Monoprice for $4.

*Blu-ray is not supported through VGA.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
Adding the HDTV support format through CCC doesn't seem to work. I add it, but it never shows up in my desktop screen resolutions. What do you mean by enabling only my HDTV as the main screen? It is the only screen hooked up; there are no other screens.
 
I assumed you were using a desktop computer with its own screen and adding the HDTV as a second screen.

I'm still confused as to your problem. Can you answer these questions?
1. Is the HDMI input you are using a normal HDTV input, or is it one of the special HDMI-PC inputs designed for computers?
(Your manual will tell you. If it's PC-HDMI it won't work with a Blu-ray or HDMI DVD player, only a computer.)

2. What resolution options do you have in the Catalyst Control Panel?
a) 640x480... up to 1920x1080, OR
b) 720p60 NTSC... 1080p60... etc
(Desktop Management->Desktop Properties)

3. What is the "Panel Information" listed?
(My digital flat panels->Properties)

4. What is the exact model of your HDTV?
(i.e. Sony KDL40BX420)

Audio:
If this is an add-on graphics card and NOT integrated graphics, you should not get game sounds over HDMI. You'd need a separate receiver, or you'd have to use the VGA graphics + 3.5mm audio option. Either way you'd be using your onboard or add-on sound card, as HDMI audio only works for specific movie audio codecs like Dolby Digital (AC3) and a few others.
 
Update:
Try using this option and see if it makes a difference (if you can access it with HDMI):

"enable GPU scaling"

(My Digital Flat Panels-> Properties)

*I had a GLITCH where I couldn't enable this unless I changed my screen resolution. I did so, enabled it, then changed my screen resolution back and it stayed.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
#1 - It's a normal HDMI input.

#2 - Catalyst Control Center tells me I can go from 640x480 all the way up to 1920x1080, with many options in between. Right-clicking my DESKTOP (nothing to do with CCC), clicking Properties and going to the Settings tab, the maximum resolution I can select is 1280x720.

#3 - Panel information: Display Name SONY TV, maximum reported resolution 1920x1080, maximum reported refresh rate 60 Hz.

#4 - Sony KDL-40S4100

I have no problem with audio; I never mentioned anything about audio.

Enabling GPU scaling does nothing; it still does the same thing... which brings me to trying to explain what the problem is. Maybe I'm not explaining this very well, but I'm going to try again.

Point #1 - Windows tells me the max resolution I can set is 1280x720. Right-clicking my desktop, selecting Properties and looking at the Settings tab, that is the highest setting I can choose.

Point #2 - BUT the Catalyst Control Center program that comes with my AMD video card will let me set my screen resolution all the way up to 1920x1080. I like it that way, so I choose it. Now my whole computer looks great in 1920x1080. Fantastic, this is great... but then...

Point #3 - I open a video game to play in all the glory of 1920x1080, and I'm very excited to see the game in such high definition. When the game starts, my screen automatically reverts to 1280x720 without me changing ANYTHING.

Point #4 - I now exit the game and my desktop is BACK to 1280x720, yet before starting the game my desktop was at 1920x1080, and I NEVER changed it back to the lower setting.

Why is it changing? I'm not changing it. Windows is automatically changing it back DOWN to what IT thinks is the highest resolution I can use. Windows is wrong, since CCC lets me use a higher one. As long as I don't open any full-screen applications it will stay at 1920x1080; as soon as I do, Windows demotes me back down.

I'm sorry if I'm not explaining this well, but that seems to be the best way I can put it. Also, nordlead's bullet points were dead on.
 
I'm not sure what's happening. It should work fine.

I doubt it would matter, but you could try all three HDMI inputs.

To be clear, the video signal you should be choosing is:

1080p60(NTSC)

Never use the plain desktop resolutions; they are for normal monitors, or for when you use the VGA input (which then makes your TV a monitor). You absolutely MUST choose the format 1080p60 NTSC and NOT simply "1920x1080" on your desktop.

It sounds like some sort of glitch.
 

2koolpa

Distinguished
Sep 26, 2011
7
0
18,510
Yeah man, it even happens if I choose "1080p60 NTSC" specifically. And I've tried all 3 HDMI ports, but it's all the same. I appreciate you trying to help me; I just have no idea why this keeps happening.
 
You could try using the VGA video + 3.5mm stereo audio option. See if the VGA-PC option supports 1920x1080.

This option will treat your HDTV as a computer monitor, allowing you to change to any resolution from 640x480 up to 1920x1080. There's a little circuit board in there that handles all the scaling, then hands the 1920x1080 signal to the rest of the HDTV.

Quality should look the same as DVI/HDMI. The only drawback is that VGA doesn't support HDCP, which really only affects Blu-ray playback from your PC.

(I still don't understand how you are getting AUDIO to your television for games. It's working through the HDMI cable for a regular desktop computer?)
 

kevin324

Reputable
Sep 21, 2014
1
0
4,510
I also have a problem with my desktop not recognizing the speakers on the TV, and the resolution keeps changing every time it powers on. My computer runs Windows 7, the graphics card is an NVIDIA GeForce 9400, connected with an HDMI cable. I've tried uninstalling and reinstalling the graphics card drivers and tried to reconfigure the simulated resolution settings. But I noticed that when I turn on the computer first and then plug it into the monitor, the resolution doesn't change. Why is that?
 

Dono N

Reputable
Sep 25, 2014
1
0
4,510
Yeah, I have been battling this for a long time. For me it all started when I upgraded from Vista Ultimate 64 to Windows 7 Ultimate 64. I had been using my desktop monitor and adding on my LED TV when I wanted to watch stuff on the big screen with the family. As soon as I went to 7, Windows decided I wasn't smart enough to set up and run my monitors the way I wanted. It kept switching things, turning off my desktop monitor and making my TV the main display, and then it quit detecting the desktop monitor altogether. I would have to jump through lots of hoops to fix it, then five minutes later it would auto-detect and do it to me all over again.

After all the research and posting on tech forums and Windows support, it turns out it is just Windows 7 and its monitor auto-detection; apparently they don't see it as a problem since they want everyone to switch to Windows 8 now.

Sadly, an official fix isn't likely to come; our only hope is a patch from a third party or a workaround using the registry or drivers.