
Dual monitors enabled with different resolutions (Radeon HD 4850)

Graphics & Displays
April 7, 2009 10:56:08 PM

I have two monitors, each connected via DVI. One is 1680x1050, the other is 1920x1080. I want both active at the same time (cloned, but each at its own resolution). With cloning I get both active, but both run at 1680x1050. If I use extended mode, I can set the resolutions separately and that works, but I can't get extended mode to have both displays active simultaneously the way cloning does. I just get wallpaper on the secondary display, though I can hotkey to switch between them. Is there any way to have both displays active together, but with independent resolutions?
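(A rough sketch of why clone mode snaps both displays to 1680x1050: classic cloning drives both heads from one framebuffer, so the driver can only offer modes both displays accept, then picks the largest. The mode lists below are made up for illustration; real ones come from the driver/EDID.)

```python
# Hypothetical mode lists for illustration; real lists come from each display's EDID.
modes_a = [(1280, 1024), (1680, 1050)]                  # the 1680x1050 panel
modes_b = [(1280, 1024), (1680, 1050), (1920, 1080)]    # the 1080p panel

# Classic clone scans out one framebuffer to both heads, so only
# modes common to both displays are available; the driver picks the largest.
common = set(modes_a) & set(modes_b)
clone_mode = max(common, key=lambda m: m[0] * m[1])
print(clone_mode)  # (1680, 1050) — exactly the behavior described above
```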
April 8, 2009 2:23:22 AM

Nope, because cloning displays is exactly that: everything is identical, even the resolution. Why clone them, though? If you're able to see both monitors, why not utilize the real estate of both? And if you're only able to see one monitor, why not just use that one?
April 8, 2009 2:57:02 AM

I agree with boonality, rochenyguy. You'll have to disable cloning so that you can set the resolution of each monitor individually. You've got two monitors, man, maximize 'em! :)
April 9, 2009 9:20:04 AM

r_manic said:
I agree with boonality, rochenyguy. You'll have to disable cloning so that you can set the resolution of each monitor individually. You've got two monitors, man, maximize 'em! :)


The two monitors are in separate rooms. The main monitor is in the 'office', right next to the desktop (26" 1080i). The second monitor is a 46" Sony 1080p LCD TV in the living room, connected via a 35" DVI-to-HDMI cable through a wall. There's also an extension cable through the wall connected to a wireless mouse and keyboard in the room with the Sony TV/monitor.

I use both monitors/setups as full-function systems for accessing and using the desktop, playing videos, Word, Excel, Outlook, etc., so I don't want to just extend certain apps to the second display. I'm doing it now as cloned, but only get 1080i on the Sony. Sounds like my only solution is to replace the 26" 1080i with a 1080p-capable alternative, or just live with both running in 1080i. Not a big deal to live with, although playing 1080p video from the desktop to the living room Sony TV would be great. Since I use the second monitor to access all apps, not just to play video, extending just the video apps to the second monitor really doesn't do what I need.

Thanks for the feedback.
April 13, 2009 1:40:15 AM

Ahhh... I assume you use a wireless mouse and keyboard to control everything? The only thing I can think of is to not clone (so that both displays remain resolution-independent). Then you can move your video player's window so that it appears on the Sony TV, and maximize it.

My experience is that maximized windows fill the display they're "mostly" on. I know you'll probably be left with a title bar and status bar, so I hope someone else here has a better solution.
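(That "mostly on" rule can be sketched as picking the monitor with the largest overlap with the window's rectangle. This is a simplified illustration with made-up desktop geometry, not Windows' actual code.)

```python
def overlap_area(a, b):
    # Rectangles as (left, top, right, bottom); zero if they don't intersect.
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def owning_monitor(window, monitors):
    # The window "belongs" to the monitor holding the most of its area;
    # maximizing then fills that monitor.
    return max(monitors, key=lambda m: overlap_area(window, m))

# Hypothetical extended desktop: 1680x1050 primary, 1920x1080 TV to its right.
primary = (0, 0, 1680, 1050)
tv      = (1680, 0, 3600, 1080)
window  = (1500, 100, 1900, 500)   # straddles the seam, mostly on the TV
print(owning_monitor(window, [primary, tv]))  # (1680, 0, 3600, 1080)
```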
September 4, 2009 11:23:32 AM

I have the same problem.
One big LCD TV screen that I want to use as my main monitor at its full resolution (1920), but the max resolution on the laptop is (1366).

Is there no utility to do this? Like a small program that can transfer the desktop to the big screen with the different resolution? :( 
September 7, 2009 1:05:37 PM

The only way I can think of getting around your problem, if you NEED it cloned, is to put an extension between the monitor and the video card (might even have to change it to analog). This keeps Windows from detecting the type of display, and should allow you to set a higher resolution. (Of course, this means the resolution will be higher than the monitor's, so you will have to scroll.) But if you set it to 1920x1080 and then clone it to the TV, you should get the results you want. As everyone else said, your best bet would be to forget cloning: extend the desktop, and then when you know you're going to the living room, just change that to your default monitor.
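(To show what that scrolling means numerically: with a virtual desktop larger than the panel, the driver pans a viewport around and clamps it to the desktop edges. The sizes below are made up for illustration.)

```python
# Hypothetical sizes: a 1366x768 panel panning over a 1920x1080 virtual desktop.
virtual_w, virtual_h = 1920, 1080
panel_w, panel_h = 1366, 768

def clamp_pan(x, y):
    # Keep the panel's viewport fully inside the virtual desktop,
    # so the mouse "pushes" the view around but never past the edges.
    max_x, max_y = virtual_w - panel_w, virtual_h - panel_h
    return min(max(x, 0), max_x), min(max(y, 0), max_y)

print(clamp_pan(-50, 2000))   # (0, 312) — clamped to top-left / bottom limits
```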
September 7, 2009 2:46:05 PM

Well, the way I solved the problem: on my laptop (and nowadays on most laptops) there is a combination of buttons, Fn+F4 on mine, that changes modes.
It makes the video card switch from one monitor to the other; in other words, it enables one of the monitors while disabling the other, or enables them both.
This gave me the opportunity to change the resolution on my second monitor (HD TV) to 1920x1080.
Now I use my HD TV as my monitor.

peace
Anonymous
October 29, 2009 7:07:40 PM

This is so disturbing... I've got the same issue, but didn't have it with my old graphics card. I could clone the picture from my monitor to the TV, and my monitor would be 1680x1050 and my TV 640x480, or whatever the TV can produce at best. I was so happy with that working, and since then I've had a Radeon 4850 card which failed to accomplish that, and now an nVidia GTX280 WHICH ALSO FAILS! If this is intended, if it's not a bug, I'm considering switching back to my old graphics card or whatever. I can't believe they'd take such a huge leap backwards.
Anonymous
January 22, 2010 1:25:38 AM

Yeah, I'm having a similar issue and it is driving me crazy. I could clone monitors just fine without changing resolutions on my old computer, but this one has to change my main monitor to a hideous resolution. Why is everybody so excited about the idea of extending the display onto another monitor? The reason I want to clone displays is so I can watch movies and the like on my TV, but otherwise have the TV off or set to a different input. I hate the extended display; it's annoying because every time I want to make a change or open something new, I have to get up, go back to my computer so I can see the other display, and do it there. Plus, I want to keep the two monitors connected so I'm not always fiddling with cables, but that doesn't mean I always want my TV on and set to PC input; yet when I try to open new windows, they open by default on the secondary monitor, which means I can't see them! It's irritating as hell, and what makes it all the more irritating is that I know this need not be a problem, since it worked fine on my last machine.

Not really looking for a fix here, cuz I've busted my butt looking and I don't think there is one. I'm really just venting :fou: 
February 12, 2010 1:17:14 AM

Try Ultramon or Multimon, and please let us know if it fixes the problem.
February 27, 2011 1:45:48 PM

OK, so I'm having the same problem, as I presume many are these days. Hard to imagine SOMEONE has not solved this yet.

Here's the issue, recapped a bit but in terms that I think make more technical sense than some outlines.

Monitor 1 is an HDTV capable of full HD - 1080p display.

Monitor 2 is a display with a maximum resolution something less than 1920 x 1080 - it will not display full 1080p. In my case, I'm working with an old-school TV on a scan converter, so in theory the max "resolution" is 640 x 480.

I want the same output on both screens. This is called "clone" or sometimes "mirror" mode. I do NOT want to "extend my desktop" to the second screen, because I want whatever is playing back on Monitor 1 to also be playing back on Monitor 2.

With every video card I've tried (ATI and Nvidia), if I "extend" my desktop to Monitor 2 I can then change the resolution of Monitor 2 to whatever I want, completely independently of the settings for Monitor 1. In this mode, however, whatever I'm showing on Monitor 1 does NOT show on Monitor 2. Sure, I can drag it over there, but it's "either/or", not simultaneous on both.

On the other hand, if I "clone" Monitor 1 to Monitor 2, I get exactly what I want - the same content on both screens - but since that content is being displayed at 1920 x 1080, only a small portion of the content is visible on Monitor 2... Not watchable.
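(To put a number on "only a small portion": an unscaled 1920x1080 clone viewed through a 640x480 output shows under 15% of the desktop's pixels. A quick back-of-the-envelope check:)

```python
src_w, src_h = 1920, 1080   # the desktop resolution being cloned
dst_w, dst_h = 640, 480     # what the scan-converted TV can display

# With no downscaling, the TV shows a 640x480 crop of the 1920x1080 frame.
visible_fraction = (dst_w * dst_h) / (src_w * src_h)
print(f"{visible_fraction:.1%} of the desktop is visible")  # about 14.8%
```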

I've tried UltraMon; it's great, and it DOES what I want. However, it is too resource-intensive for the machine: it makes the video playback jerky and generally unwatchable for a normal TV or movie watching experience.

Secondary issue - just throwing this into the picture because it may be useful for someone else reading this with the same problem. Monitor 1 is connected via HDMI with the audio travelling on the same HDMI cable/interface. Monitor 2 is connected using VGA out going through a scan converter - which means NO audio is associated with it.

To get audio to the second monitor I use a program called VirtualCable, which allows me to send the same audio to two different audio interfaces - one being the HDMI audio, the second being the audio card built into the machine's motherboard. Only downside to that is a slight delay in audio, but I've found if I set that to 150ms in VirtualCable it's not noticeable to most people.
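(As a sanity check on that 150 ms figure, here is what the delay amounts to in audio frames at common sample rates; the rates are generic examples, not anything specific to VirtualCable.)

```python
delay_ms = 150  # the VirtualCable delay setting mentioned above

def delay_in_frames(sample_rate_hz, ms=delay_ms):
    # Number of audio frames one device lags behind the other.
    return sample_rate_hz * ms // 1000

print(delay_in_frames(44100))  # 6615 frames at CD rate
print(delay_in_frames(48000))  # 7200 frames at the common HDMI rate
```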

So... What I would LIKE to do is find a way to send the same desktop view (i.e. "clone" it) to two monitors, one with 1920 x 1080 resolution, the second with 640 x 480.
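(Whatever does that cloning has to downscale: preserving aspect ratio, 1920x1080 fits into 640x480 at one-third scale, leaving letterbox bars. A sketch of the arithmetic any such scaler would perform:)

```python
src_w, src_h = 1920, 1080   # source desktop
dst_w, dst_h = 640, 480     # second monitor

# Scale by the tighter constraint so nothing is cropped.
scale = min(dst_w / src_w, dst_h / src_h)                  # 1/3 here (width-limited)
out_w, out_h = round(src_w * scale), round(src_h * scale)  # 640 x 360
bar = (dst_h - out_h) // 2                                 # 60 px letterbox top and bottom
print(out_w, out_h, bar)
```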

At the same time, as a side effect, if this was also possible in a way that sent the same AUDIO to both video interfaces that would be wonderful...

Options I can see...

1) A single video card with two monitor outputs that allows independent resolution configuration of each output. If one output was HDMI and the other "something analog" (analog VGA, composite video, S-Video, etc.) that would be great for MY purposes, but two HDMI outs would work if it had to - I can get an external HDMI-to-analog converter.

2) Two video cards that would then allow the same thing as above.

3) An external device that would let me take a single HDMI out from the video card and "split it" into two HDMI signals. This would then require yet another external box to convert the second HDMI video and audio into analog signals for the second monitor. So it would be ideal if this device not only split the HDMI out, but also sent one of the split signals out over HDMI and the other out in some form of analog video and audio - but how much can one ask for?

Am I missing something here, has anyone found a solution for this?

And, I've got to point out, external devices (or internal video cards) that cost several hundred dollars are generally not an option for me....
February 27, 2011 6:07:25 PM

tmaddison said:

OK, so I'm having the same problem, as I presume many are these days. Hard to imagine SOMEONE has not solved this yet.

Monitor 1 is an HDTV capable of full HD - 1080p display.

Monitor 2 ... will not display full 1080p.

I want the same output on both screens. This is called "clone" or sometimes "mirror" mode.

To get audio to the second monitor I use a program called VirtualCable ...

Am I missing something here...?



This is called "HTPC+HDTV Cloning". Features include cloned logon screens, automatic switching of the OS's default audio device to the output nearest the display in use, and automatic dimming of the display not in use.

HTPC+HDTV Cloning is typically required where an office HTPC is in a different location than the lounge HDTV, and it is becoming more common as HDTV screen sizes get larger, DVI and DisplayPort cables get longer and cheaper, HTPC capability becomes standard in normal PCs, and streaming TV off the internet (free and open iTV) becomes more familiar.

The best solution so far is Actual Multi Monitors and Coastal Audio Changer, but the developers of these applications have not cooperated to allow information to be sent between their applications, so the two apps need to be used separately.

The best wireless mouse so far is the old fashioned technology in the Gyro Mouse, and the best Universal Remote is the RF version of the Harmony One, but both of these are soon to be displaced by Smartphones with Universal Remote Emulators.

GoogleTV is still trying, but failing, to fix PC + TV integration, and if they eventually succeed, the Android emulator may be the one that stays the most useful.

Hope this helps. Cloning different resolutions is a hardware DRM nightmare; it could open the door to easier Blu-ray ripping and to slinging videos between devices in ways that Hollywood finds threatening. I suspect the legal issues have stopped this obviously necessary functionality, and it will take a brave open source development team to tackle this one...
February 28, 2011 9:27:52 PM

Thanks, will give Actual Multi Monitors a try. UltraMon accomplishes the basic purpose of sending the same video to two different monitors at two different resolutions, BUT it's so resource-intensive that I can't get smooth video on my machine with it. Will see how Actual handles things.

Coastal just looks like an audio device switcher, but what I need is something that sends the same signal to two different audio devices simultaneously, not something that switches between them.

VirtualCable does that fine, the only downside being the slight delay. Not sure how they'd avoid that, though, since by nature they need to intercept the audio stream and process it in order to send it to two devices...

Thanks again! Todd