Why 'Default Monitor'?

Guest
Archived from groups: microsoft.public.windowsxp.help_and_support

My previous monitor has a hardware problem, so I've connected an old IIyama
VisionMaster Pro 400 instead. But it appears in Display
Properties > Settings and in Device Manager > Monitors as 'Default
Monitor'. It seems to be working OK, but presumably not at its optimum.
Why doesn't Windows XP recognise it, and how can I get it to do so, or
change it manually, please?

--
Terry, West Sussex, UK
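
For anyone who wants to confirm what Windows thinks the attached monitor is without clicking through Device Manager, the same information is exposed through WMI's Win32_DesktopMonitor class. This is only an illustrative sketch: it assumes Python with the third-party `wmi` package (which sits on top of pywin32) is installed, neither of which is part of the original thread.

```python
# List the monitors Windows knows about and the name reported for each.
# Assumes the third-party "wmi" package (and pywin32) is installed.
# Win32_DesktopMonitor is a standard WMI class, available on Windows XP.
import wmi

conn = wmi.WMI()
for mon in conn.Win32_DesktopMonitor():
    # A monitor running on the generic driver typically shows up here
    # as "Default Monitor" or "Plug and Play Monitor".
    print("Name:         ", mon.Name)
    print("Type:         ", mon.MonitorType)
    print("Manufacturer: ", mon.MonitorManufacturer)
    print("PNP device ID:", mon.PNPDeviceID)
    print("-" * 40)
```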
 
Guest
Archived from groups: microsoft.public.windowsxp.help_and_support

OK, thanks, I managed to sort this out a while later. The key step I'd
missed in the Update Driver dialog was to uncheck the 'Show
compatible devices' box. That seems rather counter-intuitive to me, as I'd
have thought I *did* want XP to use 'compatible' monitors! But anyway,
on unchecking that, a full list appeared, from which I selected my
IIyama VisionMaster.

Can't say it's made any visible difference (still using the same
resolution and the same 85 Hz refresh rate), but maybe there's some subtle
improvement?

--
Terry, West Sussex, UK
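
One way to check whether picking the correct driver changed anything is to read back the display mode Windows is currently using. The driver's main job is to tell Windows which resolutions and refresh rates the monitor can handle, so the current mode is the thing to compare before and after. A minimal sketch, assuming Python with the pywin32 package (win32api / win32con) is available; none of this is part of the original thread.

```python
# Print the display mode currently in use, to compare before and after
# installing the proper monitor driver.
# Assumes the pywin32 package (win32api / win32con) is installed.
import win32api
import win32con

mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print("Resolution:   %d x %d" % (mode.PelsWidth, mode.PelsHeight))
print("Colour depth: %d-bit" % mode.BitsPerPel)
print("Refresh rate: %d Hz" % mode.DisplayFrequency)
```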
 
Guest
Archived from groups: microsoft.public.windowsxp.help_and_support

Your monitor is probably too old to send configuration data back to the video
card. In the old days you had to add the monitor's driver manually to get
the optimal display, which is exactly what you've just done.
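
The "configuration data" here is the EDID block a monitor sends to the video card over the DDC lines of the cable; when nothing comes back, Windows falls back to the generic 'Default Monitor' entry. Windows keeps any EDID it has received in the registry, so you can check whether the monitor ever reported one. A rough, standard-library-only sketch in Python, offered as an assumption-laden illustration rather than anything from the original thread; the registry path is the usual one on XP and later.

```python
# Check whether Windows has cached an EDID block for each attached monitor.
# EDID is the configuration data a monitor sends back over DDC; very old
# monitors (or bad cables/adapters) don't provide it, which is when Windows
# falls back to "Default Monitor".
import winreg  # named "_winreg" on the Python 2.x builds that ran on XP

DISPLAY_ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_ROOT) as display:
    model_index = 0
    while True:
        try:
            model = winreg.EnumKey(display, model_index)
        except OSError:
            break  # no more monitor models recorded
        model_index += 1
        with winreg.OpenKey(display, model) as model_key:
            instance_index = 0
            while True:
                try:
                    instance = winreg.EnumKey(model_key, instance_index)
                except OSError:
                    break  # no more instances of this model
                instance_index += 1
                try:
                    with winreg.OpenKey(
                        model_key, instance + r"\Device Parameters"
                    ) as params:
                        edid, _type = winreg.QueryValueEx(params, "EDID")
                        print("%s\\%s: EDID present (%d bytes)"
                              % (model, instance, len(edid)))
                except OSError:
                    print("%s\\%s: no EDID recorded" % (model, instance))
```

If no EDID value turns up for the IIyama's entry, that is consistent with the explanation above: with nothing reported back, XP can only offer conservative defaults until the model's own driver (.inf) is installed by hand.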