
Geforce2 MX and DVI Problem

Anonymous
August 1, 2005 11:23:50 PM

Archived from groups: alt.comp.periphs.videocards.nvidia

Hi all - I'm nearly at my wit's end here . . . any help would be
greatly appreciated!

I'm having problems trying to display SXGA (1280x1024) through the DVI
output from my Nvidia GeForce2 MX video adapter.

I have a Sony Vaio with an Nvidia GeForce2 MX (w/32MB video memory)
running Windows ME (don't ask).

I recently upgraded from my Samsung 1024x768 display to a new BenQ
FP17E+ which runs at a native resolution of 1280x1024. It has both VGA
analog and DVI inputs.

I was running my Samsung monitor via the DVI output of the display
adapter (after having updated the driver way back when). I attached
the new BenQ monitor and installed the monitor's driver (profile) that
came on CD. The system seems to correctly recognize the new monitor
(as reported by the display control dialog).

The Problem
-----------
When I run the display via the *analog* VGA output of the display
adapter, it works fine (I can go up to the full 1280x1024 resolution).
However, when I use the display settings to select the DVI output of
the adapter, the native 1280x1024 resolution is not available for
selection.

So, I updated the driver to the latest Forceware driver, version 77.72.
It has the same issue.

Then, I tried installing an earlier "Detonator" driver, version 30.82,
and that one actually showed 1280x1024 as an available resolution.
However, when I selected it and it ran the 15-second screen test, the
screen went blank.

So, now I am left having to drive the display via the VGA analog
output, which is maybe not the worst thing in the world, but I'd like
to get the DVI output working.

Any advice would be greatly appreciated.

Thanks,

-MB


Anonymous
August 2, 2005 10:16:54 AM

Archived from groups: alt.comp.periphs.videocards.nvidia

<somewildmonkey@yahoo.com> wrote in message
news:1122949430.614597.34120@g43g2000cwa.googlegroups.com...
> Hi all - I'm nearly at my wit's end here . . . any help would be
> greatly appreciated!
>
> I'm having problems trying to display SXGA (1280x1024) through the DVI
> output from my Nvidia GeForce2 MX video adapter.
>
> I have a Sony Vaio with an Nvidia GeForce2 MX (w/32MB video memory)
> running Windows ME (don't ask)
>
> I recently upgraded from my Samsung 1024x768 display to a new BenQ
> FP17E+ which runs at a native resolution of 1280x1024. It has both VGA
> analog and DVI inputs.
>

It takes a 64 meg card to do 1280 X 1024 X 32 bit color (40 meg minimum /
frame). You were almost at the max for that card when you were running
1024 X 768 X 32 bit color (24 meg).
August 2, 2005 5:44:23 PM

Archived from groups: alt.comp.periphs.videocards.nvidia

"PhxGrunge" <noname@qwest.net> wrote in message
news:hnKHe.225$yr4.1287@news.uswest.net...
>
> <somewildmonkey@yahoo.com> wrote in message
> news:1122949430.614597.34120@g43g2000cwa.googlegroups.com...
>> Hi all - I'm nearly at my wit's end here . . . any help would be
>> greatly appreciated!
>>
>> I'm having problems trying to display SXGA (1280x1024) through the DVI
>> output from my Nvidia GeForce2 MX video adapter.
>>
>> I have a Sony Vaio with an Nvidia GeForce2 MX (w/32MB video memory)
>> running Windows ME (don't ask)
>>
>> I recently upgraded from my Samsung 1024x768 display to a new BenQ
>> FP17E+ which runs at a native resolution of 1280x1024. It has both VGA
>> analog and DVI inputs.
>>
>
> It takes a 64 meg card to do 1280 X 1024 X 32 bit color ( 40 meg minimum /
> frame). You were almost at the max for that card when you were running
> 1024 X 768 X 32 bit color (24 meg).
>
Not so - 1280x1024x4 (4 bytes per pixel, not 32 bytes) requires just over
5MB. Don't know why DVI won't work, but it's not lack of video memory.
Regards,
Steve.
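Steve's figure is easy to verify: at 32-bit colour each pixel takes 4 bytes (not 32), so one 1280x1024 frame is 1280 x 1024 x 4 bytes, which is exactly 5 MB. A quick sketch of the arithmetic (Python, for illustration only):

```python
# Framebuffer memory needed for one frame at a given resolution and colour depth.
def frame_bytes(width, height, bits_per_pixel=32):
    # 32 bits per pixel = 4 bytes per pixel
    return width * height * (bits_per_pixel // 8)

MB = 1024 * 1024

# 1280x1024 @ 32 bpp: 5 MB per frame -- well within a 32 MB card,
# even allowing for double buffering and a depth buffer.
print(frame_bytes(1280, 1024) / MB)  # 5.0

# 1024x768 @ 32 bpp: 3 MB per frame (not 24 MB as claimed above).
print(frame_bytes(1024, 768) / MB)   # 3.0
```

So the 32 MB card has memory to spare at 1280x1024; the DVI limit must lie elsewhere (early GeForce2 MX boards used external single-link TMDS transmitters whose supported modes depended on the board vendor and driver).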