Nvidia 5700 Ultra and 1920 x 1200 resolution?

Archived from groups: alt.comp.periphs.videocards.nvidia

I have the Nvidia 5700 Ultra 128MB and just purchased a 23" Sony monitor
which has 1920 x 1200 resolution. My previous monitor was a 19" and the DVI
connection was superb with razor sharp fonts.
The problem is that to get 1920 x 1200 on the new monitor I had to use the VGA
cable, and the difference in quality is noticeable enough that I am seriously
thinking of returning the monitor. The strange thing is that before purchasing
the new monitor, as a test, I changed the resolution on my old monitor to
1920 x 1200 and it worked; when I connected the Sony, however, it would not
accept that resolution over the DVI connection!
Someone suggested hacking the monitor's .inf file so that I could keep the DVI
cable and get the higher resolution over the digital connection, but I do not
know how to do it and would appreciate any feedback or suggestions on whether
there is a way to bypass the 1600 x 1200 DVI barrier.
  1.

    You need to hack your monitor's .inf file. (Make sure you have a backup of
    your original .inf file first.)

    Open up your .inf with Notepad (or any text editor).

    Change every "1600" in the file to "1920" (this assumes your monitor
    already supports DVI @ 1600x1200); if it does not, you will have to change
    the "1024" entries to "1200" as well. Save the file.

    Then reinstall the monitor with this new .inf file.
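
    For anyone who would rather script the edit, here is a minimal sketch of
    the same substitutions in Python; the file name is hypothetical, so point
    it at wherever your monitor's .inf actually lives:

        from pathlib import Path
        import shutil

        inf = Path("sony_monitor.inf")  # hypothetical name; use your own .inf
        shutil.copy(inf, inf.with_suffix(".bak"))  # keep a backup of the original

        text = inf.read_text()
        text = text.replace("1600", "1920")  # raise the listed horizontal maximum
        # Uncomment only if your .inf doesn't already list 1600x1200 over DVI:
        # text = text.replace("1024", "1200")
        inf.write_text(text)
        # Afterwards, reinstall the monitor pointing at this edited .inf.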

    I can't guarantee this will work, but it did work for me and my Samsung
    240T, which also has (had) this same problem.

    The reason for this problem with DVI at resolutions beyond 1600x1200 is
    that the DVI spec does not officially support them: a single DVI link tops
    out at a pixel clock of roughly 165 MHz, and 1920x1200 @ 60 Hz with
    standard blanking needs more than that. Sad but true.
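
    To put rough numbers on that (a quick check in Python; the mode totals
    below are approximate CVT timings, so the exact figures vary by monitor):

        def pixel_clock_mhz(h_total, v_total, refresh_hz):
            # Pixel clock in MHz for a mode with the given total timings.
            return h_total * v_total * refresh_hz / 1e6

        # 1920x1200 @ 60 Hz, standard (CRT-style) blanking: ~2592x1245 total
        print(pixel_clock_mhz(2592, 1245, 60))  # ~193.6 MHz, over the ~165 MHz cap

        # Same mode with reduced blanking: ~2080x1235 total
        print(pixel_clock_mhz(2080, 1235, 60))  # ~154.1 MHz, fits single-link DVI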

    A side note: make sure you have the current nVIDIA drivers, because at one
    time the drivers themselves wouldn't output DVI @ 1920x1200 no matter what
    the monitor's .inf told the computer it could handle. I have read that the
    drivers no longer have this limitation.

    Regards,
    Jon

    "Vassik" <vassi12nospam@comcast.net> wrote in message
    news:4%jfd.316449$3l3.257172@attbi_s03...
    >I have the Nvidia 5700 Ultra 128MB and just purchased a 23" Sony monitor
    >which has 1920 x 1200 resolution. My previous monitor was a 19" and the DVI
    >connection was superb with razor sharp fonts.
    > The problem is that to be able to the 1920 x1200 resolution on the new
    > monitor, I had use the VGA cable and the difference in quality is quite
    > noticeable to the point that I am seriously thinking of returning the
    > monitor. The strange thing is that before purchasing the new monitor, as
    > a test I changed the resolution on my old monitor to 1920 x 1200 and it
    > worked, however when I connected the Sony it was not possible to accept
    > the resolution under the DVI connection!
    > Someone had suggested the option of being able to hack the monitor's inf
    > file so to be able to connect the DVI cable and get the higher resolution
    > and digital connection but I do not know how to do it and would appreciate
    > any feedback or suggestions if there is a way to bypass the 1600 x 1200
    > DVI barrier.
    >
  2.

    Thanks so much, I shall try this and report back. I have the latest NVIDIA
    drivers. I do appreciate this feedback.


    "Jon Cortelyou" <jcortel@notsxpamno.hotmail.com> wrote in message
    news:mIlfd.287$zx1.87@newssvr13.news.prodigy.com...
    > You need to hack your monitor's .inf file. (make sure you have a backup
    > of your original .inf file)
    >
    > Open up your .inf with notepad (or any text editor)
    >
    > Change all "1600" in the file to 1920. (assuming that your monitor can
    > support DVI @ 1600x1200) if not then you'll have to change the "1024" to
    > 1200 as well. Save the file.
    >
    > Then reinstall the monitor with this new .inf file.
    >
    > I can't guarantee this will work. It did work for me and my Samsung 240t
    > which also has (had) this same problem
    >
    > The reason for this problem with DVI at resolutions beyond 1600x1200 is
    > the DVI spec does not officially support them. This is because the
    > bandwidth (~165 Mhz) required for 1920x1200 and higher is more than a
    > single DVI port can handle @60Hz anyway. Sad but true.
    >
    > A side note is to make sure you have the current nVIDIA drivers because I
    > know at one time the drivers themselves wouldn't output DVI @ 1920x1200 no
    > matter what the monitor .inf tells the computer it can handle. I have
    > read that the drivers do not have this limitation anymore.
    >
    > Regards,
    > Jon
    >
    > "Vassik" <vassi12nospam@comcast.net> wrote in message
    > news:4%jfd.316449$3l3.257172@attbi_s03...
    >>I have the Nvidia 5700 Ultra 128MB and just purchased a 23" Sony monitor
    >>which has 1920 x 1200 resolution. My previous monitor was a 19" and the
    >>DVI connection was superb with razor sharp fonts.
    >> The problem is that to be able to the 1920 x1200 resolution on the new
    >> monitor, I had use the VGA cable and the difference in quality is quite
    >> noticeable to the point that I am seriously thinking of returning the
    >> monitor. The strange thing is that before purchasing the new monitor, as
    >> a test I changed the resolution on my old monitor to 1920 x 1200 and it
    >> worked, however when I connected the Sony it was not possible to accept
    >> the resolution under the DVI connection!
    >> Someone had suggested the option of being able to hack the monitor's inf
    >> file so to be able to connect the DVI cable and get the higher resolution
    >> and digital connection but I do not know how to do it and would
    >> appreciate any feedback or suggestions if there is a way to bypass the
    >> 1600 x 1200 DVI barrier.
    >>
    >
    >
  3.

    Jon Cortelyou wrote:
    > A side note: make sure you have the current nVIDIA drivers, because at
    > one time the drivers themselves wouldn't output DVI @ 1920x1200 no matter
    > what the monitor's .inf told the computer it could handle. I have read
    > that the drivers no longer have this limitation.

    Only with the standard blanking interval, a relic of the CRT age. The 240T
    doesn't support a reduced blanking interval (so the refresh rate drops to
    52 Hz), but other 1920x1200 TFTs do (a quick check of that figure follows
    below).

    Denis
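
    A quick check on Denis's ~52 Hz figure, using the same approximate
    standard-blanking totals as above: the ~165 MHz single-link cap allows

        print(165e6 / (2592 * 1245))  # ~51.1 Hz at 1920x1200, standard blanking

    which is about the refresh rate he mentions.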
  4.

    "Denis Freund" <denisfreund@cityweb.de> wrote in message
    news:2u7vcpF27noefU2@uni-berlin.de...
    > Jon Cortelyou schrieb:
    >> A side note is to make sure you have the current nVIDIA drivers because I
    >> know at one time the drivers themselves wouldn't output DVI @ 1920x1200
    >> no matter what the monitor .inf tells the computer it can handle. I have
    >> read that the drivers do not have this limitation anymore.
    >
    > Only with standard blanking intervall, a relict of the CRT age. The 240T
    > doesen´t support a lower blanking intervall (so it goes down to 52Hz) but
    > other 1920x1200 TFTs do so.
    >
    > Denis


    Thanks for the reply. Another monitor I am considering (because it looks
    like I may have to return this one) is the Apple 23". Do you know if this
    option might work on that one?
    Thanks so much