Archived from groups: alt.comp.periphs.videocards.ati
I have a Radeon 9200 SE card installed on Windows XP SP2 and two Philips LCD
screens (190S5CB) connected.
Both screens have only VGA input. The card however, has one VGA and one DVI
(DVI-I I guess...) output.
As I've read on many newsgroups, a lot of people, myself included, have
quality issues with DVI-to-VGA adapters on their LCD screens.
From what I've read, DVI-I or DVI-A should give slightly better quality,
despite the conversion to analogue VGA.
However, my second screen (the one on DVI) is rather blurry — enough to give
you a headache if you try to read on it for more than five minutes.
Could this be the card, i.e. the onboard DVI-to-VGA circuitry not doing its
job very well? Or something else?
I've been looking for dual-head VGA cards, but can only find the Matrox G450
(which I feel is a bit basic).
So, second question: if the card is the culprit, what new card should I look
for? It has to be dual VGA. I do no gaming, but quite a lot of 2D design and
video editing in Premiere Pro, and just the occasional bit of 3D work in
3ds Max.
Note the secondary display gives slightly worse image quality.
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."
"Tim" <email@example.com> wrote in message
> I know the converter just 'reformats' the connector, but I thought that it
> could be responsible for some quality loss.