2 x DVI necessary?

Gabor

Hi all!

I am looking for a new video card and would like to have two LCD monitors on my system. Do I need to buy a dual-DVI card - like the Matrox Millennium P650 - or is it possible to use a card with 1x CRT and 1x DVI - like a Radeon 9600 XT - and plug one LCD monitor into the CRT output using a converter? Will I lose quality or refresh rate?

I do not need a very powerful card, as you can see.

Thanks,

Gab

 
Well, there are a few things.

First, what are you going to be doing with the card/monitors/computer (photo editing, video editing, or day-to-day surfing/Word/Excel, etc.)?

Second, do you already have the LCDs or are you going to buy them later? Do you NEED to connect two DVI-only LCDs, or are they hybrids (with both inputs)?

Third, there is no DB-15/VGA-to-DVI adapter that I know of (though I haven't really looked for one), so you won't be able to connect a DVI-only LCD to a VGA connector.

Now, DVI vs. VGA quality really depends on the setup, the quality of the parts, and how willing you are to calibrate things. DVI offers the best out-of-the-box image quality with few adjustments, and that's likely what you'll be looking for.
LCDs don't react well to noise, and this is another reason why DVI is good for LCDs.
One out-of-the-box advantage of VGA over DVI is refresh rate at high resolutions. DVI runs out of bandwidth quicker than VGA on most setups, and I don't think the entry-level Matrox cards are dual-link DVI. But for most situations you'll find that DVI will handle 1600x1200 @ 60 Hz without issue.
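As a rough sanity check on that bandwidth point (approximate numbers, using standard VESA timings - the exact figure depends on the blanking your card and monitor negotiate):

1600 x 1200 = 1,920,000 active pixels per frame
1,920,000 x 60 Hz = about 115 million pixels of actual image data per second
add the blanking overhead that standard timings require and you land around a 162 MHz pixel clock
single-link DVI tops out at a 165 MHz pixel clock, so 1600x1200 @ 60 Hz just squeaks in

Anything much higher than that (more resolution or more refresh) starts needing dual-link DVI or VGA.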

Doing any converting along the way will introduce additional 'noise' into the equation. DVI offers a cleaner signal all the way to the LCD. This really only makes a BIG difference for people doing photo editing, where the tolerances for colour and such are very tight compared to other applications. Even for video editing it won't matter as much.

The thing is that a dual-DVI card can be turned into dual VGA with adapters, but not vice versa (AFAIK).

If you are looking at an entry-level card, I posted this one the other day; you may find it a little cheaper and better suited to your needs, although nothing really beats Matrox for 2D quality, especially their implementation of DVI.

The HIS R9600 256 MB w/ dual DVI.

http://www.bit-tech.net/review/305/

nV also offers FX-series cards with dual DVI; I know XFX and BFG have/had some.

An FX5200 with dual DVI MAY be your cheapest 'current card' solution.

I would still recommend the ATI if you're doing image applications.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 

Gabor

Well, thanks a lot for this complete answer!

This system will be a DAW (digital audio workstation). After working for several years with just one monitor and switching from window to window like crazy, it's just easier to have the sequence window, etc. on one monitor and the mixers, etc. on the second monitor, as someone showed me recently.

That's why I don't need power (I don't play games), just sharp and clean 2D graphics. I will probably go with the Matrox P650; this card doesn't have a fan, and I'm trying to build a quiet workstation.

Cheers!

Gab
 
Well, the FX5200 series doesn't have a fan either. But if you like the Matrox P650 and it fits your price range, I'm not going to dissuade you, as I think they are the best solution, if only a little more expensive than the FX5200 series. I doubt the difference in price would equal the difference in quality, though.

I think you'll be pretty happy with your choice. Of course you could do very well with 1 DVI and 1 VGA (use the VGA for your tools/mixers/files/etc.), since the tools won't need as clear an image as any analyzers might. Just making sure you know that you don't 'NEED' dual DVI (unless the LCDs were already bought), but it is nice to have.


 

Gabor

Just to make sure I got things right...

Right now I have a 17" hybrid LCD monitor (DVI and VGA inputs) and plan to buy a 19" hybrid LCD monitor as my primary monitor. So if I buy a "regular" card - like an FX 5200, etc. - I would be able to plug the 17" into the VGA output and the 19" into the DVI output? If so, I can save some bucks this way.

I clearly don't need the best image quality, especially on the 17" monitor.

Thanks for your answers, tgga!

Gab
 
Yeah, with hybrid monitors you could use a card with 1 VGA and 1 DVI. For the best functionality and picture at that price I would recommend the R9600SE over the FX5200, but an FX5200 would do OK as well. Just don't go for anything less than an FX in the nV line, since the GF4 and below have poorer image quality.

BTW, I'm assuming that when you say CRT you mean the DB-15/VGA connector. Both your monitors are LCDs, not CRTs, right?

In any case it should be fine.

The FX5200 with nView may give you a bit better multi-monitor functionality right out of the box, but it's very close now, and most of the differences in features are gone, except for the one you won't be using anyway: spanning. With 3rd-party drivers you can get more anyway.

So I see it like this: the R9600SE will give you better image quality, but the FX5200 will give you more features through nView. You are the only one who knows which is more important.


 
