
2 card / 3 monitor questions

Last response: in Graphics & Displays
November 16, 2010 9:54:57 PM

Up till last week, I had planned on getting a phat ol eyefinity card to run 3 monitors, giving me the convenience of just one card, to keep stuff simple (note I'm not actually interested in spreading a game over 3 monitors. I just was tempted by the convenience of having one card to rule them all, etc).

Yikes, I dodged a bullet there: I found out that eyefinity boards require that all monitors be the same resolution. Phew, glad I found that out!

So, I'm back to plan B of 2 cards running three monitors, and I'd like to get some clarity on a thing or two.

I won't be doing any sli/crossfire. My plan is to have a 1920x1200 monitor sitting in between two 1600x1200 monitors. I'd like to use a wussy card to drive the two 1600x1200's, and then get a good card to run the middle main monitor (ie, the larger middle monitor would be my gaming monitor/card).

__________________________________

So! The questions:


* People generally say that for this kind of setup, you should use two cards of the same brand (I'll be going with ATI) to minimize driver weirdness. Can I rely on that? Is that a well-explored setup and I'm good to go, or are there some stumbling blocks in front of me?

* Is a video card OK with running two monitors that are not side-by-side? IE, in my proposed setup, the secondary card would be running monitors 1 and 3. Is that groovy or verboten?

* Do many cards have 2 DVI jacks, instead of 1 DVI and 1 i-forget-the-name-of-the-other-type? It always irked me that two monitors look slightly different depending on the jack, so it would be neat to just go all DVI, for example. Any thoughts there?


On the off chance it matters in a way I don't know about, I'm currently looking at an Asus P6X58D-E mobo for now.

Any insights appreciated, thanks!



<edit> Please feel free to address just the questions you feel like - no need to feel compelled to answer the whole shebang ;)  Thanks!


November 16, 2010 11:18:14 PM

I'm pretty sure Eyefinity can handle resolutions that are different. Doing so would be ugly as h3ll though, so if it can't, it was probably disabled on purpose. If you have a link that talks about this I'd be happy to see it.

Second, what you want to do with having one card drive monitor 2 and another drive monitors 1 and 3 is fine; I've seen this done before. The problem is that what you want won't work a la Eyefinity. You'll have three separate screens, all usable in Windows, but a game won't span across all three. In the olden days the Flight Sim people used to go in and mod .ini files to get their games to span 3+ monitors, but from what I understand not all games support this. That's the beauty of Eyefinity: it removes having to hack files to get this to work.

You'll have to look around for cards with dual DVI jacks, but I'm sure they're made. Another thing to consider is buying a card with a DP to DVI adapter.
November 16, 2010 11:59:11 PM

Heya 4745454b, thanks for the reply.

I'm glad you zeroed in on the Eyefinity thing: I realize my post wasn't quite clear as to my thinking. I don't have any interest in stretch-screen gaming - instead, I was drawn to Eyefinity just for the simplicity of having one card for 3 monitors: nice and simple.

Re: must-be-same-resolution for eyefinity, someone pointed me to this faq:

http://www.amd.com/us/products/technologies/amd-eyefini...

"All monitors running in a Display Group or cloned modes must be running with the same resolution. If monitors have different native resolutions, the highest common non-native resolution between the monitors will be used when creating Display Groups. Monitors running in extended desktop mode can have independent resolutions."

Reading it again, I admit I'm getting a little confused. If the goal is to have a desktop spread over three monitors, but use only one of those monitors for gaming (the middle one), does that mean I can just use it to create 3 monitor groups of one monitor each, perhaps? Of course, at that point, it might call into question why I would even bother getting an eyefinity board (outside of the lots of jacks deal, letting me use one card. IE, seems a wasteful way to do 3 monitors / 1 card, at that point heh).
November 17, 2010 12:03:29 AM

Elaborating on the eyefinity monitor resolution question.

If you want all monitors to be just like one giant desktop (taskbar runs across all 3 monitors) then the resolution has to be the same on all of them. If you just have the 2nd and 3rd monitors just extended, they can be any resolution.
November 17, 2010 12:09:29 AM

alextheawesome said:
Elaborating on the eyefinity monitor resolution question.

If you want all monitors to be just like one giant desktop (taskbar runs across all 3 monitors) then the resolution has to be the same on all of them. If you just have the 2nd and 3rd monitors just extended, they can be any resolution.



Hmmm myes. Yeah, my goal would be monitor 2 (the big middle monitor) as my primary monitor, with monitors 1 and 3 (smaller ones on each side) as secondaries. The taskbar would be on the middle monitor.

If I can do that, it brings up the interesting question of whether it's worth using an eyefinity card pretty much solely because it has more than 2 jacks on the back (vs just doing a 1-crappy-card / 1-good-card deal). Man, that's a toughie. Truly, I am in over my head heh.
November 17, 2010 12:23:16 AM

Quote:
Monitors running in extended desktop mode can have independent resolutions.


I assume this to mean as long as you don't put them in a 3x1 group you'll be ok. The biggest issue you'll have is needing the DP to DVI adapter. We have the new single link adapters, so you should be good up to around 1050 and maybe 1080 with one of those. It will add another $40-50 to the cost however.
November 17, 2010 12:34:03 AM

4745454b said:
Quote:
Monitors running in extended desktop mode can have independent resolutions.


I assume this to mean as long as you don't put them in a 3x1 group you'll be ok. The biggest issue you'll have is needing the DP to DVI adapter. We have the new single link adapters, so you should be good up to around 1050 and maybe 1080 with one of those. It will add another $40-50 to the cost however.



Glad you brought this up, as the profusion of ports and whatnot in the last 5 years makes my head spin. If I went eyefinity, the card I'd go with comes with sufficient adapters for my monitors (the two DVI monitors I have, and the new DP monitor I'll get).

But, you're scaring me with talk of max resolutions. Can you expand on that?
November 17, 2010 12:49:32 AM

The single link DisplayPort (DP) to DVI adapters are cheaper than the old active adapters that you used to need, but because they are single link they lack the bandwidth needed for really high resolutions. The normal 20-22 inch monitors at 1680x1050 should be fine; I'm less sure about the 24"+ monitors at 1920x1080. I'm fairly certain that anything higher than that is out.
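To put some rough numbers on the single-link bandwidth question, here's a quick back-of-the-envelope check. The pixel clocks below are the standard CVT reduced-blanking values at 60 Hz, which I'm assuming the monitors use (an assumption on my part, not something from any adapter spec; actual timings can differ), compared against the 165 MHz single-link TMDS limit:

```python
# Rough sanity check: which resolutions fit through a single-link
# DVI connection (or a single-link DP-to-DVI adapter)?
# Single-link TMDS tops out at a 165 MHz pixel clock; the per-mode
# numbers are approximate CVT reduced-blanking pixel clocks at 60 Hz.
SINGLE_LINK_LIMIT_MHZ = 165.0

modes = {
    "1680x1050@60": 119.0,   # typical 20-22" monitor
    "1920x1080@60": 138.5,   # typical 24" monitor
    "1920x1200@60": 154.0,
    "2560x1600@60": 268.5,   # 30" monitor: dual link only
}

for name, clock_mhz in modes.items():
    verdict = "fits single link" if clock_mhz <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
    print(f"{name}: {clock_mhz} MHz -> {verdict}")
```

With reduced-blanking timings even 1920x1200 squeaks under the limit; anything bigger is dual-link territory.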

If you're buying a new DP monitor then don't worry about it. Plug the DP monitor into the DP port, and use the DVI ports for the other two.

Edit: Doing a quick search on newegg showed many cards that have 1DP and 2 DVI. Here is one example.

http://www.newegg.com/Product/Product.aspx?Item=N82E168...

It's a HIS 5770 for $143 shipped, $113 after rebate, with the ports I mentioned. There are also 5750s, 5830s, 6850s, etc. The search returned 3 pages of results, so there has to be something there you'd want.
November 17, 2010 1:00:19 AM

You don't have to worry about SLI or identical Nvidia graphics cards. The main issue with drivers is that both cards need to use the same driver. Trying to use two different Nvidia-class drivers usually cancels one out, and one card ends up driverless or picks up a base operating system driver (if one is available) with very limited options and settings. What Nvidia video cards were you looking to use? Mid to high range cards of recent years usually come with 2 DVI connectors, which I also feel are superior to the analog 15-pin D-sub connector. When you decide what your main high performance card will be, find the driver on Nvidia's website and look at the release notes to see what other video cards are supported by that driver. Then you can pick one of those as your secondary video card.

I went through trying to rig up surround gaming a while back, before SLI, and it was a nightmare. Recently Nvidia gave us surround gaming support because of ATI's Eyefinity. So much easier now lol
November 17, 2010 4:34:44 PM

Thanks for the comments, tas11 - I hadn't thought of the issue of specifically tracking down drivers/cards, outside of a general "use the same brand."

I'm actually going ATI, btw.
November 17, 2010 6:13:26 PM

I've given this a bit more thought, and have decided to just go with 2 cards instead of eyefinity. While it sounds like I could make what I want work with an eyefinity board, I feel like I'd be putting all my eggs in one basket (and getting a minidp-only card that's a massive power hog, heh).

So, with that in mind, taso11 made a comment that caught me: is simply having 2 cards by the same manufacturer enough on the driver front to be safe? Or would I need to drill down deeper and identify a set of cards that can safely work with the same drivers installed? (Thanks for mentioning that, btw taso.)




<edit> adding this info should anyone stumble across this thread with similar questions: I asked on the ATI forums, and got the info that the driver issue for 2 ati cards is fine as long as both cards are 2xxx series or later.
November 17, 2010 11:48:11 PM

The cards can be from any manufacturer (EVGA, Gigabyte, MSI, etc.), but they need to be both Nvidia based or both ATI based. For example, if you go here http://www.nvidia.com/object/win7-winvista-32bit-260.99... and click on the "Supported Products" tab you will see all the video card models that use the same driver. So it looks like the 6XXX, 7XXX, 8XXX, 9XXX, 1XX, 2XX, 3XX and 4XX series are pretty much compatible, in that the same driver package supports all of those cards.
November 17, 2010 11:55:56 PM

I would get cards from the same generation; that way you know they will support the same driver. Using Taso's example, the 6 series and 7 series cards are likely to drop off that list early, as they don't use a unified shader architecture. Driver improvements for current cards are likely to come at the expense of those cards. I had this happen with my X1800XT: when the X1900 and X1950 cards were out, all was fine, but when the 3xxx series came out, AMD dropped support for the X18/X19 cards.

Quote:
getting a minidp-only card that is a massive power hog, heh


The 5770 is not very power hungry; 80-90W I think is the number normally used. Just make sure you don't get two 50W 5670s, because then you're at 100W instead of 90W. Not saying this is the wrong way to go, but keep it in mind.
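Just to spell out that power math (using the ballpark wattage figures tossed around in this thread, not official TDP specs):

```python
# Comparing total GPU board power for the two setups discussed,
# using the rough figures from this thread (not official specs).
one_5770_w = 90        # a single HD 5770 driving all three monitors
two_5670s_w = 2 * 50   # two HD 5670s splitting the monitors

print(f"One 5770:  {one_5770_w} W")
print(f"Two 5670s: {two_5670s_w} W")  # slightly more than the single card
```

So the "simpler" two-small-cards route can actually draw a bit more total power than one midrange card.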