Initially, I had no inclination toward a dual-monitor setup until I saw a friend using one, and it made great sense to be able to do things across two screens. My interest isn't just dual-screen gaming, but multitasking (video editing, gaming, Netflix, etc.) across both monitors.
Unfortunately, I cannot afford another monitor at this time.
However, since I want to be able to use dual monitors later, I don't want to get stuck with a card that can't be tag-teamed later for the optimal setup :-)
What is the best route to approach this scenario... which is basically:
get an inexpensive card now to be able to do stuff decently,
later, when I can afford it, get another card and the additional monitor to reach the optimal setup
From the little I've read about Eyefinity, most of it talks about 3 or more monitors (which I get... you want 1 main monitor and the others for peripheral vision). What I'm looking for isn't that sort of gaming (not in the near future) but just 2, so that I can game or watch a movie on one and work on the other... does Eyefinity support that?
Okay... so it doesn't matter which modern graphics card you get; they all support a dual-monitor setup in Clone, Span, or Extended mode (not interested in Hybrid Span mode).
So this setup works, I guess:
for now -- monitor (1), 1 graphics card
for later -- monitor (2), to do the dual-monitor thing
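As a side note, on Linux/X11 the Extended and Clone modes mentioned above can be switched from the command line with `xrandr`; this is just an illustrative sketch, and the output names `HDMI-1` and `DP-1` are hypothetical (run `xrandr` by itself to list the real ones on a given machine):

```shell
# Extended mode: second monitor (DP-1) continues the desktop
# to the right of the first (HDMI-1), each at its native resolution
xrandr --output HDMI-1 --auto --output DP-1 --auto --right-of HDMI-1

# Clone (mirror) mode: both monitors show the same image
xrandr --output DP-1 --same-as HDMI-1
```

On Windows the same modes are toggled from the driver control panel (or Win+P), so no commands are needed there.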
Now... on to the future-proofing part...
1. On the nvidia site it says "All displays must have a common resolution, refresh rate, and sync polarity" -- does this sort of restriction apply to ATI as well? (for regular non-3D monitors)
2. (As the current trend goes) if I get a 3D monitor for monitor (2), would any graphics card support a dual-monitor setup mixing the two, i.e., one regular and one 3D?