My vote is with the Quadro, but that assumes the program you are using will not utilize CrossFire.
I think the software has to know how to use dual video cards. If anyone knows more about this, please feel free to correct me.
(after reading through that) be reminded that the 4850 has 800 shader processors that work fully in most 3D applications (the ones I know of: 3ds Max and Maya)
Also be aware that 99.9% of “professional” applications do NOT support SLI/CrossFire. They never have, and it's doubtful at the moment whether they ever will. For the remaining 0.1% of applications, it looks like there might be some that use it; however, I'm very skeptical about this. Unless you explicitly find that your application supports multi-card use, assume that it doesn't.
Use SLI/CrossFire only for gains in games.
In the end, if it were me and I had ~$350 for a new card, I'd definitely get the HD 4870 with 1 GB of RAM when it appears. If you're not into waiting and my post (combined with your own experience) convinces you, you can settle for the 512 MB version without regret.
However you look at it, you CANNOT find a better-performing graphics card on the planet for digital content creation than the HD 4870. Once the FireGL flavor of the card comes out, it will take the crown, but only in segments where you need more than 1 GB of RAM (and there, only if you buy the 1 GB+ variant).
From here, you can easily tell that the Quadro FX 1700 is actually an 8600GT variant (not even a full-speed 8600GT, let alone an 8600GTS). So it has 32 unified processors with significantly slower clocks than even the 8600GT. You do the math: 32 vs 800 processors, and feel free to include the clock-speed difference if you like.
(The default for the 8600GT core is 540 MHz, while the FX 1700 runs at 450 MHz; the HD 4870 is at 750 MHz. Be aware that shaders on nVidia cards run at higher rates than the core; for the Quadro variant it's just below 1 GHz.)
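To put the "you do the math" comparison in one place, here's the back-of-the-envelope version using the clocks quoted above. This is a naive shaders × clock product that deliberately ignores architectural differences, so treat it as a rough paper figure, not a benchmark:

```python
# Back-of-the-envelope shader throughput: shader count x shader clock (MHz).
# Figures are the ones quoted in this thread; real performance depends on
# architecture, drivers, and the application.
fx1700 = 32 * 1000    # 32 unified processors, shader clock just below 1 GHz
hd4870 = 800 * 750    # 800 stream processors at the 750 MHz core clock

print(f"FX 1700: {fx1700:,} shader-MHz")
print(f"HD 4870: {hd4870:,} shader-MHz")
print(f"ratio:   {hd4870 / fx1700:.2f}x")   # 18.75x on paper
```

On paper that's an 18.75x gap, which is exactly why the raw numbers look so lopsided.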
And I'm sure you'd like a Pentium D at 3.6 GHz over a Core 2 at 2 GHz. A higher number is better, right? Yes, 800 of ATi's "shaders" are better than 32 of nVidia's, but they would only be equal to about 160 of nVidia's. You can't just spit out numbers.
All those "workstation" cards are just bad deals, for anything. It used to be that many professional programs were not fully compatible with mainstream cards, but that changed long ago. Workstation cards still benefit from some optimizations, although nowadays the gap in raw power is so huge that mainstream cards perform far better regardless. Remember, they're not "gaming" cards but rather mainstream cards.
Grimble_Crumble, please read and try to understand. At least get the facts straight before educating others.
And if you still feel like getting a Quadro, go ahead: it's neither my money you'll be spending nor my time you'll be wasting waiting for the Quadro to update viewports.
And I'm sure you'd like a Pentium D at 3.6 GHz over a Core 2 at 2 GHz. A higher number is better, right?
Hey, I did mention that the 4870's processors run at 750 MHz and that the FX 1700's run at nearly 1000 MHz. Still, it's 800 × 750 vs 32 × 1000 (not that it's a full 1000).
Yes, 800 of ATi's "shaders" are better than 32 of nVidia's, but they would only be equal to about 160 of nVidia's.
This is true only in extreme cases where a game is optimized to work only with TWIMTBP logic. In digital content creation (DCC), the full 800 shaders are utilized to their full potential, and it just smokes anything nVidia has to offer. No professional program is written with nVidia's money (unlike some games). And like I said earlier: read at the very least, and preferably know, before educating others.
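For context on where that "160" figure comes from: the HD 4870's 800 stream processors are organized as 160 five-wide VLIW units, so the effective shader count depends on how many of the 5 lanes per unit the shader compiler can keep busy. A minimal sketch of that arithmetic (the utilization values here are illustrative endpoints, not measurements):

```python
# RV770 (HD 4870) organizes its 800 stream processors as 160 five-wide
# VLIW units. Effective shader count scales with how many of the 5 lanes
# per unit are filled each instruction.
VLIW_UNITS = 160
LANES = 5

def effective_shaders(utilization):
    """utilization: average fraction of the 5 lanes filled (0.0 to 1.0)."""
    return VLIW_UNITS * LANES * utilization

print(effective_shaders(0.2))  # worst case, 1 of 5 lanes busy -> 160.0
print(effective_shaders(1.0))  # ideal case, all lanes busy   -> 800.0
```

So the "only 160" claim corresponds to the worst-case lane utilization, while well-compiled DCC shader workloads sit much closer to the full 800.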