What's the difference between Nvidia's Quadro line and, say, their GTX line? Or AMD's Fire[insert model suffix] vs. their 4800s? Are the GPUs in both cases the exact same GPU, or are workstation GPUs a different can of worms? And why do I often read that workstation cards are not fit for gaming at all? I mean, they cost like 600-1200 dollars!!! That's more than the 5800 series Radeons. So I was just looking for some clarification on the differences between mainstream consumer video cards and (generally) industry-used workstation cards.
Hardware-wise, workstation cards and gaming cards are something like 99.9% the same; they're often built around the exact same GPU silicon.
Software-wise they are very different. The real distinction is in the drivers: gaming card drivers are tuned and tested for DirectX games, while workstation card drivers are certified and optimized for professional OpenGL applications (CAD, 3D modelling, rendering). It's not that one card "can't" run the other API, it's that the drivers are optimized for different workloads.
Generally, if you are gaming, use a gaming card, and if you are rendering, use a workstation card.
Just because workstation cards cost more than gaming cards does not make them better for gaming! Using a workstation card would give you worse performance than the equivalent gaming card, or similar performance at best. Conversely, a workstation card mops the floor with a gaming card when used for rendering, delivering 3 to 5 times the performance. http://www.tomshardware.co.uk/amd-firepro-v8700,review-...
This compares the Radeon HD 4870 and the FirePro V8700 in a rendering program.