To start with, I'm not a newbie computer user; it's just that I picked up everything I know from random places.
This is not the first time I've searched for an answer to this question, and I can't find a clear one, just that Quadro and workstation cards have special drivers and are designed for the particular calculations, math, and algorithms needed in programs like Maya and 3ds Max. So I'd like to subdivide my question into three parts:
1) Does rendering your final completed project (in Maya, for example) into a video file depend on your CPU or your GPU?
As far as I know, the time taken to render an animation into a video clip depends on your CPU. Is that correct, or would a workstation card also play a role in rendering? In short: does it depend on the CPU, the GPU, or both, and how and why?
2) If you play a demanding PC game on a mid-range Quadro it will lag, whereas on a gaming card like a GeForce it will run well. By "running poorly" I mean a drop in framerate that makes the game unplayable and useless.
What about using a GeForce for Maya, for example: what would the performance issue be? In games it's lag; in Maya it's what?
3) How and in what ways would a workstation card improve the performance of such programs?
That sums it up, but to shorten it down to a simple post: it's BIOS and drivers. The GPU cores and the rest of the card are the same as those found on the normal consumer market, except for features like frame lock, etc. The BIOS has very minor changes, while the drivers are tuned to improve performance in certain 2D and 3D applications, AutoCAD and 3ds Max being two examples. Application performance is deliberately limited on normal consumer cards to help maintain that niche.
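Those application-specific driver profiles key off the card's reported model name, which you can inspect yourself. A minimal sketch, assuming a Linux box; each command is guarded so it just prints a note if the tool isn't installed:

```shell
# Print the GPU model and driver version as the driver reports them.
# Workstation and consumer cards built on the same silicon still report
# different names here, which is what the driver's app profiles match on.
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,driver_version --format=csv
else
    echo "nvidia-smi not installed"
fi

# The OpenGL renderer string that applications like Maya actually see:
if command -v glxinfo >/dev/null 2>&1; then
    glxinfo | grep "OpenGL renderer"
else
    echo "glxinfo not installed"
fi
```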
I know about the drivers, but does rendering depend on your GPU too? By that I mean rendering your completed project into a video file. It takes me an hour to render one minute of a moving box on a blank background.
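One way to answer this on your own machine is to start the render and watch CPU and GPU utilization side by side. A minimal sketch, assuming a Linux box (the GPU query is guarded because `nvidia-smi` only exists with an NVIDIA driver installed):

```shell
# Snapshot of the busiest processes while a render is running.
# A software renderer (e.g. Maya's default or mental ray) will typically
# sit near 100% per CPU core while the GPU stays close to idle.
ps aux | sort -rnk 3 | head -n 5

# GPU utilization at the same moment, if an NVIDIA driver is present:
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=utilization.gpu --format=csv
else
    echo "nvidia-smi not available (non-NVIDIA card or driver not installed)"
fi
```

If the CPU is pegged and the GPU reads near 0% during the render, the render is CPU-bound on your setup.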
The difference is that one line of cards has drivers that favor certain applications and is meant for professional use only. The cards are physically the same down to the transistor; the only differences are usually small ones in the cooler and the PCB. The cores are the same, and with NVIDIA they often use samples that wouldn't have made it to the normal consumer market due to disabled units and the like. I don't see how this is a difficult subject for some, when you can just sit down and search around online; it's nothing like trying to dig up info on something rare like the nCUBE 2.
February 8, 2011 2:15:23 PM
Ok, but my question:
DOES RENDERING DEPEND ON GPU?
Ding, you just figured it out. For the manufacturer, the professional workstation cards are mostly pure profit, while the normal consumer cards are what pay the bills; the only real added cost in the end is the development of the drivers. Yes, you can try to flash a card to work as a FireGL or a Quadro, but not all of them will work. That part of the subject isn't easy to sum up in a short comment, but you used to be able to take a normal card and enable these features purely in software, without any mods or cost.
February 9, 2011 2:30:20 PM
I knew it; a GPU alone would never cost $3,000. I'll try some experimental flashing on my older cards; I found a few hacks on the Internet. Anyway, that's all I needed to know.
As for the difference between a Quadro and a GeForce model, the former offers more accurate and wider color output, has lower power usage, and comes with excellent service and support. That's why it gets recommended for industrial use, where downtime needs to be kept to a minimum. There's no practical reason to get a workstation card for a single-user workstation scenario; a regular GeForce will do just fine.