DisplayPort is the graphics card industry’s new favorite connector, since it promises high scalability for upcoming display solutions. There’s just one catch: how do you establish a new connector when the monitor makers aren’t willing to play along, and most users have only just made the switch from VGA to DVI? Many folks may not even quite know what to make of HDMI yet, much less DisplayPort.
Nvidia’s approach is a cautious one, and while the company equips the Quadro 5000 with two DisplayPort connectors, it also provides a single dual-link DVI output. However, unless your monitor is already compatible with DisplayPort, you’ll still need to buy additional adapters if you’re planning to use a multi-monitor setup.
The memory system has also undergone an evolutionary change in that it now supports ECC (error correction code), making the Quadro 5000 the first card with this capability. The technique is not necessarily all that relevant to image processing. However, it is of great importance in medical analysis, financial computation, and cluster-based configurations, where even small single-bit errors can have a tremendous impact on the final result. ECC allows the graphics card to detect and correct this type of error, just like server and workstation motherboards can with system memory. The downside is a performance penalty, which is why Nvidia deactivates this feature in its drivers by default.
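The single-bit detect-and-correct behavior described above can be illustrated with a toy Hamming(7,4) code, a minimal Python sketch of the general SECDED idea. Real GPU ECC operates on much wider memory words in dedicated hardware; the function names and bit layout below are illustrative assumptions, not Nvidia’s implementation.

```python
def hamming74_encode(data_bits):
    """Encode 4 data bits into a 7-bit Hamming codeword (positions 1-7)."""
    code = [0] * 8  # index 0 unused; bit positions 1..7
    code[3], code[5], code[6], code[7] = data_bits
    # Each parity bit covers the positions whose index has that bit set.
    code[1] = code[3] ^ code[5] ^ code[7]
    code[2] = code[3] ^ code[6] ^ code[7]
    code[4] = code[5] ^ code[6] ^ code[7]
    return code[1:]

def hamming74_correct(codeword):
    """Detect and fix a single flipped bit; return the corrected data bits."""
    code = [0] + list(codeword)
    # Recompute the parity checks; the syndrome is the error position (0 = clean).
    s1 = code[1] ^ code[3] ^ code[5] ^ code[7]
    s2 = code[2] ^ code[3] ^ code[6] ^ code[7]
    s4 = code[4] ^ code[5] ^ code[6] ^ code[7]
    syndrome = s1 + 2 * s2 + 4 * s4
    if syndrome:
        code[syndrome] ^= 1  # flip the erroneous bit back
    return [code[3], code[5], code[6], code[7]]

# Simulate a single-bit memory error and correct it.
data = [1, 0, 1, 1]
word = hamming74_encode(data)
word[4] ^= 1  # cosmic-ray-style bit flip at position 5
assert hamming74_correct(word) == data
```

The same principle, scaled up to 64-bit words with extra check bits, is what lets ECC memory correct any single-bit error and detect double-bit errors.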
Nvidia also provides a 3-pin DIN port for use with 3D shutter glasses on the card’s backplate. The company already has compatible wireless solutions in its product portfolio as well.
Feature-wise, Nvidia is competitive with AMD once again. Shader Model 5, DirectX 11, OpenGL 4.1, and OpenCL 1.0 are all finally supported in this generation of GPUs after several delays. Special solutions like Framelock, Genlock, and Serial Digital Interface required by the broadcast industry are also provided by this card.
- Introduction
- Comparisons And Applications
- Nvidia Quadro 5000: Overview
- Nvidia Quadro 5000: Features, Connectors, And Driver
- ATI FirePro V8800: Overview
- ATI FirePro V8800: Features, Connectors, And Driver
- Test Configuration
- Benchmark Results: SPECapc Autodesk 3D Studio Max 9 1.2
- Benchmark Results: SPECapc Autodesk Maya 2009
- Benchmark Results: SPECapc Newtek LightWave 9.6
- Benchmark Results: SPECviewperf 11
- Conclusion

For someone who doesn't do 3-D design, these benchmarks are kinda confusing (or have I speed-read past the reason why?).
Hence why I'm selling my HD5770 and getting a GTX460. Much as I like their hardware, ATI sucks balls on drivers...this card won't even shine on M&B and BF2 is a nightmare.
Why do you even want to compare two cards in different price ranges? At least in my country, the GTX 460 costs almost twice as much as the 5770. I wonder why nobody can force Nvidia or AMD to bring the workstation optimizations found in the Quadro and FirePro drivers to normal cards... we all know about the past Quadro mods made from normal gaming cards. Most of the time, all that differs between the two cards is the amount of memory.
Because then Nvidia wouldn't have their Quadro line, would they?
It's mostly for money; they just change a product a bit and market it as a completely different thing, which rakes in more money. And I know you can turn GTX 2** series cards into Quadros, because I've turned my GTX 285 into one before.
What the heck do you mean? lol, I'm in school for game art design and work in 3ds Max 2010 all the time, and I still can't make much sense of Tom's benches here. Are they measuring render time or what? Where do they get the scores, etc.? I want to see actual render times. Would I benefit at all if I replaced my gaming card with one of these? Sorry Tom's, but epic fail on this comparison this time. Why on earth you show 3ds Max render times for consumer card benches but not workstation cards is beyond me. It just makes no sense, especially since consumer graphics cards DO NOT make a damn difference in 3ds Max, because when you use a consumer video card, all renders are done on the CPU, not the GPU.
A true statement if I ever heard one, since AMD merged ATI and fired lots of ATI personnel.
what is it, not what is it more or less