
Nvidia Quadro FX 5600

Pro Graphics: Seven Cards Compared

The Quadro FX 5600 is regarded as the OpenGL computing monster par excellence. Equipped with 1,536 MB of GDDR3 memory, the Quadro FX 5600 has sufficient reserves for high resolutions and large textures. The card is powered by the G80 graphics chip, which you can also find on several GeForce 8800-series gaming cards.

Currently, the card is priced as low as $2,700 from online retailers. In benchmarks, the FX 5600 scores well and takes first place in many categories. However, the ATI FireGL V7700, at $1,000, is right on its heels and even beats it in some categories. ATI manages this with a more modern chip architecture built on a 55-nanometer process.

The add-on "Maxtreme 11" driver from Nvidia is also interesting. This plugin was developed specifically for 3D Studio Max, and leads to a significant performance boost in this program. In contrast to the previous versions, Maxtreme 11 supports OpenGL and also the DirectX API. The hardware shader operations of 3DSM especially benefit from it. But here we recognize that DirectX is slowly becoming acceptable in the workstation sector, which was previously reserved exclusively for OpenGL.

The 90-nanometer chip and large memory require dual auxiliary power inputs, as you can see from the two 6-pin power connectors.

The GPU-Z screenshot shows the most important technical data for the FX 5600.
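
For readers who would rather pull the same basic data from a script than from a GUI tool, the short Python sketch below shows one possible way to query a card's name, memory size, and driver version via Nvidia's nvidia-smi command-line utility. This is only an illustrative sketch, not something used in our testing; it assumes a reasonably recent driver package that ships nvidia-smi.

    # Illustrative sketch (not part of the original review): query basic GPU data,
    # roughly what GPU-Z displays, via Nvidia's nvidia-smi utility.
    # Assumes nvidia-smi is installed and available on the PATH.
    import subprocess

    def gpu_summary():
        # name, memory.total and driver_version are standard nvidia-smi query fields
        result = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=name,memory.total,driver_version",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        # one CSV line per installed GPU
        return [line.strip() for line in result.stdout.splitlines() if line.strip()]

    if __name__ == "__main__":
        for entry in gpu_summary():
            print(entry)

On cards as old as the FX 5600, the exact fields supported depend on the driver generation, so treat this as a starting point rather than a guaranteed recipe.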

54 comments. This thread is closed for comments.
  • zajooo, August 13, 2008 8:31 AM
    Hi guys,

    I was always wondering how a comparison of these cards against gamer cards would look, and whether the gamer cards wouldn't give us, as we say, "a lot of music for a little money". Would we get similar results in geometry transformation, without using any anti-aliasing or any of the really unnecessary features???

    Thanx
  • Evendon, August 13, 2008 8:47 AM
    I would also like to know how these cards handle games. I'm especially interested in how the mobile versions of Quadro FX work with games.
    I would also like to play games on my work laptop with its Quadro FX chip (FX 3600M).
  • neiroatopelcc, August 13, 2008 9:29 AM
    I was pretty much wondering how a GeForce 8800 GT or ATI 4750 would stack up. Given that the lowest prices in Denmark for a V3600 or FX 570 start at 1,150 kr and an 8800 GT 256 MB costs only 680 kr (a 4750 at 1,070 kr), they're comparable in the budget department.

    We're mostly only doing basic grid stuff at work, but we're using Intel 945 chipsets with onboard graphics for that. The Pentium dual-core seems to handle it okay, but it isn't happy about textures at all. However, for those things we use old P4s with GeForce 6600 GT cards, not Quadro anything, and I can't help but wonder if using gamer cards is the right or wrong choice (it has worked great for us so far).


    PS: we use Inventor, Mechanical Desktop and Revit from Autodesk (newest and second-newest versions only), so only the 3D Studio results are interesting for me.
  • venteras, August 13, 2008 9:43 AM
    Where are the Quadro FX 3700 and 4700?

    I don't see how you can do a review dated 13 August that seemingly covers all current mainstream cards and not include 2 of Nvidia's primary cards.

    Was this paid for by ATI?
  • sma8, August 13, 2008 10:13 AM
    Quote (zajooo):
    Hi guys, I was always wondering how a comparison of these cards against gamer cards would look, and whether the gamer cards wouldn't give us, as we say, "a lot of music for a little money". Would we get similar results in geometry transformation, without using any anti-aliasing or any of the really unnecessary features??? Thanx


    Pro graphics cards are different from gamer/consumer cards. Pro graphics cards are designed to handle workstation applications such as AutoCAD or 3D Studio MAX, whereas gamer/consumer cards are designed for desktop PC apps and games. You can see the differences between the two here:
    http://www.nvidia.com/object/builtforprofessionals.html

    That's a good example of how both kinds of card behave in workstation applications. That's why pro graphics cards are so expensive.
  • anonymous1000, August 13, 2008 12:18 PM
    Hello. Thank you for the interesting article. What is also interesting is the huge gap between the SPEC results and my day-to-day experiences with ATI pro cards. When the first SPEC results showed up, I started recommending that my clients buy FireGL 3600-8600 cards, but unfortunately they were VERY poor performers in 3ds Max work... Apart from the fact that you couldn't even use half of their strength with the first drivers, even now, comparing the FPS count of the V5600 card with that of a 9600 GT in 3ds Max shows it is much more comfortable to work with the latter. If you want to move a large project in your viewport, it moves a lot faster if you have a 9600 GT card installed. THIS is a benchmark I would like to see here, because IT REALLY MATTERS to animators. Thank you
  • hixbot, August 13, 2008 12:19 PM
    nobofy?
  • bydesign, August 13, 2008 12:51 PM
    Quote (sma8):
    Pro graphics cards (http://en.wikipedia.org/wiki/Graphics_processing_unit) are different from gamer/consumer cards. Pro graphics cards are designed to handle workstation applications such as AutoCAD or 3D Studio MAX, whereas gamer/consumer cards are designed for desktop PC apps and games. You can see the differences between the two here: http://www.nvidia.com/object/built [...] onals.html That's a good example of how both kinds of card behave in workstation applications. That's why pro graphics cards are so expensive.


    Not true. I can flash the BIOS on my 8800 GTX and it will run just like its workstation cousin. They are using the same hardware but handicapping the consumer card.
  • theLaminator, August 13, 2008 1:21 PM
    Wish this article had been published about three days ago; it would've helped my decision on which new laptop to get for school (I'm an engineering major, so I actually will use this). I finally decided on an HP with the FireGL V5600, and it looks like I made the right choice based on these benchmarks. Guess we'll see when I actually get it and try it in applications.
  • Anonymous, August 13, 2008 2:08 PM
    "I would also like to know how these cards handle games. Especially im interested in how mobile versions of quodro fx work with games.
    I would also like to play with my work laptop with quodro fx-chip (fx 3600m)"

    I also have a Quadro FX 3600M in my laptop. In 3DMark06, I get a score of 8800. COD4 and UT3 run smoothly with all settings maxed out at 1900x1200. Quadro cards are as good at gaming as their GeForce equivalents, or better.
  • eodeo, August 13, 2008 3:56 PM
    First off, I have to say that this has been a very good review. Naturally, I have several things I'd like to complain about. I mean, as readers, it's our god-given right to complain and never be satisfied. Right?

    The good: you presented and explained each GPU very nicely, noted each one's gaming counterpart, included specifications for each card, and commented on them in great detail.

    So if everything is so well and good, why do I complain?

    Simple: the tests. You're using the SPECheatTest. It's well known that this test is optimized to show that even the crippled "workstation" cards outperform the "gaming" cards, which have far superior hardware. The fact is, at least 3 of the programs you tested here today don't reflect the "bell" conditions posed by the SPECheatTest. Well, actually it's 2. You haven't tested AutoCAD, and I can't really comment on the other applications as I am not familiar with them. The 2 applications I am familiar with, and fully competent to speak about, are 3ds Max and Maya.

    What do these 2 have in common, other than being under the same roof now? DirectX support. ANY application supporting DX is not crippled in the drivers when running it. You will see huge leads for ATI cards over Nvidia here. But I'm getting ahead of myself.

    In all the tests you failed to mention that OpenGL is horridly slow, even on these "professional" cards. This is, of course, in cases where you can choose between OGL and DX in the same application. Not only is it slow, but it's visually incomplete, as it lacks functions for displaying the lighting and shadow conditions that only DX 9.0c can display. I'm willing to forgive you the last one, as you may have only tested the thing and not checked for visual quality differences. That's not to say that the SPECheatTest you used can display these real-life conditions.

    So with these 2 things in mind, it's easy to see that only idiots or people unable to use DX would use OGL instead. To be fair, you did say that Nvidia's 8800 GTX... erm, I'm sorry, I meant Quadro FX 5600, is the best OGL card, and I agree. Nvidia uses archaic logic in their hardware and OGL fits that pattern perfectly. So it's no surprise that Nvidia should win. If you have to use OpenGL for any reason, Nvidia is your man... erm, company.

    If, however, you're not stuck on the appalling Mac platform or with archaic software that doesn't support DX, it should be mentioned that ATI has a significant lead. Not surprising either, since it offers 3x more shaders at the same price. Games can't use these well most of the time, but that is not the case with digital content creation (DCC) programs like 3ds Max, Maya or AutoCAD.

    Speaking of games: you did mention that the 8800 GTX (= FX 5600) is 2 generations old. You failed to mention that the latest GTX 280 has 240 unified processors, and you failed to add it to the tests. Not that I think the SPECheatTest would show that it's 2x faster, but the fact of the matter is, it is. That goes for any of the 3 above-mentioned programs, and likely all others that support DX (possibly OGL too, but I'm not sure how crippled it is in the drivers; more on this later).

    Which brings me to the ultra high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They are wiping the floor with all the cards mentioned in your article put together (!). All of them together (if that were possible) don't have enough power to compete with even a single new card from ATI.

    You conclude that ATI is the best deal at $1,000, but you fail to notice and differentiate between outdated OGL programs and new D3D. So the ATI card is the absolute, undisputed winner for CURRENT DCC work. The crown can, by no stretch of the imagination, go to Nvidia, unless you note that it's exclusively for outdated, OGL-only programs, in which case it does get it. Also, I'd again like to mention that the fact that you're using the SPECheatTest isn't helping you build your case either. And in addition to all this, you also failed to mention that there is a 10x more powerful card for less than $300: the HD 4870. The difference is solely in the drivers, and then not even every part of the drivers, just the OGL implementation.

    Which finally brings me to drivers: all the professional cards are 99% the same as their gaming equivalents. They differ only in drivers. You said so, and I agree. What you failed to mention is that the professional cards are actually noticeably slower than their gaming equivalents. For stability, they say. I challenge anyone to prove that gaming cards are less stable.

    There are 2 reasons for card instability: 1) inadequate cooling or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable.

    The problem with this nice theory is that NO DCC program available today can stress any of the cards mentioned here beyond their framebuffer capacity. Chances are that your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core.

    Which, again, brings me back to drivers as the only other cause of instability. Here is another interesting fact you might not have known: the people writing drivers for these 99% identical cards don't do it twice. The process may vary, but in a nutshell it's like this: they write the drivers for the gaming card. At that point the work splits into two paths, one targeting games and one targeting DCC software. The DCC path gets further testing on the DCC software, and in 99% of cases it's done there and published. It will be thoroughly tested to verify that there aren't any major bugs and then shipped.

    The games driver path does not end there. The driver programmer has one more duty: to cripple performance under what he deems to be "professional" software (!!!). So, to reiterate: the driver programmer, instead of perfecting the drivers to be better, actually sits down and starts writing code to CRIPPLE (!?!) the gaming line of cards. One would imagine that he could spend his time employing his talents elsewhere.

    So, crippled or not, the drivers are 99% the same. If instability is brought to one line of cards, the other isn't spared by "superior" drivers. So, in reality, the workstation cards are no more stable than their gaming siblings, even if many would like you to believe that.

    In conclusion, I'd also like to nitpick at the fact that you use very low resolutions for testing, capping at 1600x1200. As you might have guessed, anyone interested in working in 3D will start at that resolution as a minimum, not end at it. This is not a serious oversight, as you have been using the SPECheatTest to test everything, so your results should be taken with a grain of salt anyway.

    Quote:
    The add-on "Maxtreme 11" driver from Nvidia is also interesting. This plugin was developed specifically for 3D Studio Max, and leads to a significant performance boost in this program. In contrast to the previous versions, Maxtreme 11 supports OpenGL and also the DirectX API. The hardware shader operations of 3DSM especially benefit from it.


    Maxtreme was once useful, some 8 or so years ago. Ever since DX entered DCC programs, OGL, Maxtreme, and the likes of it have been dying. The last couple of iterations of Maxtreme were nothing more than DirectX with a fancy name to make Quadro buyers happy and make them feel special. It held no visual or speed advantages whatsoever. Then again, it introduced no ill effects either.

    Honestly, I haven't tried r11, but I seriously doubt it brings anything new (since it's not technically possible). And as for OGL support in it, that just goes to show how much they know about 3ds Max. With OGL you cannot enable viewport shadows or any of the advanced viewport lighting techniques possible only in D3D. So, as I said before, OGL is not only seriously slower (and I mean seriously), it's also lacking much in the visual quality department as well. I've said it before and I'll say it again: only idiots or people unable to use DX will opt for OGL.

    Quote:
    This shows that DirectX is slowly gaining acceptance in the workstation sector, which was previously reserved exclusively for OpenGL.


    And when you say "slowly" you sound like it's 2001 all over again. Newsflash: it's 2008. DX has been a de facto standard for about 7 years now. True, not all DCC software reflects this, like AutoCAD, which got its first DX support with the launch of Vista. But let's not kid ourselves here: such programs are really only crippled versions of the actual DCC leaders like Max, Maya, XSI, Lightwave... And let's be serious for a moment: those programs work beautifully with 4-generations-old hardware because of their inherent purpose.

    Thank you for reading,

    That was short... maybe I should have published it :p 
  • Anonymous, August 13, 2008 6:32 PM
    Also, why test AutoCAD for 3D use? Inventor is the Autodesk 3D modeling product. AutoCAD is so 10 years ago.
  • Anonymous, August 13, 2008 8:17 PM
    Hi!
    Sorry to bother you, but as you and the readers explain, "pro 3D" and "gamer" cards are about the same.
    And since you will get only one per system, it would be VERY INTERESTING to test "pro 3D" and "gamer" cards with all the 3D AND game benchmarks you have.
    Because when you buy a workstation for personal use (I know a lot of friends who do this), you also like to relax by playing a game!

    And I'm pretty sure you will see that the current top-end "gamer" cards (with standard drivers) are much more bang for the buck than their old "pro 3D" equivalents (with specialized drivers).

    Another example is Apple, Dell, or HP selling you Quadro FX cards in systems designed for... video and multimedia work. Why? I'm pretty sure a gamer card will do the same job (or better) for less money.


    fredsky
  • Anonymous, August 13, 2008 8:26 PM
    And also a comment about "certified drivers" and "high-end 3D applications":

    1) As @eodeo wrote, "Chances are that your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core." This is such a SHAME, man!!! Come on, we've got 4 viewports and plenty of cores. Autodesk and others, WAKE UP, and use one core per viewport!!!

    2) I was working on certifying Autodesk 3ds Max with Nvidia cards/drivers. In fact, you have to take all the drivers available for the Quadro and test them (script-driven benchmarks). Then you approve or reject a driver, quality-wise. So you get OLD hardware and also OLD drivers... so much expensive work to finally end up with CRIPPLED things.

    fredsky
  • snipster4, August 13, 2008 8:33 PM
    This should be tested in Vista 32-bit/64-bit. I know a lot of CGI artists who are using Vista with DX10. Almost all professional apps run under Vista DirectX 10, e.g. AutoCAD 2009 and Inventor 2009.

    I have 5 Quadro 1500 cards running under XP 32-bit and Vista 64-bit. My one gaming machine with an 8800 GT OC card under Vista 64-bit kills all my other computers running Quadro cards in AutoCAD 2009, 3ds Max 2009, and Inventor 2009.

    Pro cards are of no use anymore, as they are optimized for OpenGL and most software manufacturers are removing or limiting support for OpenGL; case in point, Vista.

    I wish I had saved my $3,500 in Quadro cards and purchased more 8800 GTs or even an ATI 4870 X2. Note there is not much difference from the Quadro 1500 to the Quadro 1700 in a real-world comparison.


    I would like to see some benchmarks with current GeForce 280/ATI 4870 cards vs. these pro cards under Vista DirectX 10.
  • snipster4, August 13, 2008 8:51 PM
    Oh, and one more thing: where is the Cadalyst 2008 benchmark for AutoCAD?
    It can be downloaded from the link below.

    http://www.cadalyst.com/benchmark
  • yyrkoon, August 13, 2008 10:26 PM
    Quote:
    There are 2 reasons for card instability: 1) inadequate cooling or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable.


    This is not true. Improperly written applications, improper power (for whatever reason), and hardware compatibility issues are just three more. I can probably think of others.

    Perhaps you meant when everything else is working perfectly?

    Just as an example: a few years ago, when a buddy of mine built his P4 system, we used a certain name-brand motherboard coupled with an ATI 9600 Pro that was factory overclocked. The motherboard had very tight tolerances for the AGP bus, and the video card drew slightly more power than the specification called for. Technically the video card should have had a plug for auxiliary power, and someone finally came up with a mod to fix the card on these motherboards by adding one. Anyway, the end result was that Windows XP would seemingly randomly lock up 1-3 times a day.

    Anyhow, my point here is that there are more than just two potential problems in a situation like this, and it surely is a PITA to troubleshoot them. General statements like this are, at best, only half correct.

    Granted, a professional-grade application had better be written properly, and for a $1,000+ graphics card, the drivers had better be written properly as well.

    As for the rest of your comments... very informative, thanks a bunch :) 
  • sma8, August 13, 2008 11:29 PM
    Quote:
    Not true. I can flash the BIOS on my 8800 GTX and it will run just like its workstation cousin. They are using the same hardware but handicapping the consumer card.


    BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because the BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.
  • eodeo, August 14, 2008 1:15 AM
    Quote:
    I wish I had saved my $3,500 in Quadro cards and purchased more 8800 GTs or even an ATI 4870 X2.


    I was trying to cut down on words and still get the main message across. Thus, I failed to mention that no "professional" application can use more than a single GPU; this means no CrossFire/SLI support and also no X2 card support. This doesn't mean that you can't use these setups with them, just that they won't utilize the extra GPUs. So only games are going to use all the extra power, and in truth, only games will need it.

    Quote:
    I would like to see some benchmarks with current GeForce 280/ATI 4870 cards vs. these pro cards under Vista DirectX 10.


    That would seem interesting, but as far as 3ds Max goes, D3D under Vista is horrible. My 8800 GTX uses less than 5% of its strength in Vista. It's so much slower that it's actually slow (!). This has to do with the fact that either 1) Vista uses DX10 primarily and Max 2009 just got full DX 9.0c support/features, or 2) Autodesk just did a poor job with the Max/Vista implementation.

    On the other hand, I've heard nothing but words of praise for AutoCAD under DX in Vista, so it's most likely option #2 above.

    Quote:
    This is not true. Improperly written applications, improper power (for whatever reason), and hardware compatibility issues are just three more. I can probably think of others.


    Yeah, but the thing is, all 3 things you mentioned are going to affect both gamer and pro cards equally. That just falls under "vis major" and there's nothing you can do about it. Debunking the belief that pro cards are inherently more stable was my only intent. Truth is, a squirrel can chew off your power line and crash your whole system, but it's not going to matter whether you have a Quadro or a GeForce inside :) 

    Quote:
    As for the rest of your comments... very informative, thanks a bunch :) 


    Good to hear :) 

    Quote:
    BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because the BIOS flash alone doesn't make it a Quadro. The only way to get it to work as a Quadro is by using RivaTuner. And once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.


    The whole point of what I wrote above is to show that you shouldn't want to turn your GeForce into a Quadro, even if you could. No reason to do so, really. Apples to apples, the pro variants are actually slower than their gaming equivalents, because manufacturers slow them down on purpose, for "stability".
  • eodeo, August 14, 2008 1:21 AM
    Quote:
    A few years ago, when a buddy of mine built his P4 system, we used a certain name-brand motherboard coupled with an ATI 9600 Pro that was factory overclocked.


    I find it very interesting that you chose to mention the ATI 9600 Pro. I just wrote about my old ATI card recently at the official Max forums, the Area, here:
    http://area.autodesk.com/index.php/forums/viewreply/78300/