Hi guys, I was always wondering how a comparison of these cards with gamer cards would look, and whether the gamer ones wouldn't give us, as we say, "a lot of music for a little money". Would we get similar results in the field of geometry transformation, without using any anti-aliasing or any of the really unnecessary features? Thanks
I would also like to know how these cards handle games. Especially, I'm interested in how the mobile versions of the Quadro FX work with games. I would also like to play on my work laptop with its Quadro FX chip (FX 3600M).
I was pretty much wondering how a GeForce 8800 GT or ATI 4750 would stack up. Given that the lowest prices in Denmark for a V3600 or FX 570 start at 1150 kr and an 8800 GT 256 costs only 680 kr (4750 @ 1070 kr), they're comparable in the budget department. We're mostly doing basic grid stuff at work, but we're using Intel 945 chipsets with onboard graphics for that - the Pentium Dual-Core seems to handle it okay, but it isn't happy about textures at all. For those things, however, we use old P4s with GeForce 6600 GT cards - not Quadro anything - and I can't help but wonder whether using gamer cards is the right or wrong choice (it has worked great for us so far).

P.S. We use Inventor, Mechanical Desktop and Revit from Autodesk (newest and second-newest versions only), so only the 3ds Max results are interesting for me.
Where are the Quadro FX 3700 and 4700? I don't see how you can do a review dated 13 August that seemingly covers all current mainstream cards and not include two of NVIDIA's primary cards. Was this paid for by ATI?
Hello. Thank you for the interesting article. What is also interesting is the huge gap between SPEC results and my day-to-day experience with ATI pro cards. When the first SPEC results showed up, I started recommending that my clients buy FireGL 3600-8600 cards, but unfortunately they were VERY poor performers in 3ds Max work. Apart from the fact that you couldn't even use half of their strength with the first drivers, even now, comparing the FPS count of the V5600 with that of a 9600 GT in 3ds Max shows it is much more comfortable to work with the latter. If you want to move a large project around in your viewport, it moves a lot faster with a 9600 GT installed. THIS is a benchmark I would like to see here, because IT REALLY MATTERS to animators. Thank you
Wish this article had been published about three days ago; it would've helped with my decision on which new laptop to get for school (I'm an engineering major, so I actually will use this). I finally decided on an HP with the FireGL V5600 - looks like I made the right choice based on these benchmarks. Guess we'll see when I actually get it and try it in application.
"I would also like to know how these cards handle games. Especially, I'm interested in how the mobile versions of the Quadro FX work with games. I would also like to play on my work laptop with its Quadro FX chip (FX 3600M)."

I also have a Quadro FX 3600M in my laptop. In 3DMark06, I get a score of 8800. COD4 and UT3 run smoothly with all settings maxed out at 1900x1200. Quadro cards are as good at gaming as, or better than, their GeForce equivalents.
First off, I have to say that this has been a very good review. Naturally, I have several things I'd like to complain about. I mean, as readers, it's our God-given right to complain and never be satisfied, right?

The good: you exposed and explained each GPU very nicely, noting each one's gaming counterpart, included specifications for each card, and commented on them in high detail.

So if everything is so well and good, why do I complain? Simple: the tests. You're using the SPECheatTest. It's well known that this test is optimized to show that even the crippled "workstation" cards outperform the far superior (hardware-wise) "gaming" cards. The fact is, at least 3 of the programs you tested here today don't reflect the "bell" conditions posed by the SPECheatTest. Well, actually it's 2: you haven't tested AutoCAD, and I can't really comment on the other applications, as I am not familiar with them. The 2 applications I am familiar with, and fully competent to speak about, are 3ds Max and Maya.

What do these 2 have in common, other than being under the same roof now? DirectX support. ANY application supporting DX is not being crippled in the drivers when running it. You will see huge leads for ATI cards over NVIDIA here. But I'm getting ahead of myself.

In all the tests, you failed to mention that OpenGL is horridly slow, even on these "professional" cards. This is, of course, in the case where you can choose between OGL and DX in the same application. Not only is it slow, but it's visually incomplete, as it lacks functions for quality display of lighting and shadow conditions that only DX 9.0c can display. I'm willing to forgive you the last one, as maybe you only tested the thing and didn't check for visual quality differences. That's not to say that the SPECheatTest you used can display these real-life conditions.

So with these 2 things in mind, it's easy to see that only idiots, or people unable to use DX, would use OGL instead. To be fair, you did say that NVIDIA's 8800 GTX...
erm, I'm sorry, I meant the Quadro FX 5600 - is the best OGL card, and I agree. NVIDIA uses archaic logic in their hardware, and OGL fits that pattern perfectly. So it's no surprise that NVIDIA should win. If you have to use OpenGL for any reason, NVIDIA is your man... erm, company.

If, however, you're not stuck on the appalling Mac platform or with archaic software that doesn't support DX, it should be mentioned that ATI has a significant lead. Not surprising either, since it has 3x more shaders at the same price. Games can't use these well most of the time, but that is not the case with digital content creation (DCC) programs like 3ds Max, Maya or AutoCAD.

Speaking of games: you did mention that the 8800 GTX (= FX 5600) is 2 generations old. You failed to mention that the latest GTX 280 has 240 unified processors, and you failed to add it to the tests. Not that I think the SPECheatTest would show that it's 2x faster, but as a matter of fact, it is - for any of the 3 above-mentioned programs, and likely all others that support DX (possibly OGL too, but I'm not sure how crippled that is in the drivers - more on this later).

Which brings me to the ultra high end of the DCC world: ATI's latest 48x0 cards. These have 800 unified shaders. They are just wiping the floor with all the cards mentioned in your article put together(!). All of them together (if that were possible) don't have enough power to compete with even a single new card from ATI.

You conclude that ATI is the best deal at $1000, but you fail to notice and differentiate between outdated OGL programs and new D3D ones. So the ATI card is the absolute, undisputed winner for CURRENT DCC. The crown can, by no stretch of the imagination, go to NVIDIA - unless you state that it's exclusively for outdated, OGL-only programs, in which case it does get it. Also, I'd again like to mention that the fact that you're using the SPECheatTest isn't helping you build your case either.
And in addition to all this, you also failed to mention that you can have a 10x more powerful card for less than $300: the HD 4870. The difference is solely in the drivers - and not even in every part of the drivers, but just in the OGL implementation.

Which finally brings me to drivers. All the professional cards are 99% the same as their gaming equivalents. They differ only in drivers. You said so, and I agree. What you failed to mention is that the professional cards are actually noticeably slower than their gaming equivalents. For stability, they say. I challenge anyone to prove that gaming cards are less stable.

There are 2 reasons for card instability: 1) inappropriate cooling, or 2) poorly written drivers. Slowing down the card will make it produce less heat and thus (in theory) be more stable. The problem with this nice theory is that NO DCC program available today can stress any of the cards mentioned here beyond their framebuffer capacity. Chances are your CPU will choke waaay before any of the cards do. This is due to the simple fact that viewports, like 95% of other things today, can utilize only a single CPU core.

Which again brings me back to drivers, as the only other cause of instability. Here is another interesting fact you might not have known: the people writing drivers for these 99%-identical cards don't do it twice. The process may vary, but in a nutshell it's like this: they write the drivers for the gaming card. At that point the work splits to target 2 different software types, either games or DCC software. The DCC driver path goes through more testing on the DCC software, and in 99% of cases it's done there and published. It will be thoroughly tested to verify that there aren't any major bugs, and then shipped.

The gaming driver path does not end there. The driver programmer has one more duty: to cripple performance under what he deems to be "professional" software(!!!).
So to reiterate: the driver programmer, instead of perfecting the drivers to be better, actually sits down and starts writing code to CRIPPLE(!?!) the gaming line of cards. One would imagine he could spend his time employing his talents elsewhere.

So, crippled or not, the drivers are 99% the same. If instability afflicts one line of cards, the other isn't spared by "superior" drivers. So, in reality, the workstation cards are no more stable than their gaming siblings, even if many would like you to believe they are.

In conclusion, I'd also like to nitpick the fact that you used very low resolutions for testing, capping at 1600x1200. As you might have guessed, anyone interested in working in 3D will start at that resolution as a minimum, not end at it. This is not a serious oversight, though, as you have been using the SPECheatTest to test everything, so your results should be taken with a grain of salt anyway.
Also, why test AutoCAD for 3D use? Inventor is the Autodesk 3D modeling product. AutoCAD is so 10 years ago.
Hi! Sorry to bother you, but as you and the readers explain, "pro 3D" and "gamer" cards are about the same. And since you will only get one per system, it would be VERY INTERESTING to test the "pro 3D" and "gamer" cards with all the 3D AND game benchmarks you have. Because when you buy a workstation for personal use (I know a lot of friends doing this), you also like to relax by playing a game! And I'm pretty sure you will see that the current top-end "gamer" cards (with standard drivers) are much more bang for the buck than their old "pro 3D" equivalents (with specialized drivers).

Another example is Apple, Dell or HP selling you Quadro FX cards in systems designed for... video and multimedia work. Why? I'm pretty sure a gamer card would do the same job (or better) for less money.

fredsky
And also a comment about "certified drivers" and "high-end 3D applications":

1) As @eodeo wrote, "Chances are that your CPU will choke waaay before any of the cards do. This is due to a simple fact, that viewports, like 95% of other things today, can utilize only single CPU core." This is such a SHAME, man!!! Come on, we've got 4 viewports and plenty of cores. Autodesk and others, WAKE UP and use one core per viewport!!!

2) I used to work on certifying Autodesk 3ds Max with NVIDIA cards/drivers. In fact, you have to take all the drivers available for the Quadro and test them (script-driven benchmarks), and then you approve or reject each driver, quality-wise. So you get OLD hardware and also OLD drivers... such expensive work to end up with CRIPPLED things.

fredsky
This should have been tested in Vista 32-bit/64-bit. I know a lot of CGI artists who are using Vista with DX10. Almost all professional apps run under Vista DirectX 10, e.g. AutoCAD 2009 and Inventor 2009. I have 5 Quadro 1500 cards running under XP 32-bit and Vista 64-bit. My one gaming machine with an 8800 GT OC card under Vista 64-bit kills all my other computers running Quadro cards in AutoCAD 2009, 3ds Max 2009 and Inventor 2009.

Pro cards are of no use anymore, as they are optimized for OpenGL and most software manufacturers are removing or limiting support for OpenGL - case in point, Vista. I wish I had saved my $3,500 spent on Quadro cards and purchased more 8800 GTs or even an ATI 4870 X2. Note: there's not much difference between the Quadro 1500 and the Quadro 1700 in real-world comparison.

I would like to see some benchmarks with current GeForce 280/ATI 4870 cards vs. these pro cards under Vista DirectX 10.
Oh, and one more thing: where is the Cadalyst 2008 benchmark for AutoCAD? It can be downloaded from the link below.
http://www.cadalyst.com/benchmark
[citation]Not true I can flash the bios on my 8800 GTX and it will run just like it's workstation cousin. They are using the same hardware but handicapping the consumer card.[/citation]

BIOS flashing to a Quadro doesn't really work for the GeForce 8 series, because the BIOS flash alone doesn't make it a Quadro. The only way to get it working as a Quadro is by using RivaTuner - and once again, no BIOS will turn it into a Quadro; NVIDIA has made sure that is no longer possible. Yeah, they learn fast from their previous mistakes.