Workstation-Shootout: ATi FireGL V7600 vs. Nvidia Quadro FX 4600

Last response: in Graphics & Displays
November 9, 2007 10:48:22 AM

The graphics card market for the workstation segment used to move at its own, more leisurely pace - until now

http://www.tomshardware.com/2007/11/09/workstation_shootout/index.html
November 9, 2007 1:42:05 PM

I am somewhat confused about how these tests were conducted. Why in the world have these tests been conducted with Maya 6.5? This iteration is years old, and subsequent releases have been heavily focused on increasing GUI/viewport performance. When testing the latest cards available, it would be smart to also test the latest available version of Maya. Another thing I don't get is why a 64bit version of Windows was not tested. Most workstations nowadays run with more memory than a 32bit OS can comfortably handle (combine that with the amount of memory found on these cards and you run out of address space quickly), which is why most 3D/DCC apps today have fully 64bit versions available. Maya has been available in 64bit for at least the last 2 full versions. To me the testing environment used here seems awfully shortsighted...
November 9, 2007 2:33:05 PM

I also don't totally understand the decision to use DirectX as the only reference in your testing. A large number of the robust 3D development apps use OpenGL, and that's why companies drop BIG $$ on cards like the ones being reviewed today. OpenGL needs to be a consideration when reviewing workstation cards.
November 9, 2007 2:40:22 PM

In terms of performance, how do these cards (and probably mainly their drivers) compare to their consumer-level counterparts? It's an age-old question that no one seems to take the time to answer, but seeing as the dies are the same these days, it's more valid than ever.

Also, what are the advantages these cards have, or what will a professional CAD or graphic designer gain by spending the extra money required for one of these models?

I've looked into this in the past and have really only found a couple of differences, such as (I think, along those lines) "hardware-accelerated lines", which were disabled on the chips of the consumer-level cards. But what benefit do "hardware-accelerated lines" provide anyway?

Also, occasionally there are "performance drivers" available; were any of these used in the test? (I know Nvidia has them for Quadro & Max.)
November 9, 2007 6:10:03 PM

KwyjiboNL77 said:
Another thing I don't get is why a 64bit version of Windows was not tested.


Did they mention that it was a 32bit version of Windows? :heink: 

I see no mention of that, but the drivers for the workstation cards are the XP64 versions of Cat and Forceware, so to me that says 64bit Windows XP.

As for the versions of software chosen: while there is a more current version of Maya, there is no accompanying version of test software from respected groups like SPEC, whose latest version is for 6.5;
http://www.spec.org/benchmarks.html#gpc

So just like 3DMark06 doesn't test the DX10 components of DirectX for gaming systems, there is no replacement for it yet. Until SPEC comes out with a newer version, for comparison purposes everyone is pretty much limited to what is available and globally accepted as a standard. I wouldn't trust someone's own model(s) and tests with U08; I want reproducible tests, and the Maya tutorials like Werewolf and Squid are getting old (they showed little difference even back in the GF7/X1K generation, when there were greater differences between the cards).

As for OGL, striekr, there are OGL tests there, so why do you say only DX?
It looks like they chose the DX version of 3DSMax because it adds variety and also because it is a growing component of the 3DSMax user base. It would be nice to compare both, as some people do, but really it's a judgement call.

Anywhoo IMO, the more information the better, so I'm happy for another review, but I do agree, hey I'd love the additional information of the OGL path for 3DSMax too.
Please Sir, may I have another benchmark/test. :whistle: 
November 9, 2007 6:57:12 PM

the_computer_dud said:
In terms of performance, how do these cards (and probably mainly their drivers) compare to their consumer-level counterparts? It's an age-old question that no one seems to take the time to answer, but seeing as the dies are the same these days, it's more valid than ever.

Also, what are the advantages these cards have, or what will a professional CAD or graphic designer gain by spending the extra money required for one of these models?


Aye, I totally agree, this must be addressed by someone out there, and THG very well could have done it with this review...

I know that the consumer level cards offer absolutely fine base functionality, and they all support OpenGL 2.0 these days, so I keep telling people that they don't need workstation cards (people like my girlfriend, a studying-to-be architect).

It's an important factor.

Plus, I'm pretty sure ATI and nVidia would ship a lot more silicon if the same card worked top to bottom for gamers, video editors, and engineers alike... right now, it just seems like they're abusing the segment with supposed exclusivity only realized in select applications, which they could easily support with lesser products.

...and, of course, just one test with a game and a workstation card. Just one. Please. Can my girlfriend, who needs to run Revit well, also play The Sims, or something more advanced, like Portal?

It's a strange world we live in where you can buy a combined digital camera/MP3 player, but you need one graphics card to design buildings and another to display buildings someone else designed in a video game.
November 9, 2007 7:15:35 PM

I totally agree that I'd love to see 3DMark06 for a Quadro with 768MB and an 8800GTX on an otherwise identical hardware/software platform.

My big theory is that companies buy Quadros mostly out of either elitism or ignorance rather than for any real performance advantage.

My other theory is that the reason no tech website ever publishes the results of a truly comparative performance test is political: they don't want to be the first to blow nVidia and ATI's most profitable scam.

In fact, I recollect some benchmarks maybe 4 or 5 years ago in which the then-current chipset in a Quadro ($thousands) and in a consumer video card ($hundreds) strangely showed the Quadro was marginally slower at gaming. Also, back in GeForce 2 days, there were instructions floating around the internet explaining how you could swap one or two resistors on a GeForce 2 and turn it into a Quadro (it would even identify itself as a Quadro at bootup, and the nVidia driver reported it was a Quadro too), so there's not (or wasn't) any actual hardware difference.
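The resistor-swap and softmod stories boil down to the driver branching on the PCI device ID the card reports rather than on any physical difference in the silicon. A minimal sketch of that kind of check (the sample `lspci -nn`-style line and the Quadro/GeForce ID split below are made-up for illustration; only the NVIDIA vendor ID `0x10de` is real):

```python
# Sketch: how driver-style product segmentation might branch on a reported
# PCI device ID. The sample line and the "Quadro ID" set are illustrative,
# not real ID tables.
import re

NVIDIA_VENDOR = 0x10DE  # NVIDIA's PCI vendor ID

def parse_pci_line(line):
    """Pull the [vendor:device] pair out of an lspci -nn style line."""
    m = re.search(r"\[([0-9a-f]{4}):([0-9a-f]{4})\]", line)
    if not m:
        return None
    return int(m.group(1), 16), int(m.group(2), 16)

def classify(vendor, device, quadro_ids):
    """Decide which driver path a card would get, based only on IDs."""
    if vendor != NVIDIA_VENDOR:
        return "not NVIDIA"
    return "workstation" if device in quadro_ids else "consumer"

# Hypothetical: pretend device 0x0191 is the Quadro-branded bin.
QUADRO_IDS = {0x0191}
sample = "01:00.0 VGA compatible controller [0300]: NVIDIA [10de:0193]"
vendor, device = parse_pci_line(sample)
print(classify(vendor, device, QUADRO_IDS))  # consumer
```

If the old resistor trick changed the ID the card reported, a check like this is exactly what it would have fooled.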
November 9, 2007 8:30:55 PM

...not to mention softmods to make 9800s into Fire cards...
November 9, 2007 8:47:02 PM

the_computer_dud said:

Also, what are the advantages these cards have, or what will a professional CAD or graphic designer gain by spending the extra money required for one of these models?


They gain a certification-compliant part, higher quality control, specialized drivers, and better customer support. The value of these varies depending on the end user; prosumers don't value those aspects as much as true professionals, most of whom don't even pay for these cards themselves. The company does, and they are either built into cost, amortized, or written off.

Quote:
Plus, I'm pretty sure ATI and nVidia would ship a lot more silicon if the same card worked top to bottom for both gamers, video-editors, and engineers...right now, it just seems like they're abusing the segment with supposed exclusivity only realized in select applications, which they could easily support with lesser products.


I doubt they'd sell any more silicon after the initial blip, and even that would likely end up being less than 1% of their total. However, if they removed the FireGL hardware requirement and offered all of those benefits, they'd lose a lot of their Quadro/FireGL sales, thus reducing their resource base for paying for those added workstation optimizations, etc.

Anyone wanting many of the benefits of both while reducing the price should look towards SoftGL / SoftQuadro solutions, which give you a lot of the best of both worlds, and you can switch back and forth pretty easily.
November 10, 2007 12:43:15 AM

Ilander said:
...not to mention softmods to make 9800s into Fire cards...

And you think a soft-modded 9800 will stand a chance?
November 10, 2007 2:34:52 AM

I was really looking forward to this article; in fact, when I opened up THG and saw this as the headline, I literally said out loud, "ABOUT FU**ING TIME!! WHOO HOO!!" But I'm a little disappointed in this. I would've liked to know how these cards benefit people who use AutoCAD and whatnot. My father is a drafter and uses AutoCAD religiously. I tried to recommend a workstation card to him and to the owner of the company he works for, but couldn't because I had no idea what to get. Oh well... maybe next time, I suppose.
November 10, 2007 1:10:44 PM

Softmodded 9800s are... well... obsolete. I only mentioned them as a reminder that in the past, we could do that. I heard they worked fine for their time period.

I think the more important factor than a workstation card is having a very beefy processor, unfortunately. Maybe workstation cards take some of that load off compared to a gaming card, but I'd probably rather spend $250 on a quad core and $100 on a graphics card with OpenGL 2.0 (and maybe squeeze the budget for 512 MB VRAM) than $1000 on even the Fire card... and I don't think spending $300 on a really crappy Fire/Quadro card would be worth it at all. Ever.
November 10, 2007 2:46:34 PM

well these freaking video cards are expensive!!!
November 10, 2007 4:47:07 PM

TheGreatGrapeApe said:
Did they mention that it was a 32bit version of Windows? :heink: 

I see no mention of that, but the drivers for the workstation cards are the XP64 versions of Cat and Forceware, so to me that says 64bit Windows XP.

etc...


Page 7, test setup:

Operating System: Windows XP Service Pack 2

Since XP64 has no SP2, and no special mention is made apart from it being XP, to me that means they haven't tested in XP64...
November 10, 2007 5:34:38 PM

I would have liked to see the V8650 vs. FX 5600 to know who holds the performance crown on the high end. Also I like it when they throw in a game benchmark for the heck of it.
November 10, 2007 7:42:47 PM

KwyjiboNL77 said:
Page 7, test setup:

Operating System: Windows XP Service Pack 2

Since XP64 has no SP2, and no special mention is made apart from it being XP, to me that means they haven't tested in XP64...


Service Pack 2 for Windows XP Professional, x64 Edition

http://www.microsoft.com/downloads/details.aspx?familyi...

You were saying...?
November 10, 2007 9:53:51 PM

What Emp said.
November 10, 2007 10:10:42 PM

justinmcg67 said:
But I'm a little disappointed in this. I would've liked to know how these cards benefit people who use AutoCAD and whatnot. My father is a drafter and uses AutoCAD religiously. I tried to recommend a workstation card to him and to the owner of the company he works for, but couldn't because I had no idea what to get. Oh well... maybe next time, I suppose.


It's rare that you see AutoCAD tests; even folks like 3DChips and 3D Professor, who test more than most, don't use AutoCAD much for testing.

For some people, Xbit's test may have the pieces they find missing in the THG article, with AutoCAD and 3DMark06 results in their test batch, but they haven't used a new FireGL, just the old X1800-based 7350, which is weak compared to the refreshes (the HD2600-based 5600 beats it). If you look at the AutoCAD situation, it's not really stressing the cards anywhere near as much as other 3D modeling tools.

http://www.xbitlabs.com/articles/video/display/quadro-fx5600-fx4600.html

Hope that helps; it would be nice to see at least a FireGL 5600 or 7600 in there for a more recent comparison of cards.
November 11, 2007 11:42:55 PM

For those of you confused as to why you would spend so much money on a workstation card instead of a gaming card for professional applications, here are a few reasons:

1. Certified to work with your application. That means you get accurate output, no driver-related crashes, and solid, repeatable performance.

I don't recall if this was mentioned in the article, but CAD software, more specifically solid modelers, are nothing like games. Solid models for engineering use are exact. All extrusions, holes, rounds/fillets are modeled and shown as such. Frame rates are not too important. In games, geometry need not be exact as long as it looks good. Looking good will not create dimensioned drawings of real parts with tolerances, etc.

2. Autodesk AutoCAD is 2D software for drawing lines. You can argue otherwise, but that's all it really is. 3D functionality was added over time and is still terrible. It is legacy software that still has a purpose, but it is not designed for 3D solid modeling. Autodesk Inventor, on the other hand, is. Inventor is very similar to SolidWorks. Both are mainstream solid modelers. There are also high-end modelers like Unigraphics NX.

3. D3D is becoming more accepted in the CAD world. Inventor R11 can use either D3D or OpenGL; Inventor 2008 uses D3D exclusively. It is important to test both because not all cards work equally well in both environments, and not all programs support both APIs. Previous-generation ATI cards were very slow in OpenGL but fine in D3D. nVidia cards perform equally well in both APIs.

4. Here's the major reason to get a workstation card: working with multiple files open. A gaming card will choke if you open 5 or more part/assembly/drawing files at once. Workstation cards do not. When you design things, that is the typical environment.

5. In business, time = money. Capital purchases are not so costly. I will spend $1000 on a gfx card if it will save me even 30 minutes per day. Productivity is worth much, much more than that. I used to have a gaming card in my workstation when I first started at my job because nobody knew any better. I opened a few parts, and the machine essentially locked up. Losing work and then trying to recreate even simple things takes a lot more effort than a measly $1000-2000. I spent over $20,000 a few weeks ago on some mirror holders. Graphics cards, even in the price range above, are cheap business purchases.

6. Again, workstation cards are not for gamers / home users. They were never intended to be.
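The time-is-money argument in point 5 is easy to put numbers on. A back-of-the-envelope sketch (the hourly rate and time saved are made-up figures, not from the post):

```python
# Back-of-the-envelope payback period for a workstation card premium,
# using illustrative numbers: an engineer costing $75/hour who saves
# 30 minutes per working day.
hourly_rate = 75.0           # $/hour, illustrative
minutes_saved_per_day = 30   # per working day, illustrative
card_premium = 1000.0        # extra cost over a consumer card, $

savings_per_day = hourly_rate * minutes_saved_per_day / 60  # $37.50/day
payback_days = card_premium / savings_per_day
print(f"payback in {payback_days:.1f} working days")  # payback in 26.7 working days
```

Under those assumptions the card pays for itself in well under two months of work, which is the point being made.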
November 12, 2007 12:44:31 AM

Well, gaming cards are designed to run demanding apps, but only ONE at a time. Most gamers aren't throwing guys into buildings in Crysis while sniping on another monitor in BF2. Plus, some workstation cards are massive, so they are inherently more expensive to make; not the massive price difference that you see, but still more expensive. Being in the smaller segment of the market forces up the price too.
November 12, 2007 2:25:30 PM

@Jimbo1234:

Sorry, but nothing you said there suggests to me that a consumer/gaming version of the same GPU won't do just as well.

>> I don't recall if this was mentioned in the article, but CAD
>> software, more specifically, solid modelers, are nothing like
>> games. Solid models for engineering use are exact. All
>> extrusions, holes, rounds / fillets, are modeled and shown as such.

You're missing the point. The fact that the CAD application internally represents the objects with great detail and precision has nothing to do with the graphics engine or hardware. When a complex object comes to be displayed, the engine will still use algorithms like hidden-surface removal to avoid having to render surfaces you can't actually see. In fact, there's probably far more image complexity in any frame of a game like Crysis than there ever is in something like the display of a car engine or whatever.
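To illustrate the point that display-side work tracks what is visible rather than how precise the underlying model is, here is a toy version of one common visibility trick, back-face culling (a simplified sketch in plain Python, not how any particular driver or engine implements it):

```python
# Toy back-face culling: a renderer can skip triangles whose fronts face
# away from the viewer, so on-screen work scales with visible surfaces,
# not with the full-precision model stored by the CAD kernel.
def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def facing_camera(tri, view_dir=(0.0, 0.0, -1.0)):
    """True if the triangle's front side faces a viewer looking along -z."""
    a, b, c = tri
    edge1 = tuple(b[i] - a[i] for i in range(3))
    edge2 = tuple(c[i] - a[i] for i in range(3))
    normal = cross(edge1, edge2)
    return dot(normal, view_dir) < 0  # normal points back toward the camera

# Counter-clockwise winding as seen from +z: front faces the viewer.
front = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
print(facing_camera(front))  # True
```

With the winding reversed the same triangle is culled, no matter how exact the coordinates are.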

Also, FYI, D3D does not calculate or show exactly correct perspective, etc. (although OGL does). So if engineering labs are using D3D more now, they've just blown your argument about needing precise graphics cards for CAD out of the water right there.

Also, for a couple of years I've worked in an office where engineers were using CAD. They NEVER take measurements directly off the screen, so precise graphical accuracy is a moot point anyway.

I still maintain that if you were to benchmark a CAD workstation using a Quadro, then again with an exactly equivalent consumer-level card (same GPU, RAM, and clocks), you would see exactly the same performance (notwithstanding any artificial crippling by the graphics driver based on detection of the video card's product ID).
November 12, 2007 3:14:49 PM

niz said:
@Jimbo1234:

Also, FYI, D3D does not calculate or show exactly correct perspective, etc. (although OGL does). So if engineering labs are using D3D more now, they've just blown your argument about needing precise graphics cards for CAD out of the water right there.

Also, for a couple of years I've worked in an office where engineers were using CAD. They NEVER take measurements directly off the screen, so precise graphical accuracy is a moot point anyway.

I still maintain that if you were to benchmark a CAD workstation using a Quadro, then again with an exactly equivalent consumer-level card (same GPU, RAM, and clocks), you would see exactly the same performance (notwithstanding any artificial crippling by the graphics driver based on detection of the video card's product ID).


I have used both pro cards and gaming cards for CAD work. Gaming cards are a waste of time. Like I mentioned before, having more than one part open at a time brings a gaming card to its knees.

From Autodesk's site: (http://www.inventor-certified.com/graphics//faq2.php?n=...)
"This is common with graphics cards/drivers that are not listed as having Full CAD functionality. Graphics cards/drivers designed for the CAD industry are designed to support a large number of OpenGL windows with only a very small loss in performance. Cards/drivers designed for the computer game market do not have this functionality. They may even be limited to just one hardware accelerated window."

That's the OpenGL argument. But for D3D accuracy there is still an argument as in the example below.

Measurements are made with software tools but are graphics-dependent. Inventor, for example, calculates a chamfer dimension in a drawing from what is displayed on screen. The accuracy of the display is important.

See the response from Autodesk (http://discussion.autodesk.com/thread.jspa?messageID=52...).

If you are a gamer and need to use the occasional CAD software in a tethered mode or have a student copy, a gaming card may suit your needs. But I need a workstation card at work to get things done. I do not play games at work, and as I said before, time is worth more money than an initial capital investment.
November 12, 2007 6:06:24 PM

Which gaming card did you try compared to which workstation card?
My guess is you're not comparing like for like.
We need actual data for gaming vs. workstation card with same amount of ram on both video cards, same GPU, same clockspeed etc.
example nVidia 8800GTX ( ~ $500) and Nvidia Quadro FX 5600 (~ $1500)
I still bet they're both the same.
November 12, 2007 7:49:08 PM

niz said:
Which gaming card did you try compared to which workstation card?
My guess is you're not comparing like for like.
We need actual data for gaming vs. workstation card with same amount of ram on both video cards, same GPU, same clockspeed etc.
example nVidia 8800GTX ( ~ $500) and Nvidia Quadro FX 5600 (~ $1500)
I still bet they're both the same.


I don't remember what gaming card it was, but the ATI FireGL V3100 was much better. I am now using an nVidia FX1500. The nVidia card is faster in OpenGL and has fewer driver issues in the /3GB mode in XP Pro. Soon it will be time to upgrade.

DirectX is exclusively used in Inventor 2008 under Vista, not XP; I forgot to mention that earlier. This is because there is no OpenGL support in Vista, as far as I know. Supposedly DirectX 10 addresses CAD issues. But if your software is not supported on Vista or does not support D3D, then you really do not have much of a choice.

This is a good discussion from Autodesk on the topic and another from MCAD Online.
http://discussion.autodesk.com/thread.jspa?threadID=483...
http://www.mcadonline.com//index.php?option=com_content...

So the latest-gen gaming cards (8800) on Vista with D3D-enabled CAD software should work. However, with the limited development time dedicated to this so far (about 4 years), I'd stick with XP and OpenGL on a workstation card. I have had problems with D3D in Inventor 11 SP3. Once we get 2008 in the office, I'll see how things go.

It would be nice to be able to use a cheap gfx card and have a few extra gigs of memory, or a faster CPU.

On another note, CAD software is much more complex than you think. There are tolerances, material properties, physical constraints, and all detail all the time. A truck engine in Crysis is just the external detail, not individual pistons, valves, bearings, cams, rods, constraints determining position and relative motion, etc. In a CAD model, everything is modeled. Assemblies are also not just engines but entire cars with thousands of parts that, as the MCAD article says, need to hold water, not just look like they do. How about a model of a Boeing 787? The model must hold tolerances to 0.0001 inches or better on many parts, in an assembly that can be as large as a football field. So until there is data to support that there is absolutely no reason to use a workstation card versus a gaming card, guess what I will continue to use?
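The 0.0001-inch figure is a genuine numerical constraint: for coordinates spanning a football field, single-precision floats cannot even represent steps that small, which is one reason CAD kernels compute in double precision. A quick sketch using Python's `struct` module to round-trip a coordinate through 32-bit float:

```python
# At football-field scale (~4300 inches from the origin), 32-bit floats
# cannot resolve a 0.0001-inch tolerance: the spacing between adjacent
# float32 values near 4320 is about 0.00049 inch.
import struct

def to_f32(x):
    """Round a Python float (double) to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

coord = 4320.0            # inches, roughly a football field from the origin
nudged = coord + 0.0001   # one tolerance step away

print(to_f32(coord) == to_f32(nudged))  # True: the step vanishes in float32
print(coord == nudged)                  # False: double precision keeps it
```

The graphics card can get away with single precision because, as noted above, nobody measures off the screen; the modeling kernel cannot.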
November 12, 2007 9:00:49 PM

Ah... so a gaming card probably would be best for my (occasionally) Inventor-using girlfriend when she's firing up some architectural work at home, so long as she knows not to open multiple buildings at once.

Jimbo...you've been very helpful.
November 13, 2007 8:19:52 AM

emp said:
Service Pack 2 for Windows XP Professional, x64 Edition

http://www.microsoft.com/downloads/details.aspx?familyi...

You were saying...?


Sorry, my bad... I've been on Vista since Jan 2007 (migrated from XP64), so this SP release passed me by.

But the article still doesn't specifically detail which version of XP was actually used. If you guys can make that out from the driver numbers, bravo, but I haven't got 15-digit driver release numbers imprinted in my memory, and I feel less than inclined to verify each driver number on ATI's website to figure out what version they were using.

As an article, I don't think it does a very good job of telling what current hardware does with current software. Even if they did use XP64, the Maya version they tested has no native 64bit support (Maya is the app I'm most interested in seeing the results for), and it has not seen the big GUI performance overhaul more recent versions have had. A great deal of attention in the article is given to DX performance, but I don't think DX has (or should have) any relevance to professional 3D apps. OpenGL is the standard, and however much Microsoft and Autodesk might want otherwise, DX should not be the focus of attention for future hardware development, as that would severely limit the choice of platform (cutting both Linux and OSX completely out of the loop).
November 13, 2007 1:57:28 PM

I will be purchasing a new PC soon and would like to get a recommendation on the graphics card I should get.

I currently have a Quadro FX 500/600, and I use Adobe Premiere and After Effects. I should also mention I have a dual monitor setup, and will have a 24" (1920 X 1200) and a 20" (1280 x 1024) with the new rig. I've been happy with the performance of the card in these applications, but I can't play any of the latest games at high graphics settings/resolutions.

Are the newest workstation cards better with the latest games, or do I even need a workstation card for these applications?

November 13, 2007 2:43:59 PM

Well, we mostly don't know how the newest WS cards work with gaming, but the gaming cards work OK with CAD-type applications, so long as you're only working on one thing at a time.

Thus, for a home-workstation, unless it's going to be your main one, I'd recommend a new "gaming" card.
November 13, 2007 2:45:51 PM

dellman said:
I will be purchasing a new PC soon and would like to get a recommendation on the graphics card I should get.

I currently have a Quadro FX 500/600, and I use Adobe Premiere and After Effects. I should also mention I have a dual monitor setup, and will have a 24" (1920 X 1200) and a 20" (1280 x 1024) with the new rig. I've been happy with the performance of the card in these applications, but I can't play any of the latest games at high graphics settings/resolutions.

Are the newest workstation cards better with the latest games, or do I even need a workstation card for these applications?


These applications are 2D editing and compositing applications and don't benefit in the slightest from a dedicated professional 3D accelerator card. It would be better to look for a card with the best hardware video playback and codec support, and, if necessary, capture support if you don't have a dedicated solution for that (although most sources would be digital nowadays anyway, either through FireWire or USB 2.0). OpenGL performance should not be your primary focus here, and I think any mid- to high-end consumer card will serve you just fine (just check for hardware AVC/H.264 if you can).
November 13, 2007 3:35:53 PM

I appreciate the comments, but what do you make of this page from adobe.com

http://www.adobe.com/products/aftereffects/opengl.html

It appears After Effects, at least, can and does benefit from an OpenGL 2.0 card. I can see where I might get by with a lower-end card like a 550FX over a 4000FX, but what do I lose by going with a "gaming" card?

Does anyone here use these applications with a non-workstation card?
November 13, 2007 3:45:31 PM

KwyjiboNL77 said:
If you guys can make that out from the driver numbers, bravo, but I haven't got 15 digit drive release numbers imprinted in my memory, and I feel less than inclined to verify each driver number on ATI's website to figure out what version they were using.


Me neither;
http://img215.imageshack.us/img215/8499/googleyk0.gif


Quote:
A great deal of attention in the article is given to DX performance, but I don't think DX has (or should have) any relevance on Professional 3D apps.


You missed out already on that discussion obviously.

Quote:
OpenGL is the standard and however much Microsoft and Autodesk would want to, DX should not be the focus of attention for future hardware development, as that would severely limit the choice of platform (cutting both Linux and OSX completely out of the loop).

As if OSX matters for this market. As for Linux, it's a part, but still not the majority, not by a long shot; so whatever arguments you have about DX being a small market apply to Linux, and especially OSX, even more so.

Quote:
Quote:
I currently have a Quadro FX 500/600, and I use Adobe Premiere and After Effects.


These applications are 2D editing and compositing applications and don't benefit in the slightest from a dedicated professional 3D accelerator card. It would be better to look for a card with the best hardware video playback and codec support,


Excuse me? He's not asking about just a random NLE here or Premiere alone; he's asking about After Effects, which, like a few others, DOES use graphics power for 3D modeling, and in a similar fashion to the other examples we use here.

Quote:
and, if necessary, capture support if you don't have a dedicated solution for that (although most sources would be digital nowadays anyway, either through FireWire or USB 2.0). OpenGL performance should not be your primary focus here, and I think any mid- to high-end consumer card will serve you just fine (just check for hardware AVC/H.264 if you can).


It doesn't sound like he has trouble importing the material, so a capture card doesn't solve the problem he described, which is balancing the workstation side with the gaming side.
November 13, 2007 3:57:42 PM

dellman said:

It appears After Effects, at least, can and does benefit from an OpenGL 2.0 card. I can see where I might get by with a lower-end card like a 550FX over a 4000FX, but what do I lose by going with a "gaming" card?

Does anyone here use these applications with a non-workstation card?


I haven't looked into After Effects with the new batch of 2K cards, but it stands to reason that an HD3850-3870 will give you a great balance of gaming performance as well as good support for After Effects. The only thing is that you may have to wait for driver support initially. The GF8800GT is likely about equally attractive, although with slightly higher power/heat concerns.

I would suggest waiting to see what happens when the HD38xx series comes out and then getting some feedback from the Adobe forums from people who've tried them and then go with whichever gets the most positive feedback. Both should be up to the task as even Adobe's own information mentions support for the gaming cards right out of the box. Just expect that with any new hardware there might be growing pains.

If you don't care about power/heat/cost concerns, then a GF8800GTX or GTS-640 would be a pretty solid 'well known' option, just not as attractive IMO as the GF8800GT or HD38xx series cards; it looks like you would have to wait for those, though (GT supply is low, and the HD3850/3870 launch in 2 days).
November 14, 2007 8:53:04 AM

TheGreatGrapeApe said:
Me neither;
http://img215.imageshack.us/img215/8499/googleyk0.gif

Quote:
A great deal of attention in the article is given to DX performance, but I don't think DX has (or should have) any relevance on Professional 3D apps.


You missed out already on that discussion obviously.

Quote:
OpenGL is the standard and however much Microsoft and Autodesk would want to, DX should not be the focus of attention for future hardware development, as that would severely limit the choice of platform (cutting both Linux and OSX completely out of the loop).


As if OSX matters for this market. As for Linux, it's a part, but still not the majority, not by a long shot; so whatever arguments you have about DX being a small market apply to Linux, and especially OSX, even more so.

Quote:
Quote:
I currently have a Quadro FX 500/600, and I use Adobe Premiere and After Effects.


These applications are 2D editing and compositing applications and don't benefit in the slightest from a dedicated professional 3D accelerator card. It would be better to look for a card with the best hardware video playback and codec support,


Excuse me? He's not asking about just a random NLE here or Premiere alone, he's asking about After Effects, which like a few others DOES use graphics power for 3D modeling, and in a similar fashion as the other examples we use here.

Quote:
and, if necessary, capture support if you don't have a dedicated solution for that (although most sources would be digital nowadays anyway, either through FireWire or USB 2.0). OpenGL performance should not be your primary focus here, and I think any mid- to high-end consumer card will serve you just fine (just check for hardware AVC/H.264 if you can).


It doesn't sound like he has trouble with importing the material, so a capture card doesn't solve the problem he described which is balancing the workstation side with the gaming side.

Sorry, buddy, but you seem somehow offended by my comments, so you feel the need to debunk them point for point. Please read my comment again before you try to 'debunk' everything I said. I said a DEDICATED PROFESSIONAL 3D ACCELERATOR is not needed for these programs; any mid- to high-end consumer card offers excellent OpenGL 2.0 support for these apps. Why pay 500 bucks or more for a watered-down Quadro when you can get an 8800 or ATI equivalent with more OpenGL performance, just as good or better 2D support, and perfect gaming support?

You are talking about Linux and OSX being in the minority here, but you miss the point. Yes, all these cards have excellent DX performance, but in production environments OSX and Linux are a lot more common than you might think, and nixing support for these platforms in favor of DX (only to serve the 3ds Max crowd, it seems) would be disastrous. I personally think the DX performance of these cards is more a by-product of their close relationship to the equivalent gaming chip, not a sure sign the market is moving toward DX (again, that would be a big mistake IMO). Ditching the open-source, multiplatform OpenGL standard for a Windows-only, MS-owned proprietary standard is not healthy for the industry.

As for the capture remark, Delmann is obviously looking for the best compromise between work and gaming. It would be stupid not to at least touch on capturing, unless he has that sorted (like I said in my original comment: if you don't have a dedicated solution for that, although most would be digital anyway nowadays, either through Firewire or USB 2.0). The fact that Delmann didn't ask about it doesn't mean that capture support couldn't be beneficial in his quest for finding a good 'compromise' card.

And I am deeply sorry if I didn't have the wherewithal to google driver release numbers to figure out information that should have been in the article to begin with (XP64 is only 4 characters, for Christ's sake!).
November 15, 2007 1:33:49 AM

Quote:
Sorry, buddy, but you seem somehow offended by my comments, so you feel the need to debunk them point for point.


Not offended; it just makes it easier when people reply to points instead of generalities, especially when clarifying things like people's assumptions about the OS. I quote your points to make it clear what I'm replying to, and I clarify statements to make sure the information someone grabs is as good as we can make it, for future reference if someone searches for this or looks at the thread later.

Quote:
Please read my comment again before you try to 'debunk' everything I said. I said a DEDICATED PROFESSIONAL 3D ACCELERATOR is not needed for these programs;


I did READ your comment. Did you?
The above is NOT what you said, and since you accuse me of not reading it, I'll type it back to you so you can re-read it.

"These applications are 2D editing and compositing applications and don't benefit in the slightest from a dedicated Professional 3D accelerator card."

That's not the same as "a workstation card is not required"; that's saying they don't benefit at all because they are 2D. However, they do benefit, maybe not more than with a gaming card, but that's not what you said. Put an FX500 up against an Intel Extreme Graphics IGP, do you think it would benefit or not?

I don't say he requires a $500+ card, and in fact my recommendation obviously shows that's not the direction I'm pointing him in, but you say it won't benefit at all, which is just wrong.

Quote:
Ditching the open-source, multiplatform OpenGL standard for a Windows-only, MS-owned proprietary standard is not healthy for the industry.


That's irrelevant. This isn't a DX vs. OGL / M$ vs. OpenSauce / future-of-apps article, it's a workstation graphics card article, and DX tests have every bit as much business being there as OGL ones. Like I said, I would've preferred more tests, including both the DX and OGL versions of 3ds Max, but that's up to the author, and hopefully for a future article.
You're more concerned about your niche / OS battle than about getting more information.
It's not that there is no OGL, or not enough; it's that there is any DX at all, as you so clearly say: "but I don't think DX has (or should have) any relevance on Professional 3D apps", regardless of the fact that it is used, and more extensively now than before. You seem to think articles should have a pre-determined slant to promote one thing or another, not just test what they can and provide that information. Give me more information, not less.

I wouldn't want an all-OGL article any more than an all-DX article, because the rigs I have to concern myself with at work run BOTH, and I want more information regardless of which banner is waving behind the apps/API.

Quote:
As for the capture remark, Delmann is obviously looking for the best compromise between work and gaming.


Which actually has nothing to do with a capture card.
He talked about gaming in addition to the Quadro he already has, and mentioned he was happy with its performance in the work apps; it was the gaming component he was asking about. What capture mechanism does he lose by going from a Quadro FX series card to a gaming card that would require the mention of a capture card? If anything, he gains unneeded capture features on many of the gaming options. Once again you're misreading what's written and replying to something else.

Quote:
And I am deeply sorry if I didn't have the wherewithal to google driver release numbers to figure out information that should have been in the article to begin with (XP64 is only 4 characters, for Christ's sake!).


I agree it should be in the article, but instead of asking for clarification or looking for it yourself, you criticized the author/article for using a 32-bit OS when it looks like they didn't. So despite your sarcastic apology, perhaps in the future it would be best if, instead of hitting 'reply' with a knee-jerk response, you tried Google first.
November 15, 2007 1:40:39 PM

WOW.....sorry to start such a vitriolic discussion. First, let me clear up a couple of things:

I am using a MiniDV camera, so capture capability on the card is not important.

Some of the projects I work on in After Effects are in 3D space, but most are 2D.

I did say I was happy with the performance of my current rig, but I should note that I'm not working in HD yet. I would like to upgrade my camera in the near future and want to be able to edit without problems. Also, I can't say I'm really happy with the time it takes to transcode my finished video to mpg for DVD burning. I would love for this to be faster.

I kind of cleared up some of my own confusion by reading the OpenGL list on Adobe's website a little more closely. It clearly lists the GeForce Series 8 cards as being acceptable. I had convinced myself that the gaming cards did not have any OpenGL capability.....my mistake.

Now, a few more questions:

I am currently using Vista, and will be with the new rig. Am I screwed already when it comes to playing games?

AMD has released their new mid-range card. Anybody care to comment on it vs. the 8800GT as they pertain to my needs?

Despite all the comments, which I appreciate, no one has addressed the issue of using the current crop of workstation cards to play games. How do they perform?
February 24, 2008 11:23:36 PM

Jimbo1234 said:
For those of you confused as to why you would spend so much money on a workstation card instead of a gaming card for professional applications, here are a few reasons:

1. Certified to work with your application. By that it means that you get accurate output, no driver related crashes, and solid repeatable performance.

I don't recall if this was mentioned in the article, but CAD software, more specifically solid modelers, is nothing like games. Solid models for engineering use are exact. All extrusions, holes, rounds/fillets are modeled and shown as such. Frame rates are not too important. In games, geometry need not be exact as long as it looks good. Looking good will not create dimensioned drawings of real parts with tolerances, etc.

2. Autodesk AutoCAD is 2D software for drawing lines. You can argue otherwise, but that's all it really is. 3D functionality was added over time and is still terrible. It is legacy software that still has a purpose, but it is not designed for 3D solid modeling. Autodesk Inventor, on the other hand, is. Inventor is very similar to SolidWorks. Both are mainstream solid modelers. There are also high-end modelers like Unigraphics NX.

3. D3D is becoming more accepted in the CAD world. Inventor R11 can use either D3D or OpenGL. Inventor 2008 uses D3D exclusively. It is important to test both because not all cards work equally well in both environments and not all programs support both APIs. Previous generation ATI cards were very slow in OpenGL, but fine in D3D. nVidia cards perform equally well in both APIs.

4. Here's the major reason to get a workstation card: working with multiple files open. A gaming card will choke if you open 5 or more part / assembly / drawing files at once. Workstation cards do not. When you design things, that is the typical environment.

5. In business, time = money. Capital purchases are not so costly. I will spend $1000 on a gfx card if it will save me even 30 minutes per day. Productivity is worth much, much more than that. I used to have a gaming card in my workstation when I first started at my job because nobody knew any better. I opened a few parts, and the machine essentially locked up. Losing work and then trying to recreate even simple things takes a lot more effort than a measly $1000-2000 is worth. I spent over $20,000 a few weeks ago on some mirror holders. Graphics cards, even in the price range above, are cheap business purchases.

6. Again, workstation cards are not for gamers / home users. They were never intended to be.
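Point 5's time-is-money argument can be put in numbers with a quick break-even sketch. The hourly rate below is a placeholder assumption, not a figure from the thread:

```python
# Break-even sketch for "I will spend $1000 on a gfx card if it will
# save me even 30 minutes per day". HOURLY_RATE is an assumed loaded
# engineer cost, not a number anyone in this thread quoted.
CARD_COST = 1000.0            # dollars, workstation card premium
MINUTES_SAVED_PER_DAY = 30.0
HOURLY_RATE = 60.0            # assumption: $/hour

daily_saving = MINUTES_SAVED_PER_DAY / 60.0 * HOURLY_RATE
break_even_days = CARD_COST / daily_saving

print(f"${daily_saving:.0f}/day saved; the card pays for itself in "
      f"about {break_even_days:.0f} working days")
```

At that assumed rate the card recoups its cost in well under two months of work days, which is the whole point being made.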


I have to agree and disagree with you here....

I understand what you're saying, but you're missing two major schools of thought here with respect to AutoCAD and gaming & professional cards...

1. AutoCAD is widely used for getting models started, such as large buildings, landscapes, million-dollar homes, etc. Some firms don't have the money to invest in a new piece of software every time a different project comes along, so AutoCAD has definitely taken on a "jack of all trades" feeling. The modeling isn't fantastic, but it still demands a lot from graphics cards. I've used machines that had trouble keeping a basic colour solid over some basic W-shape steel framing, and on other high-powered machines I've had gutters showing through 4 walls in a very large residential project when orbiting around the view. Furthermore, AutoCAD is a gateway to other programs like 3ds Max and Viz, where AutoCAD users in a company can create a model with the software they're more productive in, then use Max or Viz to import the model, clean things up, and put on some good materials that don't look like the Fisher-Price brand (AutoCAD's downfall).
So we need to see benchmarks for practical everyday work with AutoCAD 3D models.

2. Consumer vs. professional cards. Saying "professional cards are better" is like saying apples are better than oranges. Where do you draw the line? Yes, if you're working with more than 5 drawings, awesome. But what if you aren't? Where do you draw the line?
Where does using the 8800GT or GTX start to become less economical than buying the next-best professional 3D card, and what is the cost difference? An even better question: what's the performance difference, again keeping in mind you may only be using, say, 1-3 models at a time?
Is the 8800GT comparable to the FX1500 or the FX3500? Is the 8800GTX comparable to the FX4500 or the FX5500? It's practically all the same hardware, so we need benchmarks to answer these questions, and until you get a good selection of both kinds of cards through the same run of tests you'll never know.

We all want to see the numbers run on this for the same reason, money. Where does the high end of the consumer gaming card bleed into the professional series of cards for the same set of applications.

Right now all you have to run on is your gut feeling, and the hope that Nvidia and ATI aren't robbing small and medium-sized companies for the sake of a driver and networked support from the graphics company and the software company.

As consumers (aka small business owners) you're even more screwed. It's like being led along blindfolded and told to buy the most expensive card you can afford for modeling. If it doesn't do what you want, whether a gaming card or a professional card, the explanation you always get is: spend more money next time.

Somebody please step up to the plate and compare consumer and professional cards side by side. Let the users decide what's best for OUR needs.
Derek.
February 25, 2008 12:14:45 AM

Ilander said:
Aye, I totally agree, this must be addressed by someone out there, and THG very well could have done it with this review...

I know that the consumer-level cards offer absolutely fine base functionality, and they all support OpenGL 2.0 these days, so I keep telling people that they don't need workstation cards (people like my girlfriend, a studying-to-be architect).


It's an important factor.

Plus, I'm pretty sure ATI and Nvidia would ship a lot more silicon if the same card worked top to bottom for gamers, video editors, and engineers alike... right now, it just seems like they're abusing the segment with supposed exclusivity only realized in select applications, which they could easily support with lesser products.

...and, of course, just one test with a game and a workstation card. Just one. Please. Can my girlfriend, who needs to run Revit well, also play The Sims, or something more advanced, like Portal?

It's a strange world we live in where you can buy a digital camera/mp3 player but you need one graphics card to design buildings and another to display buildings someone else designed in a video game.

I have built PCs that were aimed solely at running Inventor, SolidWorks, etc. There is virtually NO performance difference that you can see or feel between workstation and normal graphics (non-integrated, i.e. 8800GT, etc.). I have, however, noticed that the more video RAM and system RAM there is, the better it will be.
August 31, 2008 2:02:24 AM

Ok time for more proof...

In case nobody was watching: since version 2007, AutoCAD completely changed its 3D engine to a newer model that Autodesk was using in the 3D Studio software (check out the materials editor; they are very similar). It can even support the 3D Helix, Sweep, Loft and other nasty, get-ready-to-crash-your-PC 3D modeling tools.

Using Catalyst magazine's C2008 benchmark program for AutoCAD 2009 (32- and 64-bit) on Windows Vista (32- and 64-bit installs), I have been testing 3 cards (ATI FireGL V7600 512 MB; ATI Radeon HD 4850 512 MB; Nvidia Quadro FX 1700 512 MB, which came with the tower at base + $400) on a Dell quad-core Xeon purpose-built workstation (T7400 series) with 4 GB and 8 GB of memory. I found that there are definite performance differences between 32-bit and 64-bit setups. Here's the lowdown:

First of all, consider the future. Windows Vista did not support OpenGL until later, so Autodesk didn't either, leaving Direct3D as the only hardware-accelerated option. Bummer for all us OpenGL fans of workstation-class cards; perhaps another casualty of the 64-bit evolution.

32-bit and 64-bit perform the same on the 2D, disk, and CPU indexes. In fact, all the possible combinations above perform the same, more or less. So for those, the PC made all the difference, not the GPU. Now let's talk 3D...

On average the 3D portion completed in 70 minutes on 64-bit versus 85 minutes on 32-bit. That's about 20% faster test time just from running the 3D tests in 64-bit, so all the tests worth looking at were performed in 64-bit. Let's talk cards...
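For what it's worth, the speedup from those two run times can be computed directly; the exact percentage depends on which time you treat as the baseline, and the ~20% figure sits between the two:

```python
# Speedup from the quoted run times: 70 min (64-bit) vs 85 min (32-bit).
t64, t32 = 70.0, 85.0

# Relative to the slower 32-bit baseline:
faster_vs_32bit = (t32 - t64) / t32 * 100   # ~17.6%
# Relative to the 64-bit time (the other common way to quote it):
faster_vs_64bit = (t32 - t64) / t64 * 100   # ~21.4%

print(f"{faster_vs_32bit:.1f}% vs {faster_vs_64bit:.1f}%")
```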

The Radeon HD 4850 ($229 at Fry's in-store) vs. the FireGL V7600 ($699 upgrade cost on the tower) turned in EXACTLY THE SAME TOTAL INDEXES. But there were 4 levels of the test, gradually harder. In the first 3 levels the Radeon slightly outperformed the FireGL; in the 4th and hardest phase the FireGL held a consistently powerful performance while the Radeon dropped to a third of its normal strength. That's the only argument for workstation-class cards that's worth listening to. Now for the curveball...
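That "same total, different level-4 behaviour" point is easy to miss, so here is a toy sketch with made-up scores (not the real C2008 numbers, and exaggerated for effect) showing how a summed index can hide a collapse on the hardest level:

```python
# Made-up per-level scores (NOT the real benchmark numbers), chosen so
# that two cards tie on the summed index while diverging badly on the
# hardest level 4, mirroring the behaviour described above.
radeon = [125, 122, 110, 43]   # ahead on levels 1-3, collapses on level 4
firegl = [100, 100, 100, 100]  # steady throughout

assert sum(radeon) == sum(firegl)  # identical total index...
print("level 4:", radeon[3], "vs", firegl[3])  # ...very different worst case
```

The moral: if your workload looks like level 4, the total index tells you nothing; look at the per-level breakdown.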

Nvidia's Quadro FX 1700 initially performed significantly slower than the 2 ATI cards (as if it were running in 32-bit mode), as expected given its generation placement. Then Nvidia created a special driver just for AutoCAD (an HDI driver that AutoCAD has to load on its own). This driver surprised me: it made the Quadro FX 1700 perform 2 times FASTER than the 2 ATI cards!!! (still in the 4 GB setup).

So does more memory mean more power? I plugged in 4 more GB from another Dell T7400. My Windows Vista performance rating for memory went from 5.1 to 5.2; not much there. The bench tests all performed the same. So despite the massive arrays created by the test and the time spent to display, save, add, remove, change properties, etc., the extra memory didn't add anything. Which brings me to 2 possible conclusions: either 4 GB was MORE than enough for the test, or nothing knows how to actually use the other 4 GB added (a disappointment of the 64-bit extra-memory hype).

As for me and my mass computer purchases coming up... my company is moving from 2D uses of AutoCAD to 3D uses of AutoCAD and Revit. If I can get past the next few years of PC evolution using a cheaper card, then I will. I can save $600 easily by going with a gaming-class ATI Radeon because of the Direct3D issue, though I would stick to the underclocked version for its added stability (just get MORE MEMORY on the video card for LESS MONEY to increase performance). However, with the SLAP in ATI's face from the Nvidia performance driver written directly for AutoCAD, I am just as willing to get a 'lower-end' Quadro FX to save the money and get the 2x performance in the few programs we use.

It's time for this industry to test the workstation cards side by side with the gaming cards...

jvj