
FirePro V3900: Entry-Level Workstation Graphics

Tags:
  • Workstations
  • Graphics Cards
  • Quadro
  • Graphics
March 16, 2012 4:00:03 AM

AMD's new FirePro V3900 is the company's low-profile, entry-level workstation graphics card, priced to compete against Nvidia's Quadro 400. Today we're putting it up against that card and five other professional and desktop graphics cards.


March 16, 2012 6:33:37 AM

If a large difference between a workstation card and a gaming card can be the drivers, does this apply to gaming performance as well? Does the workstation GPU perform similarly to its desktop equivalent, or better thanks to 'better' drivers? Or is it just a case of the drivers being optimized for things that don't apply to gaming, so any performance increase only shows up in non-gaming applications?

Just curious :) 
Score
5
March 16, 2012 7:02:45 AM

I'm curious about some things. Can you put one of these cards in a PC next to a 560 Ti used for gaming, then swap the output on the back of the PC and select the workstation card to render in Max or Maya? Or do you have to reboot every time you change the video output?
Score
3
March 16, 2012 8:42:53 AM

Can these cards run... THAT game? Or any other GPU-intensive game?
Score
5
March 16, 2012 9:02:20 AM

Both cards run on the same hardware; it's just that professional video cards have their drivers optimized for CAD/CGI, etc. It's like two identical SUVs with the same engine but different tires: one with plain road tires and the other with snow tires. The SUV with snow tires will certainly run better on snowy terrain than the one with road tires.

CAD apps like AutoCAD run better on professional video cards because of the optimized code in those cards' drivers. Gaming video cards' drivers, by contrast, are optimized for games, not for CAD apps.
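The per-application driver optimization described above can be sketched as a toy profile table. To be clear, this is purely an illustration of the idea; all names and settings here are invented, and real drivers implement this in far more sophisticated ways:

```python
# Toy sketch: a driver keeps per-application optimization profiles and
# applies one based on which executable is running. All names invented.

PROFILES = {
    "acad.exe": {"antialiased_lines": True, "two_sided_lighting": True},
    "maya.exe": {"antialiased_lines": True, "two_sided_lighting": True},
    "game.exe": {"antialiased_lines": False, "two_sided_lighting": False},
}

# Unknown applications fall back to the generic (gaming-style) path.
DEFAULT = {"antialiased_lines": False, "two_sided_lighting": False}

def profile_for(process_name: str) -> dict:
    """Return the optimization profile a driver might apply."""
    return PROFILES.get(process_name.lower(), DEFAULT)

print(profile_for("ACAD.EXE"))  # CAD app gets the pro-oriented settings
```

The same silicon thus behaves very differently depending on which code path the driver selects, which is the "same SUV, different tires" point above.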
Score
3
March 16, 2012 9:49:54 AM

It would be nice to see one or two games thrown into the test.
Just for the heck of it, and also to answer the question:
- Which card is the better choice for my workstation if I'd also like to run a game or two during the lunch break?
Score
8
March 16, 2012 10:51:30 AM

I wonder, is there going to be a new budget version out soon based on AMD 7xxx series?
Score
3
Anonymous
March 16, 2012 10:52:49 AM

These clowns need to be brought into court for this intentional crippling of desktop GPUs and price fixing of workstation cards.

This travesty needs to stop.
Score
-4
Anonymous
March 16, 2012 10:54:08 AM

Exactly. With how often the question is asked, "How well will this or that pro card perform in games?", I can't believe at least one or two game benchmarks weren't included.

I'd especially like to see some benchmarks on mid-range pro cards.

Also, same question as above: can I use a professional CAD graphics card alongside a gaming card and get CAD benefits on one monitor and gaming on the other?
Score
9
March 16, 2012 10:58:36 AM

MarriedMan wrote: "Exactly. With how often the question is asked, 'How well will this or that pro card perform in games?', I can't believe at least one or two game benchmarks weren't included. I'd especially like to see some benchmarks on mid-range pro cards. Also, same question as above, can I use a professional CAD graphics card alongside a gaming card and get CAD benefits on one monitor and gaming on the other?"


Unless your motherboard supports switching PCI Express slots off via software, you can't. Even if it did, you would need to restart. And knowing AMD driver compatibility and reliability, I wouldn't even hope for it at the moment. If you are gaming a lot and doing a lot of 3D, the question is what matters more to you: games or 3D content creation? If you are just a beginner doing CAD for fun, you will get by with a gaming GPU. Otherwise, you must be making money on your projects, and you should be able to afford a mid-to-high-end GPU for CAD.
Score
2
March 16, 2012 11:45:59 AM

Nice article and thank you!

Holy cow, you weren't kidding when you said 'entry level'; this is more like 'impoverished level.'

To me, entry level means sub-$400 cards: the Nvidia Quadro 2000 series and AMD FirePro V5800. Obviously, pro GPUs are tailored for their use.
Score
2
March 16, 2012 12:07:17 PM

This is just to make more money. I'm pretty sure they could merge both drivers (obviously a bigger driver, then), containing the code to optimize both gaming and CAD-related work, since both GPUs use the same hardware.
Score
1
March 16, 2012 12:28:38 PM

Microgoliath wrote: "This is just to make more money. I'm pretty sure they can mix both drivers (obviously going to be a bigger driver then) containing both codes to optimize both gaming and CAD-related stuff, since both GPUs use the same hardware."


The key word is support. Try to reach support with your 7-series gaming GPU, then try the same when you are a professional CAD user with a CAD-dedicated FirePro.
Score
2
March 16, 2012 12:42:56 PM

These cards are great for general office users who have multiple monitor setups.

Score
2
March 16, 2012 12:50:46 PM

This should have included some sort of Blender benchmark. Not everyone who uses a workstation is a CAD designer; some of us do modelling and rendering with Blender.
Score
3
March 16, 2012 1:25:20 PM

It's good to hear that Tom's is going to add workstation graphics cards to its charts; I hope they will include the high-end ones, such as the Nvidia Tesla.
Score
2
March 16, 2012 2:08:01 PM

fuzznarf wrote: "This should have included some sort of Blender benchmark. Not everyone who uses a workstation is a CAD designer. Some of us do modelling and rendering with Blender."


Blender is a free tool. I doubt AMD would spend money to optimise for freeware.
Score
-1
March 16, 2012 2:26:20 PM

edvinasm wrote: "Blender is a free tool. I doubt AMD would spend money to optimise for freeware."

It's not about optimization for a free tool; the cost of the tool isn't relevant. It is probably the most used tool in the graphical modelling/rendering world, hence a benchmark would be nice. Like I said, not everyone is building 3D engineering schematics with CAD.
Score
2
March 16, 2012 3:09:52 PM

edvinasm wrote: "Blender is a free tool. I doubt AMD would spend money to optimise for freeware."

Blender may well be a free tool, but it is amazingly powerful, and many large companies use it with their own UIs and plugins for very large projects. It is used for everything from movies to video game design, and it would be very nice to see how it stacks up.
Score
3
March 16, 2012 3:09:55 PM

Joshkorn beat me to it; this is perhaps the one time where the question "but can it play ?" is relevant. I suspect, however, that a pro doing design work on a system containing one of these (or perhaps a more upstream workstation card) isn't likely to have much trouble affording an entirely separate rig nearby just for games.
Score
3
March 16, 2012 3:18:01 PM

Interesting article, but kind of strange as well. These cards are not really made for doing design work as much as for managers and other non-techs to view other people's projects for review. Still, I enjoyed reading the article and would love to see followups on higher-end products. I would especially love to see gaming GPUs compared with their workstation cousins. I know many workstation cards are very similar hardware that is underclocked for stability, or with ECC, and simply have a different driver, while other architectures in the past have been vastly different from the more civilian cards.

Still, if you are making any amount of money doing this kind of work, I am pretty sure you would be spending a minimum of $250 on your card, and likely somewhere in the $500-1000 range, because it is the bottleneck of your productivity and the main factor determining how many projects a person can do in a year.

Lastly, I would love to see how this card scales on different hardware, to see how much was the $100 GPU versus how much was due to running a dual-CPU setup ;) Something tells me that most computers this particular card goes in are very small desktops with 4 GB of RAM and a dual-core CPU.
Score
0
March 16, 2012 4:16:19 PM

caedenv wrote: "Lastly, I would love to see how this card scales on different hardware to see how much was the $100 GPU vs. how much was due to running a dual-CPU setup. Something tells me that most computers this particular card would go in are very small desktops with 4 GB of RAM and a dual-core CPU."


Not so much. When rendering and encoding, memory is the second, and sometimes the largest, portion of the budget. My current rendering rig sports 8×4 GB with a Quadro 400 and an old dual-core Opteron. Memory is vital for things like realistic liquid rendering; high-quality renders eat large chunks of memory and will crash and fail without enough of it. Many times, before I upgraded to 32 GB, I would wake up in the morning after 8 hours of crunching only to find my render worthless. If I were to get a new rig, which I am currently considering (hence my initial request for Blender benchmarks), I would get at least 64 GB of memory, preferably 128 GB. The second-biggest item would be the video card. Some things are just impossible without huge memory, regardless of processor.
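A back-of-the-envelope sketch shows why volumetric renders like liquid simulation eat memory so fast. The channel count and byte sizes below are illustrative assumptions, not Blender internals; the point is the cubic growth:

```python
# Rough memory estimate for a resolution^3 fluid/volume grid.
# 6 float channels (e.g. density, temperature, velocity x/y/z, pressure)
# at 4 bytes each are assumed purely for illustration.

def sim_memory_gb(resolution: int, channels: int = 6, bytes_per_value: int = 4) -> float:
    """Memory in GB for a resolution^3 grid with `channels` float fields."""
    cells = resolution ** 3
    return cells * channels * bytes_per_value / 1024 ** 3

for res in (128, 256, 512, 1024):
    print(f"{res}^3 grid: {sim_memory_gb(res):6.1f} GB")
```

Doubling the resolution multiplies memory by eight, which is why a render that fits at 512³ can be hopeless at 1024³ no matter how fast the CPU is.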
Score
3
Anonymous
March 16, 2012 5:29:44 PM

It would be nice to see in-app, real-world comparisons, and Cinebench 11.5 as well. I only say this because I've heard SPECviewperf makes certain OpenGL calls that aren't actually commonplace in most of these apps but that the pro drivers support; certain OpenGL calls, mind you, that the gaming cards definitely don't support and that slow things down considerably.

Maybe a frames-per-second test using a high-poly scene with high-res textures, an image plane, and an animated character. I know Maya will give you an FPS heads-up display; I'm not sure about some of the other apps. If it's Maya, maybe include some Paint Effects. It would better simulate real-world results.

It's just been my experience, testing gamer cards versus pro cards with real animation scenes, that the actual performance difference is much closer than SPECviewperf will lead you to believe. At least in Maya; I can't speak for some of the other OpenGL apps.
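The FPS heads-up-display style of measurement mentioned above boils down to a simple timing loop. Here is a minimal sketch; `draw_frame` is a stand-in for whatever the viewport actually does, not a real Maya call:

```python
import time

def draw_frame():
    # Stand-in workload for drawing one frame of the scene.
    sum(i * i for i in range(10_000))

def measure_fps(frames: int = 50) -> float:
    """Time `frames` iterations of the draw loop and return frames/second."""
    start = time.perf_counter()
    for _ in range(frames):
        draw_frame()
    elapsed = time.perf_counter() - start
    return frames / elapsed

print(f"{measure_fps():.1f} fps")
```

Measured this way on a real scene, the viewport frame rate reflects actual interactive performance far better than a synthetic viewset score.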
Score
1
March 16, 2012 7:48:57 PM

fuzznarf wrote: "Not so much. When rendering and encoding, memory is the 2nd, sometimes the largest, portion of the budget. [...] Some things are just impossible without huge memory, regardless of processor."


My high school tried to cheap out on computers' RAM for Autodesk's 3D rendering. Needless to say, it did not work out well.
Score
2
March 17, 2012 12:32:19 AM

I've put BF2 on my laptop (FX 3700M) and run it maxed out on the machine's 1920×1200 display. It gave similar results to the Nvidia consumer card with the same GPU. You get about the same performance in games, but I've noticed the colors look... different. Not better or worse, just different; often it seems like the picture is saturated, or something similar. The difference is night and day in software that is optimized for Quadro cards: a 400-dollar pro card beats any consumer card out there by a wide margin.
Score
0
March 17, 2012 8:20:06 PM

Articles like this make sense to knowledgeable people, but they only add to the regular user's confusion! I am upgrading an older PII 3.2gb to either an i3 or i5 setup, but I don't need any gaming capabilities. I have a new, still-unopened GeForce GTX 550 Ti and am buying the other parts, such as a Z68 board, as I go. So, if I read between the lines, this 550 Ti, which has 1 GB of GDDR5 memory compared to the 1 GB of GDDR3 on the test cards, should run as well as the workstation card or the low-end Quadro cards???

I should clarify: I do some CAD and some, though small, amounts of rendering, so I am under the impression that anything I do at this time will be a significant improvement over my 7-year-old desktop? But I mainly want to improve the general use of my computer; I'm not returning to full-time graphics use! Any input is appreciated, especially as I am leaning toward the i3 2120 after reading the CPU forums here.
Score
1
March 17, 2012 8:24:09 PM

Sorry, P4 3.2 GHz processor; I don't know how to edit the original post.
Score
0
March 18, 2012 12:43:11 AM

A Bad Day wrote: "My high school tried to cheap out on computers' RAM for Autodesk's 3D rendering. Needless to say, it did not work out well."


My high school does 4-hour video renders using Adobe Premiere Pro and After Effects on a machine with 2 GB of RAM and a 32-bit OS on a dual-core Pentium processor.

I had thought the days of "out of RAM" errors were long gone, but they proved otherwise.
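A quick back-of-the-envelope on why the x86 (32-bit) OS mentioned above makes the RAM ceiling so low, regardless of installed memory:

```python
# A 32-bit process can address at most 2**32 bytes, no matter how much
# RAM is installed; in practice each process usually gets even less
# (commonly around 2 GB of usable user address space on 32-bit Windows).

address_space_gb = 2 ** 32 / 1024 ** 3
print(address_space_gb, "GB theoretical ceiling per 32-bit process")
```

So even upgrading that machine's RAM wouldn't help much until the OS and applications move to 64-bit.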
Score
0
March 18, 2012 6:10:24 PM

djjoejoe wrote: "If a large difference between a workstation card and a gaming card can be the drivers, does this apply to gaming performance as well? Does the workstation GPU perform similar to the desktop equivalent, or higher thanks to 'better' drivers? [...] Just curious."

Yes. That's why some people even make modded, unofficial drivers that provide even more performance.
In fact, a graphics card can even be crippled by its driver on purpose by the manufacturer, for example to force consumers to buy more expensive or more profitable cards.
But the difference isn't only in the drivers. Yes, the drivers are the main difference, but the hardware isn't identical either.
For the pricing, you have to understand the business model. You get huge support and flawless drivers, and the price of the component isn't much of an issue. Just think of Avatar or some other computer-generated movie that requires lots of GPU power: the GPU price in the entire process is something like 0.0000001% of the total budget. If those cards cost twice as much, they would still sell in the same quantities.
These cards are not for the average user, nor even for special or high-end consumers. They are created specifically for a professional market. In fact, most PCs equipped with these cards will run only a single program in their entire lives.
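The budget argument above is easy to sanity-check with rough numbers. Both figures below are assumptions for illustration, not actual production costs:

```python
# Illustrative only: even a very expensive pro card is a vanishing
# fraction of a big production's budget.

card_price = 3500            # assumed high-end pro GPU price, USD
production_budget = 237e6    # assumed big-film budget, USD

share = card_price / production_budget
print(f"{share:.6%} of the budget")  # a few thousandths of a percent
```

Whatever the exact figures, the conclusion holds: for a studio, card price is noise next to support and reliability.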
Score
0
March 18, 2012 7:17:23 PM

Nice review... now let's compare some mid-to-top-end cards :)  Also do the same with gaming cards, but test them against CAD-oriented cards... I would love to see how Nvidia compares with AMD's top-end cards... I am looking to build a new system with Nvidia's Quadro 6000 or Quadro Plex 7000... thanks in advance :)
Score
0
Anonymous
March 18, 2012 8:23:32 PM

I agree you need tons of memory for rendering, but an old dual-core Opteron?? If I were doing 8-hour renders, I would definitely upgrade the CPU class. Even a single-socket Xeon 2687 would probably cut your time down to minutes. At the very least, you wouldn't have to go to sleep to find out you ran out of memory.
Score
0
March 19, 2012 10:37:25 AM


Do not use professional cards for gaming. The driver optimisations give woeful performance
compared to proper gaming cards. Gamer cards are called as such for good reason. Likewise,
pro cards are best for pro tasks. The optimisations in the drivers are completely different, eg.
2-sided textures in gaming scenarios vs. anti-aliased lines in pro tasks.

Note though that, for Viewperf specifically, excessive CPU/compute power is not required to
obtain good benchmark scores with a particular card. I've tested a Quadro 600 with a wide
range of systems, from simple dual-core i3 up to dual XEON X5570 Dell T7500. The best results,
for the Viewperf suite that is, were obtained with an overclocked i3 550 @ 4.7GHz. See:

http://www.sgidepot.co.uk/misc/viewperf.txt

Note that the scores are better than those given for the Quadro 600 in this article.

Impressive data from the V3900 though. AMD have done a good job there.

So, if you're operating on a budget, and doing tasks that mirror what Viewperf tests (it's
not realistic for all scenarios), then you don't necessarily need big compute power (a multi-
CPU workstation is a waste in that sense), but good RAM is wise and definitely use a pro
card, not a gamer card. I've tested various pro cards for gaming tasks and the results were
not pleasant (3DMark06 mainly). See:

http://www.sgidepot.co.uk/sgi.html#PC

What Viewperf does not test are tasks that place significant demands on both CPU and GPU
resources, and RAM as well, such as GIS, medical imaging, and others involving huge datasets.
As always, test with your intended task; don't rely too much on benchmark numbers alone.


To design concepts, you'll get the best general performance from an overclocked Clarkdale,
not the i3 2120 which is fixed. Or for additional multi-core rendering speed, use an oc'd 2500K or
equivalent such as an i7 870 (2500K is cheaper now though), but an i3 550 will cost 60% less which
means additional resources can be devoted to a better GPU (the usual tradeoff to consider). I have
additional results here:

http://www.sgidepot.co.uk/misc/tests-jj.txt
http://www.sgidepot.co.uk/misc/aebench.html

Ian.
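Ian's observation that an overclocked dual-core can beat a dual-Xeon box on Viewperf-style workloads fits a simple Amdahl's-law picture: if the driver/submission path is largely single-threaded, clock speed wins over core count. The 90% serial fraction below is purely an assumed illustration, not a measured figure:

```python
# Amdahl's-law sketch: relative throughput vs. a 1-core baseline.
# serial_fraction = share of the work that cannot be parallelized.

def speedup(serial_fraction: float, cores: int, clock_ratio: float = 1.0) -> float:
    """Throughput relative to 1 core at clock_ratio = 1.0."""
    parallel = 1.0 - serial_fraction
    return clock_ratio / (serial_fraction + parallel / cores)

serial = 0.9  # assumed mostly-serial, Viewperf-like workload
print("8 cores, stock clock:   ", round(speedup(serial, 8), 2))
print("2 cores, +40% overclock:", round(speedup(serial, 2, 1.4), 2))
```

Under that assumption the overclocked dual-core comes out ahead, which matches the pattern in the Viewperf results linked above; a heavily parallel renderer would of course flip the conclusion.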

Score
1
March 19, 2012 1:11:22 PM

Dear M.Wallossek and all of Tom's fine staff

We spend a lot on "professional" GPUs in our organization but have trouble finding pertinent and up-to-date benchmarks. This has caused many arguments around the water cooler.

May I suggest:
- Perform the same benchmarks (nice selection here)
- Add the complete lineup of current AMD and Nvidia offerings
- Add the best of the previous-generation cards so we can compare against the "older" benchmarks
- Add a couple of expensive gaming GPUs to see the gain/loss we get by going pro
- Add a couple of GPU-intensive games just for good comparison

Doing this about once a year, with a "Best Pro Graphics Card for the Money," would be an instant bookmark reference for us.

Have a nice day!
Score
1
March 19, 2012 1:38:57 PM


jaylimo84, I already have some data which answers some of your questions. See my references above.

Ian.

Score
1
March 19, 2012 2:49:08 PM

The image quality is especially interesting to me, as I'd like to get into doing some 3D art (it's really caught my interest lately, and looks so much cooler than the boring spool drawings I do). I assumed that the sheer power of a couple of gaming cards would make up for the lack of driver support, but the poor image quality is making me reconsider what I'm going to have to do.
Score
0
Anonymous
March 19, 2012 4:07:23 PM

To jaylimo84: these benchmarks don't paint the whole picture. I'd rather see Tom's take the time to measure some real-world, actual performance in Maya or similar than spend it running older cards through these SPECviewperf benchmarks, especially if you're going to include gaming cards in the comparison.

Spleenbegone, just know that a lot of OpenGL-based pro apps, like Maya, don't get any benefit from SLI'd or CrossFired cards. 3ds Max would if run in DirectX mode, though, I believe.

Score
0
March 19, 2012 5:57:55 PM


spleenbegone, good point about image quality. Years ago when I first tested a GF4 Ti4600 vs. an
Octane2 V12, the GF4 was way quicker, but its image quality was garbage compared to SGI's V12.
Similar differences still apply today. Again, this is why pro cards are best for pro tasks.

Btw, there are plenty of Quadro cards available 2nd-hand. I obtained my 3rd Quadro600 last week
for just 85 UKP, works fine (normal new price is about 140 to 150 UKP, though note there's one on
eBay UK atm for 95 BIN + 10 shipping). Also easy to obtain are FX 5500s, 4500s, 4600s, etc., though
of course newer cards like the 600 have a newer feature set even if they have less performance or
fewer other options (no SLI with the 600).

Ian.

Score
0
March 19, 2012 5:59:13 PM


Oh, by 'similar differences still apply today', I did of course just mean between modern gamer
vs. pro cards. :) 

Ian.

Score
0
March 19, 2012 9:19:45 PM

Nice review. All the questions about gaming performance are hardly relevant; this card belongs in a workstation computer, in an office, used for professional work. There really aren't that many scenarios where you would do PROFESSIONAL CAD work AND serious gaming on the same workstation. But to answer the question, I'd expect the FirePro/Quadro to perform WORSE than (or at best, equal to) its gaming cousins.

I use AutoCAD for a living, and my work PC has a Quadro 600 (along with 16 GB of RAM, an SSD, and a single quad-core Sandy Bridge Xeon), so this review is good for seeing how it compares to other cards. It's also nice to see it alongside the equivalent gaming cards, as many people are sceptical about the benefits of spending 4x the price on what is essentially the same hardware.

But I suppose the true test is against a gaming card of the same budget, as opposed to one with the same hardware. That could be interesting!
Score
0
Anonymous
March 19, 2012 11:39:34 PM

Thank you for this article, very useful.
Score
0
March 20, 2012 2:35:45 AM


DavC, that's a good point about budget, and is why I mentioned the results I obtained using an
oc'd Clarkdale, which gave the best Viewperf scores. If one's task is akin to the Viewperf tests
(whichever one that may be), then spending a lot on a 4-core or 6-core CPU may be a waste,
especially the Xeon variants, which are so expensive. A cheaper CPU like the 550 may be just as
good if not better, especially once oc'd, with the spare budget spent on a better pro card.
Have you run Viewperf 11 on your system? If so, please compare to my results; I'd be interested
to know what you observe. My data:

http://www.sgidepot.co.uk/misc/viewperf.txt

I'll be testing with a 2700K next week.

Ian.



Score
1
March 20, 2012 4:19:28 PM

I'd like to see how these OpenGL-optimized workstation cards handle an OpenGL-based game such as Second Life. As a content creator for SL, I *do* use my PC for "both professional work and gaming".
Score
0
March 20, 2012 6:15:42 PM


Probably not very well. OpenGL-based games still use techniques that are not common in pro apps,
so again the driver optimisations won't be, well, optimal, I suppose the word would be.

Ian.

Score
0
March 21, 2012 7:50:30 PM

Not bad at all, but I would like to see some FireGL-spec APUs as well. Such versions in laptops would sell like hot cakes.
Score
0
March 22, 2012 12:04:44 AM

This and all other Quadro-like graphics cards are a joke, one that stopped being funny some 10 years ago.

Poor souls who actually have to work with the likes of CATIA and Pro/ENGINEER... Personally, I blame the developers of that software for leaving it in the 20th century.

At least I can comfort myself knowing I'm a 3ds Max user and will not have to deal with OpenGL's hell of poor graphics and even poorer performance any time soon.
Score
-1
Anonymous
March 22, 2012 1:05:51 AM

I have successfully run a pro GPU alongside dual gaming GPUs. In my case, I used an 8800 GTX SLI setup and added a Quadro 600 to it on an Nvidia 680i motherboard (CPU: QX6700 @ 3.73 GHz, 4×2 GB DDR2-800, 4-4-4-12). This unlocked the professional options, including the 10-bit display pipeline through the Quadro's DisplayPort, confirming that the drivers were more or less unlocked by the presence of the Quadro.

Additionally, I attempted to play Skyrim on the Dell U3011 I used to test the 10-bit capabilities (2560x1600, medium settings), and I can confirm that there is significantly more lag when playing it on the Quadro's display compared to the 8800 GTX's DVI port. Upon further research, it seems that it is possible to render on the gaming GPU and then copy the result to the professional GPU's framebuffer, but aside from this statement my experiences do not reflect this "best of both worlds" scenario.

Also, I should mention that I have since returned the Quadro 600, I used both the 276.42 Quadro and 295.73 desktop drivers successfully, and I have no further interest at this time to continue down this path. In all likelihood I won't revisit this article to answer any questions.
Score
0
March 22, 2012 11:45:03 AM

Great reading, as always; thanks to Tom's for great reviews that include pro graphics hardware. Thanks also for confirming that you "can't" easily flash a gaming card these days; as I understand it, the same applies, or it's even impossible, for Nvidia pro cards. In all, it is not advisable and not worth the effort.

I run both a Quadro 600 and a GeForce 460 on the same machine. It works as expected in Maya, and I can play games at the fps expected of a 460, as reported on the internet, including Tom's Hardware. The Quadro 600's manual even advises installing the professional drivers first; if you had older pro or consumer drivers, uninstall those before installing the new pro drivers.

I've also tried older, less demanding games on the Quadro 600, and it played them well, though not newer, demanding games.

As an example of my experience with both pro and gaming cards in the same machine: I use Maya 2011 with no issues, always stable, on a Dell 2410 monitor. I also have Batman: Arkham City and Lost Planet 2, among other games, and all run at the expected fps on the GeForce 460 on another 24" Dell.

If quality, comfort, and stability in professional apps are your goal, always try a pro GPU. Today there are reasonably powerful entry-level pro cards to choose from.

I started at the lower end so I could gain knowledge at low cost, because I was starting to work on game engines (so gaming mattered as much as the pro environment) and because I wanted the convenience of both environments on one machine. Ultimately, I installed a copy of Windows 7 on a new Samsung SSD and isolated Maya on that drive to speed things up, so I can dual-boot depending on what I am going to do.

With everything installed in the same machine, I can say I am very happy, and my next machine is going to pair a mid-priced Quadro with a mid-priced GeForce. Next time I hope to include a bigger SSD so I can install more apps, or maybe I will use the SSD as a cache alongside a regular mechanical drive.

My machine: Asus mobo, 2600K CPU, 16 GB Corsair memory, 64 GB Samsung 830 SSD, 1 TB Western Digital, Quadro 600 and GeForce 460, Windows 7 Pro 64-bit, and a lot of professional 3D and gaming software.
Score
0
March 22, 2012 12:03:23 PM

When you're making 3D models, CAD, etc., and your life/income depends on these workstations and GPUs, you want to have support available, which makes this worth every last penny, even though for the same price you could get a Radeon 6870 or something that would obviously outclass this GPU in most situations.
Score
0
March 22, 2012 12:09:56 PM

SO TOMS, I have an assignment for you (or at least what I think is a great idea):

How about matching one of these GPUs up with a new entry-level Opteron 3200-series chip and seeing how it compares in workstation tasks to an Intel i3/i5/i7 and a GPU at the same price point, to see what architects/3D-modeling users are better off investing in? I personally don't care much whether the 3DMark score is better; as this article emphasizes, workstation tasks are the point!
Score
0