Help Choosing a New Graphics Card for Maya + Photoshop + Gaming

Posted in Graphics & Displays
realmisr
April 6, 2012 4:02:39 PM

Hey guys,

A bit of a dilemma here. I'm trying to get a new build going in the next 1-3 months, and I'm deciding on the graphics card best suited to my needs.

I'm a professional freelance digital artist (I work from home). I mainly use Photoshop, Maya/ZBrush/RealFlow, and sometimes game.

That said, I'd put my priorities at 40% Photoshop, 40% CAD, and 20% gaming. The most intensive gaming I'd be doing is Diablo 3, so nothing too crazy.

Since I use Photoshop a lot, ATI is out of the question as far as CUDA goes. I'm thinking a pro card is the best option here... I'm stuck between a Quadro 4000 and the newest GeForces. I've heard mixed reviews of both... I want to believe a Quadro 6000 is worth the money, but I don't think I do enough hardcore work to need 6GB of memory.

What are your thoughts on this? Is Nvidia releasing any new cards this year that may be worth waiting for?

legendkiller
April 6, 2012 4:11:14 PM

My thought is to get a GTX 680 in dual/triple/quad SLI rather than a Quadro 6000, since the Quadros are older cards based on Fermi, not Kepler...
CaedenV
April 6, 2012 4:20:14 PM

The Quadro 4000 is likely the best match for what you are describing (granted I have not had any personal experience with one myself).

If you are looking for a cheaper mainstream card that would work, I would stick with the 570/580. The 680 (for whatever reason) is getting bad reviews for productivity work (though it is absolutely mind-blowing for gaming). I have not seen any specific Photoshop/Premiere benchmarks yet, only synthetic ones, but in every single one the 570 beat the 680 to a bloody pulp when it comes to rendering work. Since you have time, more 'real world' benchmarks may appear before you make a purchase, showing whether the synthetic results hold up in real use.

Still, as 80% of your workload is professional work, the Quadro is the direction you should be looking in. With the release of the 600 series there will likely be new Quadro cards released 'soon', but I have no idea when, what the price will be, or how it would affect the cost of the current cards if the new ones land in the $2000+ range.
CaedenV
April 6, 2012 4:21:28 PM

legendkiller said:
My thought is to get a GTX 680 in dual/triple/quad SLI rather than a Quadro 6000, since the Quadros are older cards based on Fermi, not Kepler...

Most professional software does not take advantage of SLI, so the best single card you can afford is the best option.
legendkiller
April 6, 2012 4:28:04 PM

realmisr said:
^Really? I've read that the GTX680 is really crippled for OpenGL and OpenCL...

http://www.tomshardware.com/reviews/geforce-gtx-680-rev...

If you won't get the $500 card, then get a GTX 580 3GB in quad SLI for $2000:
http://www.compsource.com/ttechnote.asp?part_no=03GP315...
That's the best and cheapest one I could find, plus you can OC it further... You can get the 1.5GB version for $80 cheaper, but I'll warn you: the higher the monitor's resolution, the more VRAM it's going to take, and the more lag you'll get IF things don't fit in the 1.5GB...
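As a rough illustration of that resolution-to-VRAM point, here is a back-of-the-envelope sketch (the buffer count and bytes-per-pixel here are illustrative assumptions; real games also store textures, which usually dominate):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Rough VRAM used just by the frame/depth buffers at a given resolution."""
    return width * height * bytes_per_pixel * buffers / 1024 ** 2

# Going from 1920x1080 to 2560x1600 roughly doubles the buffer footprint,
# and that's before textures, which scale with quality settings.
print(int(framebuffer_mb(1920, 1080)))  # 23 (MB)
print(int(framebuffer_mb(2560, 1600)))  # 46 (MB)
```

The buffers themselves are small; the point is that everything scales with pixel count, and once the working set spills past the card's VRAM, performance falls off a cliff.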
Best solution

CaedenV
April 6, 2012 4:28:23 PM

Why is it that the Quadro series is $750 for 2GB/256 cores, then $1750 for 2.5GB/352 cores, and then jumps to $4000 for 6GB/448 cores? Seems like there is room in there for some cards in between. A 4GB/352-core model would be a great match for a lot of professionals, I think.
Hopefully the new AMD cards will help drop the prices a bit over time, now that they finally have some products that are better than the trash they have been putting out in the past.
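The pricing gaps can be made concrete with a quick dollars-per-CUDA-core comparison (using the thread's quoted prices and core counts for the Quadro 4000/5000/6000, not current figures):

```python
# Price (USD) and CUDA-core count per tier, as quoted above.
tiers = {
    "Quadro 4000": (750, 256),
    "Quadro 5000": (1750, 352),
    "Quadro 6000": (4000, 448),
}

for name, (price_usd, cores) in tiers.items():
    # Cost efficiency collapses as you go up the product stack:
    # roughly $2.93, $4.97, and $8.93 per core respectively.
    print(f"{name}: ${price_usd / cores:.2f} per core")
```

Each step up the stack roughly doubles the cost per core, which is why the missing mid-range model stands out.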
realmisr
April 6, 2012 4:29:06 PM

Thank you CaedenV, your input is very true and valid. I think I may just wait for the new cards to come out (Kepler?).
realmisr
April 6, 2012 4:35:53 PM

legendkiller said:
If you won't get the $500 card, then get a GTX 580 3GB in quad SLI for $2000:
http://www.compsource.com/ttechnote.asp?part_no=03GP315...
That's the best and cheapest one I could find, plus you can OC it further... You can get the 1.5GB version for $80 cheaper, but I'll warn you: the higher the monitor's resolution, the more VRAM it's going to take, and the more lag you'll get IF things don't fit in the 1.5GB...


That's great and all, but again, like CaedenV said... Maya and most other professional CAD software don't use SLI; or rather, you can try, but it causes a lot of problems and isn't recommended for the software.
realmisr
April 6, 2012 4:36:55 PM

CaedenV said:
Why is it that the Quadro series is $750 for 2GB/256 cores, then $1750 for 2.5GB/352 cores, and then jumps to $4000 for 6GB/448 cores? Seems like there is room in there for some cards in between. A 4GB/352-core model would be a great match for a lot of professionals, I think.
Hopefully the new AMD cards will help drop the prices a bit over time, now that they finally have some products that are better than the trash they have been putting out in the past.


I was just thinking about this... the leap between the 4000 and the 5000 is the most ridiculous. The 6000 is also ridiculous in terms of price, but if it were priced like a 5000, I'd hop on it REAL fast :p
CaedenV
April 6, 2012 4:44:05 PM

realmisr said:
Thank you Caedenv, your input is very true and valid. I think I may just wait for the new cards to come out (kepler?)

Yes, they will be Kepler-based; I think GK114 is the base chip, but I am not sure on that. If they see a performance gain like the jump from the 580 to the 680, then we are talking serious performance increases over the current crop of Fermi-based Quadros. But I have no idea if it will take the $4000 spot, which would force the rest of the card prices down, or if it will simply be more expensive. The 'problem' with the professional market is that buyers care more about downtime from failures and rendering time than about cost. The market for the Quadro 6000 is mostly Hollywood and big CAD houses that will make back the hardware cost in a week, so they don't really care whether the hardware costs $4000 or $10000 so long as it lets them take on more projects every month.
CaedenV
April 6, 2012 4:48:21 PM

realmisr said:
That's great and all, but again like Caedenv said...Maya and most other professional CAD software do not use SLI, or shall I say, you can try but it causes a lot of problems and is not recommended by the software.

CS6 may bring a few titles that start using SLI a little more effectively, but I think we are still several years out from universal support that scales decently.

Stability issues are not something you want to add to your system if it can be avoided.
realmisr
April 6, 2012 4:49:02 PM

CaedenV said:
Yes, they will be Kepler-based; I think GK114 is the base chip, but I am not sure on that. If they see a performance gain like the jump from the 580 to the 680, then we are talking serious performance increases over the current crop of Fermi-based Quadros. But I have no idea if it will take the $4000 spot, which would force the rest of the card prices down, or if it will simply be more expensive. The 'problem' with the professional market is that buyers care more about downtime from failures and rendering time than about cost. The market for the Quadro 6000 is mostly Hollywood and big CAD houses that will make back the hardware cost in a week, so they don't really care whether the hardware costs $4000 or $10000 so long as it lets them take on more projects every month.


Yeah, that's true. Which is why I can't justify paying that much... I think the most I'd pay for a pro card is $1.2-1.5k. I may need to grab one off eBay when it happens :na: CAD is 40% of my work, but what I need to create/render doesn't involve insanely complex scenes with dynamics and animations.
CaedenV
April 6, 2012 4:57:22 PM

realmisr said:
I was just thinking about this... the leap between the 4000 and the 5000 is the most ridiculous. The 6000 is also ridiculous in terms of price, but if it were priced like a 5000, I'd hop on it REAL fast :p

Seriously, if the 5000 had 4GB of RAM and cost ~$2000, it could be worth it. But if you need that much GPU power, you likely need more than 2.5GB of RAM, and you'll hit bottlenecks moving data from system RAM to VRAM. Even 3GB would make the price increase more understandable, but for $30 of RAM and a 1/3 speed boost it is hard to justify a 2.3x price jump. As it sits, the only real options are the 4000 and the 6000.


Out of curiosity, what does the rest of your rig look like? Are you running LGA2011? Dual/quad LGA2011 Xeons, or the previous-gen Xeons? Or are you doing something on the AMD side (which I am woefully ignorant of at the moment)?
CaedenV
April 6, 2012 5:02:19 PM

realmisr said:
Yea that's true. Which is why I can't justify paying that much...I think the most I'd pay for a pro card would be 1.2-1.5k. May need to grab one off Ebay when it happens :na:  I do have CAD in 40% of my work, but the work I need to create/render doesn't consist of insanely complex scenes with dynamics and animations.

lol, eBay is a scary place sometimes, but other times you can find some great steals. When I was getting back into video editing ~3 years ago, I found an unopened Adobe CS1 Video Production suite for ~$150. It was dirt cheap, it let me get my feet wet again, and it let me use the upgrade discount for CS5.5 when I needed better HD support after upgrading my camera. Granted, there are plenty of production houses going out of business that will sell good equipment for cheap :) if you can find them.
realmisr
April 6, 2012 5:19:56 PM

CaedenV said:
Seriously, if the 5000 had 4GB of RAM and cost ~$2000, it could be worth it. But if you need that much GPU power, you likely need more than 2.5GB of RAM, and you'll hit bottlenecks moving data from system RAM to VRAM. Even 3GB would make the price increase more understandable, but for $30 of RAM and a 1/3 speed boost it is hard to justify a 2.3x price jump. As it sits, the only real options are the 4000 and the 6000.

Out of curiosity, what does the rest of your rig look like? Are you running LGA2011? Dual/quad LGA2011 Xeons, or the previous-gen Xeons? Or are you doing something on the AMD side (which I am woefully ignorant of at the moment)?


I'm still open to suggestions as far as a new rig goes; maybe you could give me some input on that. I want to run LGA2011, as it seems the most promising, and maybe even dual LGA2011 with that new EVGA X79 motherboard... but my concern is cooling 2 CPUs. I want to do liquid cooling, but it was already a pain in the ass to set that up on my current single-CPU rig from 4 years ago. I'd also have pretty much no option to swap in new gear without pulling all the watercooling out and reinstalling it... which I'd never want to go through... though maybe that's easier now with the new towers that have the easy CPU-swap cutout?

I think my main goal is just to have a powerful processor (or two) with loads of cores for rendering, and I want to put 64GB of RAM in there as well. And of course, that Quadro when it comes out.
realmisr
April 6, 2012 5:24:30 PM

CaedenV said:
lol, eBay is a scary place sometimes, but other times you can find some great steals. When I was getting back into video editing ~3 years ago, I found an unopened Adobe CS1 Video Production suite for ~$150. It was dirt cheap, it let me get my feet wet again, and it let me use the upgrade discount for CS5.5 when I needed better HD support after upgrading my camera. Granted, there are plenty of production houses going out of business that will sell good equipment for cheap :) if you can find them.


Yeah, you really have to know what you're doing with eBay... it's hit or miss, honestly.

janiashvili
April 6, 2012 5:24:53 PM

Buy a Quadro 4000 (make this the primary) and a GeForce 460 (SLI them or whatever), because if you want GPU rendering, this is the solution.

Quadros are really good at OpenGL (Maya uses OpenGL, but recheck whether it can use Direct3D; 3ds Max, for example, uses Direct3D, so there's very little point in buying a workstation card in that scenario), while GeForces are much cheaper and good enough for Photoshop (thanks to their CUDA cores). But as far as I know, the CS6 suite will use OpenCL, so you could buy an AMD card if you want.

But I'm sure a Quadro 4000 will be enough for you.

Then there's a huge plus on the driver side: you'll forget the Ctrl + save combination (or whatever Maya uses for saving).

You can actually ask the same question on area.autodesk.com (go to the forums and check the Maya section).
realmisr
April 6, 2012 5:28:32 PM

^How exactly does that work? Excuse my ignorance... I've never owned 2 graphics cards in 1 rig. If I connect the Quadro to my main monitor, how would I use the GeForce, say, for gaming?

In all honesty, I think even a Quadro 4000 is plenty for Diablo 3, which is probably the most graphics-intensive game I plan to play (for now).
CaedenV
April 6, 2012 5:45:14 PM

janiashvili said:
Buy a Quadro 4000 (make this the primary) and a GeForce 460 (SLI them or whatever), because if you want GPU rendering, this is the solution.

Quadros are really good at OpenGL (Maya uses OpenGL, but recheck whether it can use Direct3D; 3ds Max, for example, uses Direct3D, so there's very little point in buying a workstation card in that scenario), while GeForces are much cheaper and good enough for Photoshop (thanks to their CUDA cores). But as far as I know, the CS6 suite will use OpenCL, so you could buy an AMD card if you want.

But I'm sure a Quadro 4000 will be enough for you.

Then there's a huge plus on the driver side: you'll forget the Ctrl + save combination (or whatever Maya uses for saving).

You can actually ask the same question on area.autodesk.com (go to the forums and check the Maya section).

I didn't realize the new Adobe suite supported AMD cards... that opens up a lot of possibilities, as the new 7000 series is pretty sweet for rendering work (and I say that as a stout nVidia fanboy... the new AMD cards really rock if you have the software support!). Definitely something to look into if the bulk of the rest of your software will work with it.
th3parasit3
April 6, 2012 5:56:08 PM

realmisr said:
Hey guys,

I am a professional freelance digital artist (work from home) and I mainly use Photoshop, Maya/Zbrush/Realflow, and sometimes gaming.

That being said, I'd say my priorities are 40% photoshop, 40% CAD and 20% Gaming. The most intensive gaming I'd be doing is playing Diablo 3, so it's really nothing too crazy.



Two machines.

Machine 1: Professional Build
Spec out your professional workstation... pour your hard-earned cash into it, because there is a return on investment. A Quadro 5000/4000 seems fine; consider a V7900 and CrossFire down the road if you need to. If you build your own, look at Xeon processors and motherboards... load that baby up with ECC RAM. ECC will save you time with fewer crashes when you are running long builds or renders.

You may also choose to go the consumer i7 route... LGA2011 boards often support up to 8 DIMMs, and you can OC the heck out of the RAM and the processor. You'll probably get better bang for your buck going this route... but no ECC RAM.

Pick your professional workstation build based on your budget constraints.

Machine 2: Desktop/Gaming Build
DIABLO 3! Take your old machine and drop a new graphics card into it: a GTX460/HD6850 on the low end, or pick up a GTX570 a few weeks after the GTX670Ti is released. This should be the least expensive machine.

Two machines... keep your professional workstation away from your gaming one... as far as possible.
janiashvili
April 6, 2012 5:56:25 PM

realmisr said:
^How exactly does that work? Excuse my ignorance..I have never owned 2 graphic cards in 1 rig. If I connect a quadro to my main monitor, how would I use the Geforce, say for gaming?

In all honesty, I think even a quadro 4000 is plenty for diablo 3, which is probably the most graphic intensive game i plan to play (for now)

I guess you just SLI them, use the Quadro for the viewport and such, and since games use SLI it will work as an SLI'd pair.

But I'm not sure exactly how this works.
CaedenV
April 6, 2012 6:14:01 PM

realmisr said:
I'm still open to suggestions as far as a new rig goes; maybe you could give me some input on that. I want to run LGA2011, as it seems the most promising, and maybe even dual LGA2011 with that new EVGA X79 motherboard... but my concern is cooling 2 CPUs. I want to do liquid cooling, but it was already a pain in the ass to set that up on my current single-CPU rig from 4 years ago. I'd also have pretty much no option to swap in new gear without pulling all the watercooling out and reinstalling it... which I'd never want to go through... though maybe that's easier now with the new towers that have the easy CPU-swap cutout?

I think my main goal is just to have a powerful processor (or two) with loads of cores for rendering, and I want to put 64GB of RAM in there as well. And of course, that Quadro when it comes out.

As this is a pretty serious rig, I would want at least 2 CPUs. Dual LGA2011 setups are going to be quite pricey due to the cost of the new Xeon CPUs. Personally I would go for a dual LGA1366 Xeon 5600-series setup from a cost/performance perspective... but then again I am a small-time/part-time media person, so I find it hard to justify paying more than $700 for a single CPU (and honestly the i7 2600 I got for my recent upgrade has plenty of power for the level of work I do). If money is not an issue, the Xeon LGA2011 platform would be the better option.

Also, looking at benchmarks, the AMD Bulldozer platform seems to keep up quite well in productivity work, and may be worth looking into in spite of its issues for general use and gaming.

As for water cooling, I don't think it would be that much harder to add a 2nd CPU to a loop, as it is just one more length of hose between the CPUs (rad, pump, cpu1, cpu2, return/reservoir). Just be sure to get a radiator large enough to dissipate the heat of 2 processors (which should not be hard, as the current Intel stuff runs pretty cool to begin with). Keep in mind also that a lot of the newer oversized air coolers can run quite silent while keeping things acceptably frosty, without the cost of water cooling (granted, if you already have decent water cooling parts, then cost really isn't an issue as you can reuse most of them).

Just remember a few things: Xeon LGA2011 is not the same as SB-E LGA2011. SB-E works with X79 chipsets, Xeons work with the C600-series chipsets, and the two are not compatible with each other. In other words, a dual X79 motherboard will never go into production, because Intel killed it in favor of making dual boards Xeon-only. http://www.pcper.com/news/General-Tech/Took-you-long-en...
CaedenV
April 6, 2012 6:19:31 PM

janiashvili said:
I guess you just SLI them, use the Quadro for the viewport and such, and since games use SLI it will work as an SLI'd pair.

But I'm not sure exactly how this works.

Yeah, sounds fishy to me as well. I know the 4000 and the 460 are similar hardware, but will SLI let them run paired like that? Or can you select in the driver which card you are using, and use the SLI connector as a pass-through for a single gaming GPU (thus bypassing the 4000 for gaming)? If that were the case, I would put in something faster/cooler than a 460 for the gaming card.
CaedenV
April 6, 2012 6:24:08 PM

th3parasit3 said:
Two machines.

Machine 1: Professional Build
Spec out your professional workstation... pour your hard-earned cash into it, because there is a return on investment. A Quadro 5000/4000 seems fine; consider a V7900 and CrossFire down the road if you need to. If you build your own, look at Xeon processors and motherboards... load that baby up with ECC RAM. ECC will save you time with fewer crashes when you are running long builds or renders.

You may also choose to go the consumer i7 route... LGA2011 boards often support up to 8 DIMMs, and you can OC the heck out of the RAM and the processor. You'll probably get better bang for your buck going this route... but no ECC RAM.

Pick your professional workstation build based on your budget constraints.

Machine 2: Desktop/Gaming Build
DIABLO 3! Take your old machine and drop a new graphics card into it: a GTX460/HD6850 on the low end, or pick up a GTX570 a few weeks after the GTX670Ti is released. This should be the least expensive machine.

Two machines... keep your professional workstation away from your gaming one... as far as possible.

+1

With Win7, games do not kill/interfere with pro software like they used to, but they could still produce undesired effects (like gaming when you should be working ;) I know that's my problem sometimes). And if you already have a decent rig, just turn that one into your game rig when you are ready to upgrade. The GTX570 has plenty of horsepower for games, and prices are already beginning to fall on them... used would be even cheaper.
janiashvili
April 6, 2012 6:25:03 PM

CaedenV said:
Yeah, sounds fishy to me as well. I know the 4000 and the 460 are similar hardware, but will SLI let them run paired like that? Or can you select in the driver which card you are using, and use the SLI connector as a pass-through for a single gaming GPU (thus bypassing the 4000 for gaming)? If that were the case, I would put in something faster/cooler than a 460 for the gaming card.


I guess it's just SLI'ing them. You can find some scattered information on the web.

But I don't think this concept is meant for gaming; I think it's just to get the benefit of the Quadro and speed it up with the GeForce cards.

realmisr
April 6, 2012 9:26:47 PM

CaedenV said:
As this is a pretty serious rig, I would want at least 2 CPUs. Dual LGA2011 setups are going to be quite pricey due to the cost of the new Xeon CPUs. Personally I would go for a dual LGA1366 Xeon 5600-series setup from a cost/performance perspective... but then again I am a small-time/part-time media person, so I find it hard to justify paying more than $700 for a single CPU (and honestly the i7 2600 I got for my recent upgrade has plenty of power for the level of work I do). If money is not an issue, the Xeon LGA2011 platform would be the better option.

Also, looking at benchmarks, the AMD Bulldozer platform seems to keep up quite well in productivity work, and may be worth looking into in spite of its issues for general use and gaming.

As for water cooling, I don't think it would be that much harder to add a 2nd CPU to a loop, as it is just one more length of hose between the CPUs (rad, pump, cpu1, cpu2, return/reservoir). Just be sure to get a radiator large enough to dissipate the heat of 2 processors (which should not be hard, as the current Intel stuff runs pretty cool to begin with). Keep in mind also that a lot of the newer oversized air coolers can run quite silent while keeping things acceptably frosty, without the cost of water cooling (granted, if you already have decent water cooling parts, then cost really isn't an issue as you can reuse most of them).

Just remember a few things: Xeon LGA2011 is not the same as SB-E LGA2011. SB-E works with X79 chipsets, Xeons work with the C600-series chipsets, and the two are not compatible with each other. In other words, a dual X79 motherboard will never go into production, because Intel killed it in favor of making dual boards Xeon-only. http://www.pcper.com/news/General-Tech/Took-you-long-en...


Thanks for the input, that's very informative!

I'm trying to work out whether those dual Xeons would be better than an i7-3960X. And I'm not sure if it's all of them, but the new Xeons seem to be locked, so we can't OC them?
legendkiller
April 6, 2012 11:56:03 PM

realmisr said:
Thanks for the input, that's very informative!

I'm trying to work out whether those dual Xeons would be better than an i7-3960X. And I'm not sure if it's all of them, but the new Xeons seem to be locked, so we can't OC them?

Don't buy dual Xeons yet; I would rather wait for IB-E (Ivy Bridge-E), which will take full advantage of 22nm and is said to beat the current SB-E at everything... IB-E should be one of the best CPUs out there when it launches... I recommend buying a 3820 now and upgrading later...
realmisr
April 7, 2012 12:05:36 AM

legendkiller said:
Don't buy dual Xeons yet; I would rather wait for IB-E (Ivy Bridge-E), which will take full advantage of 22nm and is said to beat the current SB-E at everything... IB-E should be one of the best CPUs out there when it launches... I recommend buying a 3820 now and upgrading later...


I see. They say it won't be released until 2013, though? You're right, I may just have to upgrade later :??:

How hard is it to upgrade a CPU once it's set up for watercooling? Would the towers with the CPU cutout help? I don't even want to imagine changing my current setup's CPU, since it'd mean taking all of the watercooling apart... not something I want a migraine from.
legendkiller
April 7, 2012 1:06:36 AM

realmisr said:
I see. They say it won't be released until 2013, though? You're right, I may just have to upgrade later :??:

How hard is it to upgrade a CPU once it's set up for watercooling? Would the towers with the CPU cutout help? I don't even want to imagine changing my current setup's CPU, since it'd mean taking all of the watercooling apart... not something I want a migraine from.

Taking a waterblock off a CPU is the same as taking off a heatsink; the only difference is that it could cause a water leak, which is unlikely if you've had it running for a while... Just move the waterblock aside, then put it back on when you replace the CPU...
realmisr
April 7, 2012 2:32:46 AM

^hmm...you're right.
legendkiller
April 7, 2012 4:00:44 AM

BTW realmisr, if IB-E follows SB and IB by 1 year, it will be released during Q4 2012; I'm not totally sure it's Q4 2012 (probably too early to release them), but Q2 2013 at the latest...
realmisr
April 7, 2012 4:14:17 AM

Do you reckon it would beat dual Xeons (E series) at what I need to do (Photoshop, Maya, 3D rendering, some gaming)?

Regardless, IB-E couldn't be used in a dual-CPU setup, right?
legendkiller
April 7, 2012 4:55:51 AM

realmisr said:
Do you reckon it would beat dual Xeons (E series) at what I need to do (Photoshop, Maya, 3D rendering, some gaming)?

Regardless, IB-E couldn't be used in a dual-CPU setup, right?

There are some recently released Xeon CPUs (on the C600 chipset) you could buy now if you want, but I recommend waiting for the 22nm Xeons, which are a year or two out (2014)... or just get the 22nm IB-E later...
CaedenV
April 8, 2012 1:33:38 AM

All this talk of waiting for IB-E is well and good, but it will still be a single CPU, and it still will not be released until November at the earliest. The current- or last-gen Xeons in dual is the best way to go for the most horsepower for production work. And dual/quad Opteron setups are nothing to scoff at for these types of workloads either, and may be a bit cheaper.
legendkiller
April 8, 2012 2:05:03 AM

CaedenV said:
All this talk of waiting for IB-E is well and good, but it will still be a single CPU, and it still will not be released until November at the earliest. The current- or last-gen Xeons in dual is the best way to go for the most horsepower for production work. And dual/quad Opteron setups are nothing to scoff at for these types of workloads either, and may be a bit cheaper.

He's not waiting; he'll be upgrading to it... IB-E is said to take full advantage of 22nm, which means it'll probably come with 8/10 cores and 16/20 threads, or 6 cores/12 threads clocked high (or able to OC very high)... There's a real chance IB-E will be a 10-core/20-thread part, since it uses less power, runs cooler, and there's die space for it... The reason IB-E could be 10 cores is that SB-E was supposed to be 8 cores, but they disabled 2 because of power consumption; the clocks would have been low too, probably around 2GHz per core at 130W...
realmisr
April 8, 2012 2:14:51 AM

CaedenV said:
All this talk of waiting for IB-E is well and good, but it will still be a single CPU, and it still will not be released until November at the earliest. The current- or last-gen Xeons in dual is the best way to go for the most horsepower for production work. And dual/quad Opteron setups are nothing to scoff at for these types of workloads either, and may be a bit cheaper.


The thing I'm worried about is that the new dual Xeons are locked and can't be OC'd. Would a well-OC'd Ivy Bridge give me more power?
CaedenV
April 8, 2012 5:59:00 PM

realmisr said:
The thing I'm worried about is that the new dual Xeons are locked and can't be OC'd. Would a well-OC'd Ivy Bridge give me more power?

It depends on the type of workload you are giving it, and how the software is written.

For example:
- iTunes and much other cheap/free media-transcoding software is single-threaded, which means you give it 2 cores (one for the program and another for Windows and background processes) and crank the clock rate as high as possible; it is purely a GHz race to the finish.
- Most games will only use 4 cores. Anything beyond that (other than running Windows and background processes) is entirely wasted, which is why most gamers choose the i5 over the i7: the HT cores simply do not add anything for them.
- Adobe Premiere and other big production software is extremely parallel in nature and will eat as many cores as you give it. For these programs, clock speed matters less (though faster is always nicer) than the number of cores you can feed it. So in this case, a slower but more parallel CPU (or CPUs) is often the most effective route.

If you are worried about OCing, go for the previous-gen Xeons, which I believe are OCable (double-check me on that; I could be wrong), or go to the AMD side, where you can OC without question. But I am fairly sure multiple CPUs will beat a single one (though a single CPU is likely more than adequate for 99% of users out there).

Lastly, there is no word on how many cores IB-E will have. My bet is that it will match SB-E with 4 to 6 cores with HT, and that we will not see 'many-core' CPUs until Haswell/Broadwell show up in another 2 years. But that is just conjecture; there is no way to know for sure. I am almost positive, though, that SB-E and IB-E will not be compatible, and you will need to upgrade the mobo with the new CPU if you go that route.
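The cores-vs-clock trade-off described above can be sketched with Amdahl's law (the parallel fractions here are illustrative assumptions, not measurements of iTunes or Premiere):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's law: overall speedup when only a fraction of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A mostly serial job (10% parallel) barely benefits from extra cores,
# while a highly parallel render (95% parallel) keeps scaling.
for p in (0.10, 0.95):
    speedups = [round(amdahl_speedup(p, n), 2) for n in (2, 4, 8, 16)]
    print(f"{int(p * 100)}% parallel: {speedups}")
```

This is why a single-threaded transcoder wants the fastest clocks money can buy, while a renderer keeps rewarding extra sockets and cores.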
realmisr
April 9, 2012 2:15:03 AM

CaedenV said:
It depends on the type of workload you are giving it, and how the software is written.

For example:
- iTunes and much other cheap/free media-transcoding software is single-threaded, which means you give it 2 cores (one for the program and another for Windows and background processes) and crank the clock rate as high as possible; it is purely a GHz race to the finish.
- Most games will only use 4 cores. Anything beyond that (other than running Windows and background processes) is entirely wasted, which is why most gamers choose the i5 over the i7: the HT cores simply do not add anything for them.
- Adobe Premiere and other big production software is extremely parallel in nature and will eat as many cores as you give it. For these programs, clock speed matters less (though faster is always nicer) than the number of cores you can feed it. So in this case, a slower but more parallel CPU (or CPUs) is often the most effective route.

If you are worried about OCing, go for the previous-gen Xeons, which I believe are OCable (double-check me on that; I could be wrong), or go to the AMD side, where you can OC without question. But I am fairly sure multiple CPUs will beat a single one (though a single CPU is likely more than adequate for 99% of users out there).

Lastly, there is no word on how many cores IB-E will have. My bet is that it will match SB-E with 4 to 6 cores with HT, and that we will not see 'many-core' CPUs until Haswell/Broadwell show up in another 2 years. But that is just conjecture; there is no way to know for sure. I am almost positive, though, that SB-E and IB-E will not be compatible, and you will need to upgrade the mobo with the new CPU if you go that route.


I mean I'll only be using Photoshop, Maya, Zbrush, and Realflow most of the time. I know that the actual rendering of frames strictly uses the CPU, though I don't know whether the number of cores or the speed of the cores matters more. If it's speed, then there is no need for me to get so many cores if I could get as good or better results with one CPU. At most I have Photoshop + Maya open at once... rarely do I have more than two programs open simultaneously in my workflow.
April 9, 2012 5:47:01 AM

CaedenV said:
It depends on the type of workload you are giving it, and how the software is written.

[...]

There is now :D 
http://vr-zone.com/articles/ivy-bridge-ep-and-ex-coming...
I thought I already explained how and why it could be 8/10 cores.

EDIT: When the 14nm CPUs arrive (in 2014) they should reach higher clock speeds at the same wattage... How many cores the 14nm CPUs will have is still a guess, 8/10 cores, and they could probably also be OCed to 5GHz at around 200W...
April 9, 2012 5:59:02 AM

realmisr said:
I mean I'll only be using Photoshop, Maya, Zbrush, Realflow for most of the time. [...] I don't know if the number of cores or the speed of the cores matter more.

It's the cores that matter, but it also depends on the speed: a 6-core at 5GHz will beat a 10-core at 1.6GHz... A dual (2) CPU socket setup is better, BUT if it's barely 12 cores at 2GHz and can't even OC, then a 10-core 22nm chip OCed to 4GHz at 250 watts can probably do a lot better...
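That trade-off is easy to sanity-check with rough math: for a perfectly parallel render job, aggregate throughput is roughly cores x clock. Real scaling is worse, so treat these numbers as an upper bound, not a benchmark:

```python
def ghz_cores(cores, ghz):
    # Idealized aggregate throughput ("GHz-cores") for a fully
    # parallel workload; real-world scaling will be lower.
    return cores * ghz

print(ghz_cores(10, 1.6))  # 16.0
print(ghz_cores(6, 5.0))   # 30.0 -> the faster hex-core wins
print(ghz_cores(12, 2.0))  # 24.0 for the dual-socket that can't OC
print(ghz_cores(10, 4.0))  # 40.0 for the OCed 10-core
```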
April 9, 2012 3:42:30 PM

AMD Dual Socket:
mobo $430, http://www.newegg.com/Product/Product.aspx?Item=N82E168...
6 SATA2 connectors, a mess of SAS connectors, quad-channel memory, dual CPU

CPU $540ea, $1080 total http://www.newegg.com/Product/Product.aspx?Item=N82E168...
16 cores each, 32 cores total

RAM: $140 per 16GB (4x4GB kit), max of 4 kits. You can go with higher-density RAM, but I think the 4GB modules offer the best capacity/$ at the moment, and I don't think you need more than 64GB for this build. I am going to put 32GB into this build's cost http://www.newegg.com/Product/Product.aspx?Item=N82E168...

PSU: ~$120, ~600-800W depending on the final build, ATX, 80+ Silver or better
May need adapters for some of the 12v/8pin CPU connectors on the mobo

Core build: $1910
Obviously this still needs a case, drives, GPU, etc. But 32 cores and 32GB of RAM for under 2 grand is nothing to sneeze at. Stock this runs at 2.1GHz, but (unless I am severely mistaken, which is possible as I am very sick at the moment and not thinking too clearly) it should OC pretty well. I like Intel better, but a comparable Intel build would be much more expensive (nearing $4K).
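For anyone sanity-checking the total, the quoted prices do add up (the labels below are just shorthand for the parts listed above):

```python
# Core-build tally using the prices quoted in the post.
parts = {
    "dual-socket mobo": 430,
    "2x 16-core Opteron": 2 * 540,   # $540 each
    "32GB RAM (2x 16GB kits)": 2 * 140,
    "PSU": 120,
}
total = sum(parts.values())
print(total)  # 1910
```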
April 9, 2012 3:44:59 PM

Oh, and I am sure you can find better prices and a wider variety of server-grade parts somewhere other than the Egg; this was just posted as an idea.
April 9, 2012 4:32:15 PM

^That's a great idea for a build, thank you. 32 cores and a max of 64GB of RAM is definitely very enticing.

I am also much more of an Intel guy, but this is definitely more cost-efficient. I just wonder whether 32 cores is overkill for Maya rendering... or whether they would actually all be used.
April 10, 2012 7:59:20 PM

AMD does suffer from a lack of performance on a great many workloads, but it does not fall down so badly on production workloads like 3D rendering. On some benchmarks Intel wins, and on others AMD takes the cake for this type of use; it is not the all-out slaughter AMD suffers on other workloads such as gaming and lightly threaded applications.
As a comparison of a quad-core-with-HT i7 vs an 8-core AMD:
i7 2600 vs FX 8150 http://www.anandtech.com/bench/Product/434?vs=287
i7 3820 vs FX 8150 http://www.anandtech.com/bench/Product/434?vs=523
(*comparing the bigger 2011 chips would be unfair, as they have more threads)
For Photoshop, 3ds Max (I know it is not Maya), Blender, and a few encoding programs, the AMD chip comes out on top on a thread-for-thread and clock-for-clock basis. I am not exactly sure how this translates to the server packages, but I am sure they cannot be far off on a similar basis.
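If you want to do that thread-for-thread, clock-for-clock comparison yourself from any benchmark table, the normalization is simple. The scores below are placeholder numbers purely to show the mechanics, not real results from those links:

```python
def per_thread_per_ghz(score, threads, ghz):
    """Normalize a raw benchmark score to a per-thread, per-GHz basis
    so chips with different thread counts and clocks can be compared."""
    return score / (threads * ghz)

# Placeholder scores (hypothetical), just to show the comparison:
print(round(per_thread_per_ghz(100.0, 8, 3.4), 3))  # i7-style: 8 threads @ 3.4GHz
print(round(per_thread_per_ghz(100.0, 8, 3.6), 3))  # FX-style: 8 cores @ 3.6GHz
```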

On top of that, we are talking a $1000 difference in cost. Obviously, if you have the money, Intel is the all-around better product, does the same work at nearly half the power draw, and absolutely kills AMD on some workloads. But for your use specifically I think the AMD would still be very cost-effective. If you have money to burn, though, you would be happier with Intel.


... man it is weird suggesting AMD lol I am not exactly their biggest fan or supporter.
April 10, 2012 8:01:13 PM

Anyone know where to find Opteron vs Xeon benchmarks? It's annoying that there is such a lack of specific information on the expensive stuff, and a plethora of information on the cheaper consumer-level gear.
April 11, 2012 1:59:38 AM

Intel performs better... you get what you pay for... 8 cores and 16 threads each, so 32 logical cores with two Xeons together...
April 17, 2012 11:55:17 PM

Best answer selected by realmisr.