Render Server (Opteron or Xeon)


Myflag

Hello community members,

I have several questions for anybody here who would kindly lend their assistance, opinions, and (hopefully) some factual evidence toward solving my dilemmas.

I am currently in need of a dedicated render server to free up my workstation for further work. My workstation is currently an Intel 2600K at 3.4GHz with 16GB of RAM. While this computer has worked and continues to work well, it is limited when rendering more... intensive scenes and animations. It is not uncommon for a single frame to take thirteen hours to render (advanced lighting, complex transparencies, etc...), and animations can take thirty minutes or more per frame even without GI. Needless to say, I am in need of a considerably more powerful, dedicated rendering machine.

From Newegg, I am currently considering building myself either a dual Xeon or dual Opteron server for rendering. The only problem I am having is deciding which platform to invest in for a dedicated render server. I am currently debating between the following:

1. 2x Opteron 6272 2.1GHz 16-Core

2. 2x Xeon E5645 2.4GHz 6-Core

Now, my primary question to the kind people on these forums: which setup would you recommend for the greatest rendering performance with Cinema 4D? If you cannot in good conscience recommend either of the above platforms, then what would you recommend that is comparably priced?

Any and all assistance will be greatly appreciated.

Thank you,
Myflag
 
Opteron is cheap and offers the most cores for the buck, but personally I don't think there is a real mid-range Xeon platform that is affordable. The Opteron you linked is Bulldozer based and is weaker than what it replaced, which is why it is cheaper than the Magny-Cours Opterons.

Intel = Performance per thread and per clock
AMD = Lots of cores/threads

I don't know the total floating-point performance at those clocks for the 16-core option, but if the total is greater than that of the Xeons you linked, you are better off going Opteron, unless you can manage higher-clocked Xeons.
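Very rough back-of-the-envelope of what I mean, in Python; the FLOPs-per-cycle figures are just assumptions about theoretical peak throughput, and real rendering will come in well below these numbers:

Code:
# Rough peak-FLOPS comparison for the two dual-socket options.
# The flops_per_cycle values are assumptions, not measured data, and
# actual rendering throughput will be well below these theoretical peaks.
options = [
    # (name, sockets, cores per CPU, clock in GHz, assumed FLOPs per cycle per core)
    ("2x Opteron 6272", 2, 16, 2.1, 4),
    ("2x Xeon E5645",   2,  6, 2.4, 4),
]

for name, sockets, cores, ghz, flops_per_cycle in options:
    peak_gflops = sockets * cores * ghz * flops_per_cycle
    print(f"{name}: ~{peak_gflops:.0f} GFLOPS peak (theoretical)")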
 
Guest
I found a benchmark with 2x Intel Xeon X5650 and 2x AMD Opteron 6174 in Cinebench R11.5, which uses Maxon's Cinema 4D engine:
http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/4
And here is a large database of Cinebench scores, though neither of these CPUs is in it:
http://www.cbscores.com/index.php?sort=rend&order=desc

It seems that the Opterons perform better and are less expensive.
Intel Xeon E5645 $557.99
http://www.newegg.com/Product/Product.aspx?Item=N82E16819117256
AMD Opteron 6272 $539.99
http://www.newegg.com/Product/Product.aspx?Item=N82E16819113036
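If you read the multi-threaded R11.5 scores for those two setups off a chart like cbscores, a quick price-per-point calculation settles it; something like this (the scores below are placeholders, swap in the real numbers from the chart):

Code:
# Price per Cinebench R11.5 point for each dual-CPU option.
# The 'score' values are placeholders; substitute the multi-threaded
# results read off the benchmark chart for the actual comparison.
systems = {
    "2x Intel Xeon E5645": {"cpu_price": 2 * 557.99, "score": 10.0},  # placeholder score
    "2x AMD Opteron 6272": {"cpu_price": 2 * 539.99, "score": 10.0},  # placeholder score
}

for name, s in systems.items():
    dollars_per_point = s["cpu_price"] / s["score"]
    print(f"{name}: ${s['cpu_price']:.2f} for CPUs -> ${dollars_per_point:.2f} per Cinebench point")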

 

Myflag

In the past I used a Q6600 in my workstation and overclocked it to 3GHz to avoid paying for the higher-end CPU. That Q6600 worked perfectly for years without any issue whatsoever. My current CPU I left at stock, as there was no need for the extra speed.

Would a similar solution be acceptable with the Xeons? I have read several reviews of overclocking the E5645, and they all achieved 3GHz+ with minimal effort. While I am hesitant to overclock a server platform, even with perfect stability, if I must do it to save thousands then the risk would be worth it.

Any more recommendations would be greatly appreciated.

Thank you

PS: k1114 - The information I have gleaned suggests that the new Intel series is only approximately 15% faster and is, in any case, several months away...

 
Guest
Try thinking about something like this:
EVGA SR2 at 200 Bclk, default voltages
2xXeon E5645 at 3800MHz Turbo on, default voltages, 2xCoolerMaster V6GT
2x24Gb Kit GSkill Ripjaws 9-9-9-24 1600MHz (12x4Gb)
4xZotac GTX480 Quad-SLI, default clocks and voltages
2x WD Caviar Black 2Tb 7200 RPM
Silverstone Strider ST1500W
LianLi V2120X

I found it in a sig on an EVGA thread:

http://www.evga.com/forums/tm.aspx?m=856864&mpage=1
 

Cazalan

What's your budget? Does it need to scale?

It will be hard to beat AMD for the sheer core counts.

You can get a quad CPU AMD board for $800 and fill it out as you need.
4 x $389 - AMD Opteron 6234 Interlagos 2.4GHz x 12-Core

That's 115 GHz of processing power for $2,356. You could do 2 now and 2 later if your needs grow. Drive/memory prices will be comparable for either system.

The better question is does Cinema 4D actually scale to 48+ cores?
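Rendering is close to embarrassingly parallel, but even a small serial fraction chews into the benefit at 48 cores. A quick Amdahl's law estimate (the serial fractions below are guesses to show the sensitivity, not measured Cinema 4D figures):

Code:
# Amdahl's law: speedup = 1 / (serial + (1 - serial) / cores)
# The serial fractions are illustrative guesses, not measured C4D numbers.
def speedup(cores, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for serial in (0.01, 0.05, 0.10):
    print(f"serial fraction {serial:.0%}: "
          f"12 cores -> {speedup(12, serial):.1f}x, "
          f"48 cores -> {speedup(48, serial):.1f}x")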
 

Myflag



My concern with going with a quad-CPU motherboard is that the one I found on Newegg only works with the older 12-core Opterons. Those are slightly slower, so with only two CPUs populated the server would be only marginally faster than an overclocked Ivy Bridge 3000K-series quad core, except that the server is significantly more expensive... which is another one of my concerns. I do not want to inadvertently waste financial resources.

In any case, after doing further research, I found that I would need to purchase a server OS to use a quad-CPU system... an OS with which my experience is limited, and I am unsure of Cinema 4D's capabilities on such an operating system.

I must admit, the idea of that much rendering power in a single machine is quite intriguing.

I know that Cinema 4D can handle at least up to 64 threads.

As it stands, I believe I will go with the Xeons (with a higher clock) unless somebody knows how to make an Opteron give the same rendering performance for less money.

k1114: You mention Sandy Bridge and Westmere... do you think the performance difference will be great enough to actually warrant waiting multiple months? Some of the information I am using comes from this link: http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/4
 
Most sources are saying March, but is 2 months too long if it does happen to be April? I wouldn't suggest it if it wasn't a viable option.

Another option may be a render farm; you could probably get a lot more bang for your buck, but you would then need a bit more time to set it up.
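The setup side is mostly just carving the frame range up between boxes. A rough sketch of the idea (the node names are made up, and in practice the renderer's own network rendering feature would handle the distribution):

Code:
# Split an animation's frame range into contiguous chunks, one per render node.
# Node names are hypothetical; the actual job submission would go through
# whatever network-rendering mechanism the renderer provides.
nodes = ["node-01", "node-02", "node-03", "node-04"]
first_frame, last_frame = 0, 599

total = last_frame - first_frame + 1
chunk = -(-total // len(nodes))  # ceiling division

for i, node in enumerate(nodes):
    start = first_frame + i * chunk
    if start > last_frame:
        break
    end = min(start + chunk - 1, last_frame)
    print(f"{node}: render frames {start}-{end}")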
 

Myflag

As it stands right now, I am using my Vaio Z and my iMac to assist my workstation in rendering animations. One of the primary reasons I want a single render server, instead of a render farm, is to simplify producing images. I do not want to spend time 'piecing' images from a render farm back together. I am also planning to place the server in a back office so I don't have to hear it running 24/7...

Now, if we are confident that the new CPUs will be out in two months, at a comparable price, and that their overclocking potential is significantly greater than the current Xeons', I just may wait. With the current Westmere chips I am seeing perfectly stable speeds of 3.4GHz without any voltage changes, which is something that greatly intrigues me.
 

radnor



While I'm quite sure your Q6600 was a great chip back then, like the 2500K is now, this is a totally different ball game, MP configurations nonetheless. BTW OP, does your rendering software support clustering?
 

Myflag

If by clustering you mean net rendering across multiple computers, then yes. If you glance at my post above yours, you will see that I mention using my Vaio Z and my iMac to assist my workstation in animation rendering, though I did not specifically say I was using clustering. The reason I am hesitant to purchase something along the lines of two or three 2500K/2600K machines and run those instead is that I do a significant amount of single-frame image work in addition to my animations. While a single frame can be spread across multiple computers to speed up rendering, you have to manually 'piece' the image back together afterwards. Not something I want to spend time doing on a regular basis.

Now we are back to the reason I want to build a multi-processor render server that I can just stick in an office with weeks of work queued, where it can render 24/7. To summarize, I want a machine that I can feed projects and have it constantly pumping out rendered images and videos with minimal demand on my time. In addition, I know that if I spend much more than what the basic Xeons or Opterons cost, it would become more economical to build a small render farm instead... not a path I want to go down, due to multiple issues, only some of which have been mentioned.
 

radnor



Hi, depending on what you work with, that last step usually isn't necessary. Can you tell us what the software is? The application might be cluster aware; most of these applications are. I'm not saying go down the Server 2008 R2/Red Hat path, but you would be saving a lot of money and getting a decent bang for your buck. And don't forget to buy a decent switch as well, with a decent backplane.

I say this because for the price of 2 MP CPUs (Opterons or Xeons) + the board + memory + a special PSU, you can easily buy 2 nodes. Hell, you can even go for 4 nodes if you test it first and tune the configuration. And while MP boxes have their advantages, they can't beat individual computers, especially if the load is big.

Just check whether it is cluster aware, and give us more details about the application/software. A cluster is scalable: just add more nodes if you want, and if one fails the work continues, albeit more slowly. If your big rendering box fails, work stops. Oh, and BTW, overclocking Xeons and Opterons in MP configurations is a big no-no; usually the boards don't support it.
 

Myflag

I am using Cinema 4D, which can handle up to 64 threads on each computer and render across an unlimited number of systems simultaneously.

The rest of the information you are asking about is in my opening post.

EDIT: I was considering overclocking the Xeons after I noticed the great performance increase that can be achieved with minimal effort. The motherboard linked below supports Xeon overclocking... though I would prefer not to overclock if there is a cost-effective alternative.

http://www.newegg.com/Product/Product.aspx?Item=N82E16813188070
 

radnor



Considering C4D already includes Net Render, it is really easy to set up a cluster. There are two applications: the server (not free) and the clients (free). Animations you don't need to stitch together; still images you do. (I couldn't find info on the latest versions.) Anyway, there is a shitload of information on the net about C4D / Net Render / clustering.

Net Render already load balances for you, and the nodes must be Windows boxes; any (tuned) Win 7 Pro install should be fine. From a quick read, it eats a bit of RAM but not much, and little HDD space. Honestly mate, a 4-way MP box can easily go for 7k+ USD/EUR. That 4-node cluster could be done for 1500 to 2000 dollars splurging a bit, or below 1500 tightening the belt a lot.

About the stitching, I came across some replies that talked about a fix, but they related to version 9 of Net Render, not version 11. Technically speaking it is much better to have a cluster, and even better considering that Net Render load balances for you; from what I've read, the network side isn't too demanding.

So, it is up to you. Remember that any rendering box should have the OS tuned, pared down as close to the minimum as possible. The last cluster I deployed was on Windows Server 2008 R2 Core.

So, you just really need to ponder that stitching situation and possible ways to solve it.


 

Myflag



The primary problem is that there is no easy way to stitch the images back together without demanding even more of my already thinly stretched time. For example, I know that I can build a 4-way MP box from Newegg for $3400, but I don't believe I would really want to spend that much on a single render server. Not when, for the same cost or even less, I could probably build four individual render stations with 40%+ more performance.

I believe I am going to go with the dual Xeon approach. In any case, I could always purchase another server later on if the need were to arise.

Thank you all for your assistance and advice.

Myflag
 

Cazalan

With that kind of varying render load you might want to use the "cloud" for the big jobs. Prices range from pennies per frame to $150/day for 64-CPU clusters.

I was also thinking a farm of overclocked i5 @ 4.5GHz boxes might work better. For the cost of just one Xeon E5645 CPU you can almost build a complete i5 system.

Other things to note for true server builds:
-) RAM is 2x or more expensive due to needing ECC
-) Overclocking is almost non-existent but you'll have 28.8GHz of CPU
-) Easier to manage 1 computer but you're paying a hefty premium for that

Now consider an i7-3930K system: that's 22.8GHz of CPU before any overclocking.
You could build 2 of these systems for the cost of the dual Xeon.

If stitching is that big an issue, I'm sure someone has a tool/script/app for that.
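For example, assuming each box renders a horizontal strip of the still, reassembling them is only a few lines with Pillow (file names and strip layout here are invented for the sake of the example):

Code:
# Stitch horizontal strips of a single frame back into one image with Pillow.
# File names and the strip layout are hypothetical; adjust them to however
# the distributed render names its partial outputs.
from PIL import Image

strip_files = ["frame0001_strip0.png", "frame0001_strip1.png",
               "frame0001_strip2.png", "frame0001_strip3.png"]

strips = [Image.open(path) for path in strip_files]
width = strips[0].width
height = sum(strip.height for strip in strips)

result = Image.new(strips[0].mode, (width, height))
y = 0
for strip in strips:
    result.paste(strip, (0, y))
    y += strip.height

result.save("frame0001_full.png")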
 

Myflag


I am going to do some research on Newegg and estimate prices of i7-3930K systems.

EDIT: Another issue I just noticed with the i7-3930K is that it is sold out on Newegg. While I don't need a render server immediately, it does make me wonder about future availability and whether I should just wait for the server-edition Sandy Bridge-Es that may be out in a month or two.

I'm also thinking about the overclocking performance review from this article... http://www.bit-tech.net/hardware/cpus/2011/11/14/intel-sandy-bridge-e-review/4

I'm now thinking that buying Xeons at the moment may not be the best idea... but if the Xeons are overclocked, they appear to provide about forty percent more performance than an i7-3930K.

Any opinions or advice community members?
 

Myflag

I must admit... the amount of conflicting information about dual Xeons vs the i7-3930K is significant. I look at one benchmark/review and it shows the i7-3930K almost on par with the dual Xeons, while another website shows the dual Xeons about 70% faster than the i7-3930K.

Any further advice will be greatly appreciated.

Thank you
 

radnor



Xeon vs i7? Simple. Xeons are just normal CPUs, ready for MP configurations, with a bit of extra microcode for some "special" operations (DBs, VMs) and a different warranty and sticker.

If you are doing Oracle, DB2, clustered web pages, Java VMs, VMs, or some more esoteric HPC work, the Xeons would win.
Maxon C4D uses only SSE2, if I recall correctly, which is really old. So for you, apart from the possibility of MP configurations, Xeon = i7. If the core count, frequency, and cache (L1, L2, L3) are the same, they will behave the same.

It is a hard decision you have to make, and I'm quite sure with limited money.

BTW: http://www.newegg.com/Product/Product.aspx?Item=N82E16819113036

Compatible MB : http://www.newegg.com/Product/Product.aspx?Item=N82E16813131643

Usually image and video work scales really well with cores. Why not 2x Interlagos with 32 cores/threads? Everything is for sale on Newegg. Just remember, for the OS, please go the server route.

Just make sure you pick something the size of my NOX Blaze X2, and with at least half the fans. Staying cool in 24/7/365 environments is really important for stability.
 
Cazalan, stop multiplying cores x GHz; that's like multiplying 4 wheels on a car going 20mph and saying it's going 80mph. CPU performance is measured in FLOPS. Cores and GHz are pretty irrelevant in this case; with everything being used at its full potential, it's all about actual performance.
 

Myflag

I've finally come to a conclusion on several of the dilemmas that have been plaguing me regarding the CPU choice.

The more research I do, the more I see that even Xeons overclocked to only 3.8GHz may be stable for just several days at a time... not counting the weeks of hassle involved in overclocking. My concern is that the i7-3930K is in a similar boat; even if it is not, I cannot imagine a CPU running at 4.5GHz+ being conducive to long-term 24/7/365 stability. As a result, I have a strong concern that overclocking, or even planning on overclocking, could be potentially catastrophic.

If I am wrong in any of the above information I ask to be corrected.

With no overclocking and long-term stability (24/7/365 at 100% load) as requirements, does that mean the 2x Opteron 6272 2.1GHz 16-Core setup would give me the most performance per dollar?
 

radnor



A 10 to 15% overclock shouldn't do much damage long term; the chips would be ready for it. A 2.2GHz part would run at 2.4GHz, for example, and a 3.3GHz part at 3.6GHz. You would get some free performance without affecting durability or uptime. Internal voltages wouldn't be changed, and the CPUs would mostly still run within spec.

Here is an Interlagos review on AnandTech; it also has rendering benchmarks. This was a pre-fix benchmark, so you can add 1-3% more performance with current OSes.

http://www.anandtech.com/show/5058/amds-opteron-interlagos-6200/11
 