2012 Render Farm Magic

February 22, 2012 11:38:22 PM

I'm looking to build a 6 node render farm for 3D animations out of C4D, Maya, and some composites out of After Effects. And I want to build ALL of it inside an Ikea Helmer Cabinet!

Inspiration:
http://www.braverabbit.de/playground/?page_id=630

Pretty dope! Except for the fact that their build is a Hackintosh system (which I am NOT doing; I'm going all Windows). Their list of components is as follows:

- Gigabyte H61N-USB3, Intel H61 chipset, ITX form factor

- Intel Core i7-2600k 4×3.4GHz 8MB-L3 Sock1155 (SandyBridge) – Boxed

- G.Skill DIMM 2×8 GB DDR3-1333

- Western Digital WD1600BEKT 160 GB 7200rpm 2.5inch

- SFX Power 350W

- Coolink SWiF2-801 fan (80mm x 80mm x 25mm)

Since I'm not going the Hackintosh route, what would be a better build for a Windows-based render farm? My goal is to have each node come out between $600-$800. I don't need a monitor or the casing obviously as I'm building it in the Ikea Helmer Cabinet. Any advice/tips/tricks would be useful! I'm excited to build this thing!

Cheers,
Phil


February 23, 2012 12:54:24 AM

Well, you could go P67/Z68 and overclock the 2600Ks. I'm not sure how the cooling works here, but if you can figure something out you can get a whole bunch of extra performance for free.
February 25, 2012 2:58:08 AM

If you are going to overclock you could use a different Ikea case and give your hardware a little more breathing room. This is my farm in the bottom half of an Ikea PS cabinet.



Master (x1): Asus M4A87TD/USB3 | AMD Phenom II x6 1100T | G.Skill Sniper 1333mhz 4x4GB | Corsair CX600v2 | 2 - WD Black 1TB in RAID0 | 1 - WD Green 2TB backup | ASUS DRW-24X DVD | PowerColor HD 5450 1GB
Slave (x3): Asus M4A87T | AMD Phenom II x6 1100T | G.Skill Sniper 1333mhz 4x4GB | Corsair CX430v2
Misc: Netgear GS108 Switch | Debian Squeeze | Ikea PS Case | CM Hyper 212+ HS
March 2, 2012 6:08:14 PM

MikeOnBike, that looks CRAZY! Good job, man.

Yikes about the Ikea case. Do you think if I take off some panels and cut some more holes I would be okay? Or do you think the case I have is going to be a big issue? It'd be awesome if I don't have to get another case/find the space for a bigger case, but if you think that it would be rather destructive then obviously I'd have to change things up. Maybe add an extra fan on the top to suck out any lingering hot air?

At Kajabla,
Thanks for the input, I went with a Z68 mother board, and I'm looking into over clocking the 2600k, though I've never done this before so I'm going to start up the system and get it working first and maybe try that at a later date once I know things are working smoothly.

This is really exciting, as I've never done anything quite like this before. MikeOnBike, how did setting up your render farm go? Did you have any kinks to work out that you could give me advice on? It'd be dope to talk to you once I get these parts and chat about the things you did and whatnot. Much appreciated, guys!
March 2, 2012 8:31:13 PM

Well, I'm not 'really' worried about your case. It's just that you mentioned overclocking. Extreme overclocking generates a LOT of heat. Most of the Helmer projects I have seen don't mention overclocking, or state they don't intend to. You might be able to get away with a moderate overclock.

Your CPU has a 95 W TDP; mine is 125 W, so I potentially have about 32% more heat to dissipate than you do at stock speed. Once you get it built, some experimentation will let you know what you can do with the Helmer case. If you do want a high overclock, you will probably need room for bigger fans, or be willing to deal with the noise of more fans. I'm using a 140mm/1200rpm fan for each node. I might do some mild overclocking, but it will depend on what my summer temps allow given my house temp, workload, and how well my case breathes. I reduced my CPU temp by 10°C, down to 40°C, by swapping the stock fan for the CM Hyper 212+.
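Quick sanity check on that heat figure (keeping in mind TDP is only a rough proxy for actual heat output, not a measurement):

```python
# Back-of-the-envelope heat comparison from the TDP figures above.
i7_2600k_tdp = 95    # W, Intel Core i7-2600K at stock
x6_1100t_tdp = 125   # W, AMD Phenom II X6 1100T at stock

extra_heat_pct = (x6_1100t_tdp - i7_2600k_tdp) / i7_2600k_tdp * 100
print(f"The Phenom dissipates about {extra_heat_pct:.0f}% more at stock")
```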

I haven't set up the rendering yet. My son is headed to Virtual Technology and Design school and has started to mess with Blender. I need to get it installed on the nodes.

Actually I need to get the other three nodes running this weekend. My slaves are diskless and headless. I need to configure the master to support netbooting of the slaves.
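Netbooting diskless Debian slaves is typically done over PXE, and dnsmasq can serve both the DHCP and TFTP sides from the master. A minimal sketch only; the interface name, address range, and paths are assumptions, not the actual config from this build:

```conf
# /etc/dnsmasq.conf on the master -- DHCP + TFTP for PXE-booting the slaves
interface=eth0                          # LAN-facing interface (assumed)
dhcp-range=192.168.1.100,192.168.1.110,12h
enable-tftp
tftp-root=/srv/tftp                     # holds pxelinux.0, kernel, initrd
dhcp-boot=pxelinux.0
```

The slaves would then mount their root filesystem from the master over NFS (e.g. `root=/dev/nfs` on the kernel command line).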

I have a list of Helmer projects I'll post here this weekend. You might be able to find a Wintel project that is similar to what you want to do. I think your hardware setup would be similar to the Hackintosh build, but your software install will be different. This cluster will also run some AI and science projects.

What did you end up with for hardware? It sounds like you have purchased some since your first post.

Ping me at my email if you want to chat about what I'm doing. owyheesage at gmail

March 6, 2012 4:42:21 PM

Blender should be fun! I've messed around with it a fair bit. That sounds awesome. The hardware I ended up going with was:

- i7 2600k
- Zotac Z68 ITX (I wanted the ASRock Z68M-ITX, which had better reviews, but I couldn't source it in time)
- RipJaws 2X8 GB x series ram
- WD Scorpio Black 160GB SSD
- Dynex 400W PSU

The Zotac Z68 was a nightmare! The mini DisplayPort was faulty, there's very little support, and it took me a while to get through all the error codes. But alas, it is up and running now and currently installing After Effects. This is a very exciting moment for me. I feel like a father.

Thanks for the tip about overclocking. I think I'm going to hold off until I get an actual render complete. One step at a time for me here.
Another question I had was how you actually manage your render farm. What's the best way to go about it, as you obviously won't have a monitor connected to each one? Is there a good remote desktop management software you use to install updates/plug-ins, etc.? Or do you just hook up a monitor to each one every time you need to change something? I'm in the dark as to how to actually manage a cluster like this.
March 6, 2012 10:03:29 PM

The Scorpio is not an SSD. It's a hard drive.
March 10, 2012 4:12:08 AM

Aye! HD, not SSD!
Why did I write SSD?
March 14, 2012 7:06:13 PM

MikeOnBike,

That would be awesome! And of course do what you gotta do!
That blog post is very cool by the way. Definitely geeked out with some of my colleagues checking it out.
August 22, 2012 5:02:40 AM

This is a great help. I am about to invest in a Helmer farm and am choosing hardware at the moment. I'm thinking in advance about cable management, so maybe sacrificing the 6th node will be better for me if I can hide all those cables, the switch, and the surge protector there, and maybe use the 6th node's space as extra ventilation room with 2 more fans.

Anyway, I will post more news as I buy more stuff. Glad to see people are building this on their own!

Art
August 24, 2012 8:11:21 PM

My clusterF project has been converted to Win7 to run 3DS, but it is currently idle while I focus on keeping my job amid more layoffs. I might have to put it up for sale, since it has seen little use since it was completed.

http://coyoteridgeobservatory.blogspot.com/2012/03/clus...


Philip, how is your project? Pictures?
April 21, 2013 12:52:04 PM

"What's the best way to go about it, as you obviously won't have a monitor connected to each one? Is there a good remote desktop management software you use to install updates/plug-ins, etc.? Or do you just hook up a monitor to each one every time you need to change something? I'm in the dark as to how to actually manage a cluster like this."

The most straightforward hardware approach is simply to use a KVM switch. They're very cheap nowadays and very simple to use.
Software-wise, Remote Desktop is built into the Professional versions of Windows XP/7/8 (and up). Google Chrome also has a browser-based plug-in (Chrome Remote Desktop) that lets you reach a desktop remotely over the internet. See what fits best for you. ;)
April 21, 2013 2:01:59 PM

Jus Sayin said:
"What's the best way to go about it, as you obviously won't have a monitor connected to each one? Is there a good remote desktop management software you use to install updates/plug-ins, etc.? Or do you just hook up a monitor to each one every time you need to change something? I'm in the dark as to how to actually manage a cluster like this."


VNC is another good tool for connecting to/from Windows and Linux systems.
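Since the slave nodes are headless, it also helps to know which ones are actually reachable before launching VNC or RDP. A minimal sketch; the node names are placeholders, and the port would be 5900 for VNC or 3389 for RDP:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical node names -- replace with your farm's hostnames or IPs.
for node in ["node1", "node2", "node3"]:
    status = "up" if port_open(node, 5900) else "down"
    print(f"{node}: {status}")
```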