How To: Building Your Own Render Farm

Making Slave Units For Your DAW

For those of you working with digital audio instead of 3D and 2D rendering, there are methods you can use to add slave systems to your primary DAW that can be especially helpful if you are trying to use a large number of virtual instruments for a live performance. Virtual instruments can range from synthesizers like Reaktor 5 to software samplers like HALion 3. Plug-in effects processors run the gamut from reverb and equalization to vocal tuning and other types of advanced audio processing. Soft-synths and effects tend to be processor-intensive. Software samplers, meanwhile, must move multiple streams of audio data from the hard drive through effects and out to your audio I/O; this makes them more I/O-intensive than 3D rendering, with more random disk accesses than something like video editing. Either type of software package can rapidly tax a system beyond its limits. Naturally, stacking the two together is even more demanding.
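To see why disk-streaming samplers stress storage, it helps to run the numbers. The figures below (voice count, bit depth, sample rate) are illustrative assumptions, not measurements of any particular sampler:

```python
# Rough disk-bandwidth estimate for a disk-streaming software sampler.
# All figures are illustrative assumptions, not measurements of any
# particular product.

def sampler_bandwidth_mib(voices, channels=2, sample_rate=44100, bytes_per_sample=3):
    """Sustained read rate in MiB/s for `voices` simultaneous streams."""
    bytes_per_sec = voices * channels * sample_rate * bytes_per_sample
    return bytes_per_sec / (1024 * 1024)

# 128 stereo voices of 24-bit/44.1 kHz audio:
print(round(sampler_bandwidth_mib(128), 1))  # prints 32.3
```

Roughly 32 MiB/s is well within a single drive's sequential throughput, but those streams come from many different files at once, so random-access latency matters at least as much as raw bandwidth.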

The easiest way to create a slave for your DAW is to buy an off-the-shelf device, such as the Muse Research Receptor. Pick one up, put it on your network, install your plugins on it, and then offload them from your primary system. This configuration yields an immediate performance benefit, with no tweaking needed to get it all working. This is great for a musician who isn't so technically-inclined. However, Receptor units are a bit pricey, especially for the hardware you’re getting. In essence, a standard Receptor 2 is a single-core 2.7 GHz Linux box with 4 GB of RAM, a 250 GB hard drive, audio interface, and the vendor’s proprietary Linux VST host. Total cost? Roughly $2,000. The Receptor 2 Pro offers a dual-core unit and a larger hard drive, starting at about $600 more.

For more technically minded enthusiasts on a budget, it might make sense to look for another solution. Without the audio card, the Receptor 2 base unit is really just a sub-$400 PC, while the Receptor 2 Pro consists of less than $600 in hardware.

If you're willing to lose the dedicated controls on the front, you can apply the same approach we've used for building render nodes to building a slave for your DAW. Install FX-Max's FX Teleport on the node, and from your primary DAW you can add plugins that will run on the node. The node doesn't need any fancy audio I/O options because FX Teleport streams the audio over the network and back to your DAW software.
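The core idea behind a networked plugin host is simple: the master sends each audio buffer to the node, the node runs the plugin, and the processed buffer comes back. The sketch below shows one toy way to frame such buffers; it is not FX Teleport's actual wire format, just an illustration of length-prefixed float32 blocks as they might travel over a TCP stream:

```python
import struct

# Toy framing for shipping one audio buffer to a slave node and back.
# This is NOT FX Teleport's actual wire protocol -- just a sketch of the
# idea: length-prefixed blocks of float32 samples.

def pack_block(samples):
    """Prefix a block of float32 samples with a 4-byte sample count."""
    payload = struct.pack(f"<{len(samples)}f", *samples)
    return struct.pack("<I", len(samples)) + payload

def unpack_block(data):
    """Parse one length-prefixed block; returns (samples, bytes_consumed)."""
    (count,) = struct.unpack_from("<I", data, 0)
    samples = list(struct.unpack_from(f"<{count}f", data, 4))
    return samples, 4 + 4 * count

# Round-trip a 512-sample buffer, as master and node would do per block:
buf = [0.0] * 512
wire = pack_block(buf)
out, used = unpack_block(wire)
assert out == buf and used == len(wire)
```

At typical DAW buffer sizes (a few hundred samples per block, dozens of blocks per second), even a 100 Mbit LAN has headroom for several such streams, which is why a plain Ethernet connection suffices for the node.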

There are also other solutions, such as ipMIDI, which let you send MIDI over Ethernet or synchronize multiple DAWs for a fraction of the cost of the Muse Research systems. Note, however, that this approach likely means buying additional licenses for your DAW software and VST plugins, while using FX Teleport apparently does not. From there, you can easily build a much lower-priced machine that far outstrips the performance of the available Receptor systems. Quad-core DAW slaves, anyone?
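Under the hood, ipMIDI-style drivers simply put raw MIDI bytes into UDP multicast datagrams on the LAN. A minimal sketch of that idea follows; the multicast group and port shown are assumptions based on ipMIDI's commonly documented defaults, so verify them against your own driver's settings:

```python
import socket

# Sketch of MIDI-over-Ethernet the way ipMIDI-style drivers do it: raw
# MIDI bytes in a UDP multicast datagram. Group/port are assumptions --
# check your driver's configuration.
IPMIDI_GROUP = "225.0.0.37"   # assumed multicast group
IPMIDI_PORT = 21928           # assumed port for MIDI port 1

def note_on(channel, note, velocity):
    """Build a 3-byte MIDI note-on message (channel is 0-15)."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def send_midi(message, group=IPMIDI_GROUP, port=IPMIDI_PORT):
    """Fire one MIDI message at every listener on the LAN segment."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(message, (group, port))
    sock.close()

# Example (needs a LAN with a listening slave):
# send_midi(note_on(0, 60, 100))  # middle C, channel 1
```

Because multicast reaches every machine subscribed to the group, one master can drive several slave DAWs at once without configuring individual connections.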

  • borandi
    And soon they'll all move to graphics cards rendering. Simple. This article for now: worthless.
    Reply
  • Draven35
    People have been saying that for several years now, and Nvidia has killed Gelato. Every time that there has been an effort to move to GPU-based rendering, there has been a change to how things are rendered that has made it ineffective to do so.
    Reply
  • borandi
    With the advent of OpenCL at the tail end of the year, and given that a server farm is a centre for multiparallel processes, GPGPU rendering should be around the corner. You can't ignore the power of 1.2TFlops per PCI-E slot (if you can render efficiently enough), or 2.4TFlops per kilowatt, as opposed to 10 old Pentium Dual Cores in a rack.
    Reply
  • Draven35
Yes, but it still won't render in real time. You'll still need render time, and that means separate systems. I did not ignore that in the article, and in fact discussed GPU-based rendering and ways to prepare your nodes for it. Just because you may start rendering on a GPU does not mean it will be in real time. TV rendering is now in high definition (finished in 1080p, usually), and rendering for film is done in at least that resolution, or 2K-4K. If you think you're going to use GPU-based rendering, get boards with an x16 slot and riser cards, then put GPUs in the units when you start using it. Considering software development cycles, it will likely be at least a year before a GPGPU-based renderer written in OpenCL is available from any of the 3D software vendors (i.e., SIGGRAPH 2010). Most 3D animators do not and will not develop their own renderers.
    Reply
  • ytoledano
While I never rendered any 3D scenes, I did learn a lot about building a home server rack. I'm working on a project which involves combinatorial optimization and genetic algorithms - both need a lot of processing power and can be easily split across many processing units. I was surprised to see how cheap one quad-core node can be.
    Reply
  • Draven35
Great, thanks - it's very cool to hear someone cite another use for this type of setup. Hope you found some useful data.
    Reply
  • MonsterCookie
    Due to my job I work on parallel computers every day.
I've got to say: building a cheapo C2D might be OK, but nowadays it is better to buy a cheap C2Q instead, because the price/performance ratio of the machine is considerably better.
However, please DO NOT spend more than 30% of your money on useless M$ products.
Be serious, and keep cheap things cheap, and spend your hard-earned money on a better machine or on your wife/kids/beer instead.
    Use linux, solaris, whatsoever ...
    Better performance, better memory management, higher stability.
    IN FACT, most real design/3D applications run under unixoid operating systems.
    Reply
  • ricstorms
Actually, I think if you look at a value analysis, AMD could give a decent value for the money. Get an old Phenom 9600 for $89 and build some ridiculously cheap workstations and nodes. The only thing that would kill you is power consumption; I don't think the 1st-gen Phenoms were good at undervolting (of course, they weren't good at a whole lot of things). Of course the Q8200 would trounce it, but Intel won't put their quads south of $150 (not that they really need to).
    Reply
  • eaclou
    Thanks for doing an article on workstations -- sometimes it feels like all of the articles are only concerned with gaming.

    I'm not to the point yet where I really need a render farm, but this information might come in handy in a year or two. (and I severely doubt GPU rendering will make CPU rendering a thing of the past in 2 years)

    I look forward to future articles on workstations
    -Is there any chance of a comparison between workstation graphics cards and gaming graphics cards?
    Reply
  • cah027
    I wish these software companies would get on the ball. There are consumer level software packages that will use multiple cpu cores as well as GPU all at the same time. Then someone could build a 4 socket, 6 GPU box all in one that would do the work equal to several cheap nodes!
    Reply