CGI Rendering / RTS Gaming Workstation: E5-2600 V2 Xeon or AMD?

luci5r

Distinguished
I'm building a CGI Rendering & RTS Gaming Workstation, which will feature a Dual CPU configuration with a high-end GPU.

The workstation will serve two main purposes:

  • CGI/Rendering: Mainly Maya, zBrush, and RealFlow for development & modeling, with Maxwell for rendering. Maxwell is a CPU-only renderer but can utilize unlimited cores/threads. GPU usage is very minimal - only for viewport display purposes, which any good gaming card can handle.
  • RTS Gaming: No FPS/shooters/Crysis etc.! Only RTS games like StarCraft II, Diablo III, Company of Heroes, etc. I want to play these games on max settings. These games generally use both GPU & CPU extensively, for which I'll be going with either a GTX 780, 2 x 9950 in CF, or something like that.

This was going to be an Intel build, and I had selected the E5-2640 V2 (8-core) CPU, which is expected to release sometime in September. However, someone I know suggested that I should also look at AMD. My understanding has been that Intel is better for rendering, especially if the renderer supports multi-threading and doesn't have a cap on the number of cores. But I could be wrong & that's why I'm posting this.

My budget is around $1K per chip; I'm installing 2 chips, so about $2K total. The E5-2640 V2, with 8 cores/16 threads, fits this perfectly and should release within a month priced around $1,000.

Is there any reason I should be looking at AMD instead? If yes, I'm completely out of touch with AMD - no idea what's currently available on the market or what's about to come out. My build won't be ready till Oct-Nov anyway, since the AMD Radeon HD 9950/9970 are expected to appear around then and they are more than likely the GPU I'll be going with (I have a separate thread about this here in the GPU section).

Please advise.

Thanks!!
 

elemein

Honorable
You would only go with AMD if price were an issue. The FX series is better for your purpose when compared to something similarly priced in the consumer market, like an i3 or i5.

Xeons are an entirely different ball game. They're better.
 
Solution

luci5r

Distinguished


For me, performance is far more important than price. I do have a budget, like anyone else, but if the case for AMD comes down to price being an issue, then I'll stick with Xeon.

My understanding was also very similar to what you summed up in your last sentence - Xeons are better. I'm not sure why my friend asked me to look into AMD for this. Really not sure.

Thanks though - it's always good to have your understanding reaffirmed & backed by knowledgeable people.

 

elemein

Honorable


The higher TDP allows them to run at "full throttle" for longer without being throttled at all, assuming cooling is adequate. That is why they are usually server chips: they get stressed heavily for long periods of time, but cooling is rarely an issue because server operators spend big money on cooling multiple racks at a time.
 

genz

Distinguished


That sounds to me like an oxymoron. Let me illustrate.

Wikipedia:
The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate.

Now, basic overclocking knowledge will tell you that increasing voltage is what generates more heat, not clocks. So a higher TDP means you need more cooling, not less. I don't see how it connects to throttling over time, since that would be a direct product of the heat dissipated, which is harder to do if you have a higher TDP to start with.
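To put rough numbers on that, here's a minimal sketch of the textbook dynamic-power approximation P ≈ C·V²·f; the capacitance, voltage, and clock figures below are made up purely for illustration, not taken from any real chip:

```python
# Rough dynamic-power model: P = C_eff * V^2 * f
# All constants below are invented for illustration, not measured from a real chip.

def dynamic_power(c_eff, volts, freq_ghz):
    """Approximate switching power (watts) for an effective switched
    capacitance c_eff (farads), core voltage (volts) and clock (GHz)."""
    return c_eff * volts ** 2 * freq_ghz * 1e9

C_EFF = 2.0e-8  # assumed effective capacitance, picked so the baseline lands near ~85 W

baseline   = dynamic_power(C_EFF, 1.10, 3.5)
more_clock = dynamic_power(C_EFF, 1.10, 4.2)   # +20% frequency, same voltage
more_volts = dynamic_power(C_EFF, 1.32, 3.5)   # +20% voltage, same frequency

print(f"baseline    : {baseline:6.1f} W")
print(f"+20% clocks : {more_clock:6.1f} W ({more_clock / baseline:.0%} of baseline)")
print(f"+20% voltage: {more_volts:6.1f} W ({more_volts / baseline:.0%} of baseline)")
```

Voltage enters squared, so a 20% voltage bump costs roughly 44% more switching power, while a 20% clock bump alone costs about 20% more; a real overclock raises both, which is why the heat climbs so quickly.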

In short, how exactly does a higher-TDP chip run at 'full throttle' longer than its equivalent Core i7 with the same cooling? Saying it can but with more cooling is nonsensical, because with more cooling the i7 would run longer at 'full throttle' too.

Not trying to disprove your point really, just trying to understand it.
 

elemein

Honorable


You're exactly right.

TDP = Thermal Design Power.

Now, we know that heat can certainly be measured in Watts, correct?

So, let's take two CPUs of exactly equal clock; let's just say, for the hell of it, two i5-3570Ks @ 4.5 GHz (let's say Turbo is on, and SpeedStep and all those temperature/throttling mechanisms are on).

Let's say one has the standard 77W TDP, and the other has 100W TDP.

Now, let's say you stress each one, all 4 cores @ 100%. Eventually, even if both CPUs are kept at the exact same temperature (let's say you have some kind of amazing cooling and they're both at 40C at max load), the 77W CPU WILL momentarily throttle the clock throughout the run to keep within its TDP, as TDPs are usually set for "situations"; it doesn't care that it's running at only 40C, it expects to be in a consumer PC that can only consume X amount of watts and generate X amount of heat. That is why Intel Atoms normally don't run above 40C very often (unless you count Intel Burst Technology on Clover Trail+ platforms), even though 40C isn't dangerous whatsoever to the chip and, hell, the tablet isn't even getting any more than a little warm at that point. It throttles itself that far to save power; it is made for a certain situation.

Now, of course, the 100W i5-3570K will throttle less, as its specification allows it to consume more power and generate more heat.
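To make that concrete, here's a toy simulation of two otherwise identical chips chasing the same 4.5 GHz target under different power budgets; the watts-per-GHz figure, the step size, and the linear power model are all invented for illustration, not real i5-3570K behaviour:

```python
# Toy model of power-limit (TDP) throttling -- temperature is assumed fine throughout.
# All numbers are invented for illustration, not measured i5-3570K data.

WATTS_PER_GHZ = 22.0      # assumed: package power scales linearly with clock
TARGET_GHZ    = 4.5       # both chips want to run here
STEP_GHZ      = 0.1       # one "speedstep" increment

def run(tdp_watts, seconds=60):
    clock = TARGET_GHZ
    work_done = 0.0                       # giga-cycles completed
    for _ in range(seconds):
        power = clock * WATTS_PER_GHZ
        if power > tdp_watts:             # over the power budget -> step down
            clock = max(0.0, clock - STEP_GHZ)
        elif clock < TARGET_GHZ:          # headroom available -> step back up
            clock = min(TARGET_GHZ, clock + STEP_GHZ)
        work_done += clock                # one second of work at `clock` GHz
    return work_done, clock

for tdp in (77, 100):
    work, final_clock = run(tdp)
    print(f"{tdp:3d} W budget: settles around {final_clock:.1f} GHz, "
          f"~{work:.0f} Gcycles of work in 60 s")
```

Same temperature assumption, same workload, but the 77W budget ends up bouncing around roughly 3.5 GHz while the 100W budget holds the full clock for the whole run, which is the "throttles less" point in miniature.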

A great example of TDP in action: ( http://www.legitreviews.com/evga-geforce-gtx-780-ti-kingpin-6gb-video-card-made-overclocking_130114 ) This card has NO TDP limit, meaning it will not exhibit any throttling unless it hits a thermal safety mechanism (because this is 2013, after all). This is basically like candy to an overclocker.

Now, speaking of overclocking; you brought up a valid point. "But what about overclocking that raises voltage, temps, and clocks?"

Wonderful question!

Though tell me: what overclocker raises their clocks, voltage, and temps significantly while leaving things like AMD Cool'n'Quiet and the like turned on? Very few. And if they do, they see a lot of throttling (which is why folks who overclock without turning these features off always wonder why they cannot maintain their clocks...)

So, with most of these thermal/safety/whatever-you-wanna-call-them-this-month mechanisms OFF, an overclocked CPU can certainly run healthily above its TDP to a certain extent, unless it comes across a hard thermal safety limit (again... this is 2013).

Now, you may ask: "I never really notice my clocks being throttled even if I stress my CPU @ 100% for an hour." Well, the thing is, it does happen. Period. (The idea is that consumers almost never need every single execution unit 100% of the time... I mean, who runs AVX instructions alongside BMI instructions alongside integer instructions alongside all the other ISA extensions of the rainbow @ 100% for an hour straight and ISN'T a server? Not many applications, I tell you. A stress test certainly won't.) However, in most specifications, the manufacturer is smart enough to match clock limits to TDPs. This is why you see 17W TDP laptops with much lower clocks than a 35W TDP laptop, which in turn has much lower clocks than a 77W desktop part. Of course, without overclocking (in which case you'd disable half the thermal throttling crap the manufacturer tacks on), it's rare that a normal consumer ever hits that TDP limit hard enough to be throttled down so drastically that they notice a massive speed reduction. Throttles usually aren't massive downclocks; there are so many speedsteps in a CPU it's a little ridiculous (IIRC, Llano 35W laptops had 11... I of course only ever used B0, the boost one, where I overclocked usually. :ange: ). Throttles are normally only maybe 300-500 MHz at a time, which, at some 4.0 GHz, is truly a little hard to point out and say "Throttling! I see it!"
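If you want to catch those small 300-500 MHz dips yourself, a crude logger like the sketch below works on Linux boxes that expose the cpufreq sysfs interface; the node path, core number, sample rate, and 300 MHz threshold are just my assumptions to adjust for your own machine:

```python
# Crude clock-speed logger for Linux machines that expose the cpufreq sysfs interface.
# The node path, core number, sample rate, and 300 MHz threshold are assumptions --
# adjust them for your own box.
import time

FREQ_NODE = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

def current_mhz():
    with open(FREQ_NODE) as f:
        return int(f.read().strip()) / 1000.0   # the file reports kHz

if __name__ == "__main__":
    baseline = current_mhz()
    print(f"baseline: {baseline:.0f} MHz")
    for _ in range(300):                         # roughly 30 seconds of samples
        mhz = current_mhz()
        if baseline - mhz >= 300:                # a 300+ MHz dip = a throttle step
            print(f"dip: {mhz:.0f} MHz ({baseline - mhz:.0f} MHz below baseline)")
        time.sleep(0.1)
```

Run a stress test in another terminal while this is going and the occasional small dips show up, even though the desktop feels perfectly smooth.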

That is, of course, until you hit a thermal safety limit at some 85C, at which point the CPU will effectively drop out of the sky and limit itself to awful performance, where you can see the difference as clear as day. Asus VivoBook X202 owners know what I mean; the damn thing had awful thermal design. Samsung Series 5 535U3C owners will know what I'm talking about too... Damn thing.

So yeah; basically: Higher TDP = more performance.

If you have any other questions, feel free to ask!