To me these look nearly identical (with some minor differences, mainly to do with the Xeon being dual-CPU capable and having a larger memory capacity/address space). This is, of course, putting aside the fact that there would be TWO Xeons but only ONE Core Extreme.
Furthermore, when looking at "modern" dual-Xeon motherboards, such as the EVGA SR-2 among others, I get the impression that it is no longer the case that such motherboards are "strictly for server usage". They now appear to come with exactly the same (gaming/desktop-related) features as the Core motherboards (such as multiple PCIe x16 slots with SLI capability etc.)
What I do NOT need to know is this:
- How fast is one compared to the other etc.
- Which one is cheaper / better value for money etc.
What I DO need to know is this:
Would a dual-Xeon based configuration tend to have me run into "software trouble" of any kind - be it with installing and running Windows smoothly, or with DirectX / Direct3D not working properly with games, or any such "lack of compatibility"?
... Or is the only argument AGAINST a dual-Xeon configuration the one I've heard a number of times: "You probably won't be able to exploit all the power of the dual Xeons in most applications / games" (this being something I can live with, while unnecessary problems aren't)..?
Also, am I overlooking something, when I say that the motherboards for these different CPU's look as if they have much the same features (say, OC features in the BIOS or other gaming/enthusiast features a nerd like me might want/like to have)..?
Would a dual-Xeon based configuration tend to have me run into "software trouble" of any kind - be it with installing and running Windows smoothly, or with DirectX / Direct3D not working properly with games, or any such "lack of compatibility"? COLGeek: Probably not an issue, as drivers and apps are now designed to work with multi-core systems much better than in years past. You would need to run a version of Win 7 that can actually use the second CPU, like Ultimate x64. I don't see an issue here for modern hardware and software.
... Or is the only argument AGAINST a dual-Xeon configuration the one I've heard a number of times: "You probably won't be able to exploit all the power of the dual Xeons in most applications / games" (this being something I can live with, while unnecessary problems aren't)..? COLGeek: This is likely true to some degree; many apps/games can't fully take advantage of more than 3 cores today. That trend will improve over time, but with the amount of computing horsepower you are describing, you aren't likely to use it all a lot of the time. BTW, have you seen the latest Maximum PC Dream Machine? Very similar basic config to the system you are pondering.
Also, am I overlooking something, when I say that the motherboards for these different CPU's look as if they have much the same features (say, OC features in the BIOS or other gaming/enthusiast features a nerd like me might want/like to have)..? COLGeek: You need to consider the "costs" to run such a machine. The economic cost to build AND operate such a system will far exceed your "normal" gaming rig. This sort of config will also generate tons of heat (and potentially noise). Your electric bill is also sure to take a whack.
While it's true that most applications don't use more than 2 cores (except video and audio encoders), Windows itself is a multi-threaded OS and will benefit.
With a system like the Dual Xeon rig you could have many multiple apps open (I mean MANY) at the same time with no lag.
I use an older Duallie Xeon and love it. Even though it's an older arch, it benchmarks like a C2D.
They call it "Creamy Xeon Goodness" because of how smooth a system like that is.
Check out 2CPU.com. Those forums specialize in Duallie rigs.
If you got the money GO FOR IT.
Once you go Duallie you don't go back!
If I got it right, that would be 8c/16t, or 16 logical processors, which with the right OS (Server 2008 or 7 Ultimate) would show up as 16 CPUs in Task Manager?
I'm weird, but looking at that alone would be worth it LOL
Since my first post I have also emailed EVGA about this, and like you they too assured me that their SR-2 motherboard would run any modern OS/application without trouble, and furthermore that it is in fact particularly suited to my needs, assuming these are for a combination of high-end gaming rig and serious graphics workstation (which is close enough, give or take a few magnitudes of performance).
[They also implied that you can't use two PSU's, unless they're made for it (which is kind of obvious, now that I come to think about it, considering the construction of modern PSU's). This might be a problem for a rig that would conceivably require in excess of 1600W at full load (not "peak" mind you, but continuously). However, that little problem is for a completely different thread...]
I am now sufficiently convinced that the concept is in fact possible, to leave me with just the small matter of whether to spend an obscene amount of money on a rig that is way more powerful than what I actually need. To that end, I've done a quick "guesstimate", by throwing together some likely components in various shopping baskets around the internet and adding their totals...
... Even when assuming the "next best thing", with regard to the speed of the Xeons and the number of nVidia cards, but otherwise a pretty top-end configuration, the result is still a quite perverse amount, which is to say somewhere in the vicinity of 9000 USD!!!
On a different note, in relation to king smp's latest post I can't help but provide a small correction:
Nope, that would be 12c/24t - Latest Xeons have SIX cpu cores
(And on top of that, the SR-2 can run up to FOUR cards at x16
... Which would be no less than 1920 physical gpu/cuda cores)
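For anyone double-checking the correction, the arithmetic works out in a couple of lines (the per-part counts are the ones quoted in the thread: six cores per Xeon, Hyper-Threading, 480 CUDA cores per GTX 480):

```python
# Quick sanity check of the core math above (counts taken from the thread).
xeon_cores = 6          # six-core Westmere-class Xeon
cpus = 2                # dual-socket SR-2
threads_per_core = 2    # Hyper-Threading

physical = cpus * xeon_cores
logical = physical * threads_per_core
print(f"{physical}c/{logical}t")         # 12c/24t

cuda_cores_per_card = 480  # GTX 480
cards = 4                  # SR-2 maxed out at x16
print(cards * cuda_cores_per_card)       # 1920
```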
Dang, I may be weird too, but I'm SOO tempted to go ahead with this project, even if it is currently only in a few benchmark applications that you can really see that much computing power at work...
Just to tickle your collective imaginations, here are some of the components I consider using:
mobo: EVGA SR-2.
cpu: 2x Intel Xeon 6-core @ 3.3 GHz (possibly even faster, so far available up to 4+GHz).
gfx: 2x (or even 4x) EVGA nVidia GTX 480.
mem: 2x 12GB (or even 24GB) Corsair Dominator DDR-3 @ 1333 (or faster, if available).
dsk: 2x 250GB Intel SSD drives (for OS/applications/swap, storage is handled by my NAS).
[All peripherals, such as BluRay drive, floppy drive, card reader, etc. will be external USB or eSATA devices = No front-bay devices in the case at all.]
... And now to the really fun stuff:
case: Mountain Mods U2-UFO Crystal Ship (custom fabricated w/o 5.25" bays or fan-holes in front).
- Water-cooling throughout (cpu, ram, mobo full-cover, gfx cards full-cover, psu as well if possible).
- I consider using Koolance blocks throughout, mainly since it would look good (coordinated at least).
- Might deviate from that plan though, depending on how they tend to perform in reviews.
- Two or three Laing pumps running separate loops (with shared radiators/reservoir).
- Probably D5's, but I might go with DDC's for various reasons.
- One for CPU0 + RAM0, One for CPU1 + RAM1 and one for GPUs + Mobo.
- Two sets of two high-flow triple-fan radiators (w. fans sandwiched between them).
- 120mm Low-noise fans throughout (w. rubber gaskets and mountings).
- Temperature probes on all relevant parts and relevant places in the water loops.
- Everything connected to a Koolance multi-fan/pump controller and temperature regulated.
- Some kind of discrete case-light, likely UV cathodes + UV reactive molex, shrouds and water.
- Mechanical flow-meters (water-wheels) in strategic places.
Finally, there is the issue of the power supply. Ideally I would LOVE to use the fantastic-looking, oil-immersed and water-cooled 1000W PSU from Koolance, but alas, 1kW is not nearly enough, I fear, and my (quite optimistic, looking back) plan to use TWO of these is obviously a no-go.
EVGA has recently introduced a PSU especially made for the SR-2 motherboard, but I find it hard to believe that its 1200W will be sufficient, if one were (theoretically) to fill the board to capacity with power-hungry gfx cards (which would be more than 1000W already, even before adding the Xeons, not to mention everything else..?)
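To put some rough numbers behind that worry, here is a back-of-the-envelope budget. The TDP figures are nominal spec-sheet numbers (250W for a GTX 480, 130W for a six-core Xeon); the "other" overhead and the PSU efficiency are my own guesses, so treat the totals as ballpark only:

```python
# Rough power budget for a maxed-out SR-2 build.
# TDPs are nominal spec values; 'other' and 'efficiency' are guesses.
gpu_tdp = 250        # GTX 480 TDP (W)
cpu_tdp = 130        # six-core Xeon TDP (W)
gpus, cpus = 4, 2
other = 150          # guess: mobo, RAM, SSDs, pumps, fans (W)

dc_load = gpus * gpu_tdp + cpus * cpu_tdp + other
efficiency = 0.87    # assumed PSU efficiency at this load
wall_draw = dc_load / efficiency

print(f"DC load : {dc_load:.0f} W")    # 1410 W
print(f"At wall : {wall_draw:.0f} W")  # ~1621 W
```

Note that a PSU's rating refers to its DC output, so it is the ~1400W figure that has to fit inside the PSU's spec; the wall-side number is what the electricity meter sees. Either way, a 1200W unit looks tight for a fully populated board.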
Well, I guess I can find some PSU or other, if not the one from EVGA, with good enough specs to handle a continuous load of 1600+ watts, even if it is unlikely to be available in a water-cooled version. That is, of course, not at all necessary, or even particularly advantageous; I just like to water-cool as many components as I possibly can, as long as it makes at least some sense to do so. After all, once I've taken the pain to mount the reservoir, hoses, radiators, pumps and all, I might as well get the most out of it...
If I ever get around to wasting all my money on all this extreme hardware, I might as well decide to go all the way, and try to implement this small idea that I've been thinking about lately:
I am considering taking two HDD water-blocks (again, the ones from Koolance look quite nice) and sandwiching a number (2-4) of TECs (Peltier elements) between them. This way it should be possible to move as many watts of heat as the TECs can pump, from the water passing through one block to the water passing through the other. This "device" would then be connected in the following manner: the water returning from the hot PC components is led through the "hot" block, where it is heated further before going through the radiators. The cooled water returning from the radiators is then led through the "cold" block, where it is cooled further, before going through the hot PC components once again. The result should be two-fold: the radiators will be slightly hotter, and therefore more efficient at removing heat from the water, and the water going through the components will be (slightly) cooler than otherwise, and thus likewise remove heat more efficiently. It is, in other words, a bit like a "turbo-charger", but for water-cooling systems.
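One thing worth keeping in mind with this scheme: a TEC's electrical input power also ends up as heat in the loop, so in steady state the radiators must reject the component heat PLUS the TEC input. A minimal energy-balance sketch (all wattages and the COP are made-up illustration values, not specs for any particular Peltier):

```python
# Energy-balance sketch of the TEC "turbo-charger" idea.
# All figures are illustration values, not real Peltier specs.
component_heat = 800.0   # W of heat picked up from the PC components
tec_heat_moved = 120.0   # W pumped from the cold block to the hot block
tec_cop = 1.0            # assumed COP; electrical input = moved / COP
tec_input = tec_heat_moved / tec_cop

# Cold side: water leaving the radiators is cooled a further 120 W worth.
# Hot side: water entering the radiators carries the component heat
# plus the pumped heat plus the TEC's own electrical input.
radiator_load = component_heat + tec_input

print(f"Radiator load without TECs: {component_heat:.0f} W")  # 800 W
print(f"Radiator load with TECs   : {radiator_load:.0f} W")   # 920 W
```

So the "radiators run hotter" half of the argument comes partly from this extra input power: the cold block really does deliver cooler water to the components, but at the price of a larger total heat load on the radiators.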
But I fear I have once again drifted off-topic here...
Anyway, once again thanks for all your input. It is now up to me (and my pockets), to decide if it is worth the money to make this monster a reality or not. Even so, further comments are of course welcome...
A quick addendum to my previous mega-post, after re-reading some of your replies:
With regard to "increased power-bill":
I might be naive, but I am laboring under the impression that modern hardware is quite good at lowering its power consumption, both when completely idle and during "partial loads". Thus I would expect even this heavy rig to have a decent idle consumption, which would then increase gradually as I begin to load the system. Since the system would be so powerful, it follows that it will spend less time being "busy" (be it at full or partial load), and therefore return to its low-consumption state relatively sooner, somewhat "making up for" the higher consumption while it works.
This is not to say that the system won't use more power than a "normal" rig, but it shouldn't be THAT much worse, at least not while doing "office/desktop" tasks, where most of the time is spent idling while waiting for the user to press a key or move the mouse. Unless, of course, I'm overlooking something crucial here?
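For what it's worth, the yearly cost can be sketched in a few lines (every figure below is a guess on my part, including the kWh rate, not a measurement of any actual rig):

```python
# Hypothetical yearly electricity cost for the rig.
# Draw figures, duty cycle and kWh rate are all assumptions.
idle_w, load_w = 350, 1400      # guessed idle vs. full-load wall draw (W)
idle_h, load_h = 6.0, 2.0       # guessed hours per day in each state
rate = 0.15                     # assumed price per kWh

kwh_per_day = (idle_w * idle_h + load_w * load_h) / 1000.0
yearly = kwh_per_day * 365 * rate
print(f"{kwh_per_day:.1f} kWh/day, ~{yearly:.0f} per year")
```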
Anyway, I'm quite prepared to pay the electrical bill. The problem, if any, is in paying all that dough up front...
With regard to "a lot of noise":
Actually, I'm VERY particular about my PC's being ultra-quiet. As you can see above, I plan to use quite a lot of radiators and fans, which is to ensure that all the latter can run at low speeds and be of the "ultra low-noise" type, while still providing plenty of cooling capacity.
This is also one of the reasons why I plan to water-cool as many components as possible, since this is inherently more silent (if done correctly) than air-cooling.
To prove the latter point: my current PC, which was a top-of-the-line gaming PC when it was built some 6+ years ago, has just ONE 120mm fan (the one in the PSU, which is internally temp-regulated but never runs faster than its slowest speed). It makes about 20 dBA (a "soft whisper") of noise (most of it from the somewhat noisy pump), despite running VERY cool - CPU < 55, chipset < 40, GPU < 60 deg. C @ full load and 24 deg. C ambient - and the GPU is only that hot because of a poor water-block.
I must admit that the above is mainly achieved by using a Zalman Reserator 1 (a huge, passively convected radiator/reservoir combo that stands 60cm tall on my table beside the case). While I've been VERY impressed with the performance of the Reserator (and, notably, the lack of maintenance, despite me being a heavy smoker), I must further admit that the problems associated with having a huge "cooling-tower" attached to one's case, by means of two water-filled hoses, turn out to be somewhat of a bother.
So the idea, this time around, is to use a lot of radiators and fans instead, so they will almost work as if they were passive (which is to say: Provide ample cooling even with slow-running low-noise fans). This way I can keep everything inside the case, but hopefully still achieve a relatively quiet machine despite its muscle...
I'm glad to hear you say the ST1500 is the best (large) power supply, as this was also the one I arrived at.
While there is a reason why I might consider a dual-PSU or modded-PSU solution, I would MUCH prefer NOT to void the warranties of my hardware (I must assume everything connected to either of these PSU setups would have its warranty voided, which would include 2 Xeons, a number of GTX 480s and an SR-2 = LOTS of money).
The (single) reason for why I would at all consider such a thing, is that I would LOVE to be able to use the water-cooled 1000W PSU from Koolance (and using TWO of these certainly wouldn't look any less awesome). However, this reason is mainly about looks, while the actual need for, and sense behind, using a water-cooled PSU is somewhat hard to see.
I get the impression that, yes, you CAN use two PSUs in one system, BUT: unless you ONLY use one of them for HDDs, fans and similar equipment that isn't directly connected (electrically/power-wise) to the components powered by the other PSU, you run the risk of serious trouble (including having all your hardware ruined). If I follow the tech-speak correctly, the problem is that the two PSUs would need to be "load-balanced" in order to prevent situations where one or the other PSU is suddenly doing all the work (which would make it shut down in a best-case scenario) and/or where one PSU is trying to feed power into the other (which I guess would be VERY bad?)
Anyway, the ST1500 is certainly a nice PSU, and has ample power for my intended rig. So I guess I will just have to live without that nice looking stainless-steel heat-exchanger (on the Koolance PSU) in my case.