Considering Data Centre Co-Location and looking for systems:

Greetings all,

I am considering data centre co-location and looking for systems to house in said data centre.

Cost: Total Cost of Ownership (TCO) is a factor
Performance: Performance is, obviously, a factor
Value: The value delivered for the above two will be a factor
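To make the TCO factor concrete, here is a rough sketch of how I'm weighing it (all figures are hypothetical placeholders, not vendor quotes):

```python
# Rough TCO sketch: up-front hardware cost plus recurring co-location rent.
# All figures are hypothetical placeholders, not quotes from any vendor.

def total_cost_of_ownership(hardware_aud, monthly_colo_aud, months):
    """Up-front hardware cost plus co-location rent over the period, in AU$."""
    return hardware_aud + monthly_colo_aud * months

# e.g. AU$5,000 of hardware and AU$400/month rack rent over 3 years:
print(total_cost_of_ownership(5000, 400, 36))  # 19400
```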

OS: Linux & Windows will be used
VM: Virtual Servers will be used.
- VMware (and/or similar products) is likely to be used.
- Microsoft Virtual Server is likely to be used.

Lights-out Management is likely going to be a requirement (if you don't know what this is, think twice before posting; however, I do appreciate any useful information). Add-in Lights-out Management cards may prove useful, though they might not be required.

Building my own is an option; I am looking at Tyan, Iwill, SuperMicro, etc.

Purchasing systems outright (+ support, including 1 x OS) is also an option; I am looking at HP, Dell, etc.

I am looking at 4 RU of space, in Australia, so am considering 3 x 1 RU 'blade'-style servers with 1 x 1 RU of switching, etc.

I am after any reasonable configuration.

Seeking in depth technical details on options available. (eg: 8+ way, 4 RU, server rack, 2 x 2 RU, etc)

Per 4 RU only 1 public / internet IP address might be visible. (Can likely arrange more, but please just consider it).

Also seeking licensing information on running multiple Microsoft Virtual Servers on a given machine. (And no, I don't want 'use pure Linux' replies; that is simply not an option, as anyone with Linux experience should know.)

The machines will be used for various purposes, not just 'game server hosting'. Web-Hosting, FTP, etc are also likely uses.

Now, let the 'options', including details such as cost (in Australian Dollars, from Australian sources only), performance, etc., roll in.

Note: I will just be clicking [Ignore] on any BS replies.
  1. Bump, Day 1.

    Come on, surely at least one person on these forums actually does half the stuff they claim to do... surely. :P
  2. Quote:
    Come on, surely at least one person on these forums actually does half the stuff they claim to do... surely.

    I wouldn't count on it!!!
  3. LOL.

    They ain't all that bad... ;)
  4. I'm a systems/network admin.

    I understand you are factoring in:

    -4 Total units of rack space
    -1 unit of switching
    -1 public IP address available
    -Remote access card (lights out management)
    -Being used for games and probably other applications

    What I'd like to know is:
    -How much are you willing to spend on the project?
    -How much downtime are you willing to deal with per system failure?
    -What additional infrastructure is available? (if one public IP address is available is the data-center going to NAT for you?)
    -Why do you feel you need a lights-out management solution?

    Based on the information you've provided, I would suggest looking at the offerings from the major server manufacturers (Dell, HP, IBM, etc). As much as I love building machines, the turnaround for repairs can take forever. Just recently I had to replace some RAM in a system; sure, I could have ponied up the $100 and bought a new stick, but I went through the manufacturer instead, and it took no less than a month to get the replacement RAM. In the server world this is unacceptable.

    All of the major manufacturers offer some type of remote access (LOM) card. I'm only familiar with Dell's: their DRAC, used with OpenManage, can give you a complete picture of what the server is doing. On the Windows platform it's easy enough to use Terminal Services instead of a DRAC plus additional software on the machine. If your infrastructure permits, secure port 3389 behind the firewall and set up a VPN account to get into your server(s).
  5. - AU$5,000 or so on hardware

    - 20 min downtime a day would be acceptable, but ideally much less. ;)

    - Quantity of IP address(es) is unlikely to be an issue, as applications can be run on various different ports.

    - The one unit of routing is only 'one of many' possibilities. Although it could serve as an additional level of protection. (NAT/PAT wise, if you get my gist).

    - Lights-out Management via a 2nd (less public / private) IP address if possible, in case the machine is having 'boot-up' issues and I need BIOS access, etc.

    - Even if I had DRAC (for example) I'd still most likely use RealVNC, Terminal Services, Remote Desktop, etc in addition to Lights out Management. :) (There are times when you really wish you'd used a form of LoM).

    - I'd only really look at Dell if I was after cheap Xeons, etc. I am fond of Hewlett-Packard / Compaq laptops and servers... just not their desktops. (More of a DIY person when it comes to desktops anyway.)
  6. Quote:

    I highly recommend HP over Dell. (I've had some good server-level experience with them, including customer support.)

    As for the rest, that's beyond my knowledge.

    Aside from our HP-UX servers, we run Dell boxes exclusively. I must say that over the years I've been pretty satisfied with Dell's level of support. I have never encountered a situation where I really needed Tech Support, and I've never encountered a catastrophic systems failure on any of our Dell servers. We even have a couple of legacy boxes circa 1996 that are still running.

    I've always received parts in a timely manner. If I have a system under warranty that needs hardware replaced (usually disks)--I've had them the next business day...

    I will admit however; that I have been screwed by Dell more than once. We bought into their 1655MC Blade Servers and within a year of doing so they were discontinued and we were unable to fill our Chassis.

    Most recently Dell has *attempted* to dick us around regarding system pricing. We met with our Regional Rep and she had stated that Dell was giving away FREE PRINTERS with substantial orders. We put in for a quote for XX desktop systems + FREE PRINTERS. They discounted our desktops $300 each and added in the cost of the printers. The downside to this is that on our normal orders we generally receive a $300 discount to begin with. :twisted: (so why would I want to buy a Dell printer?)--I'll take the discount instead.
  7. $3713.62 American.

    -Rather than go with DRAC/LOM you may want to look at the cheap KVM over IP solutions. I've seen a small add-on device that is $400 USD that will give you bios-level access to the machine. I've never played with DRAC/LOM as I have no need for it. I do not know if it allows you to hard-reboot a system...but it has been my experience more often than not that when a server hangs I just walk down to the datacenter and reset it. (so the KVM over IP may not be a viable solution over LOM if LOM allows reset function).

    For $3713 that puts you at an entry level Dell (perhaps an HP/IBM guy can make a recommendation for those vendors).

    The 2850 is a pretty nice box, but the features you are getting for the base price of $3479 aren't much:
    -Xeon 2.8GHz / 2MB cache / 800MHz FSB
    -512MB RAM
    -PCI Express riser
    -Embedded RAID controller (PERC4, PCI-X)
    -Remote access card included (subtract $231 to remove this option)
    -CD-ROM
    -2 x 73GB hard disks (10K RPM)

    These machines are capable of dual procs and 12G of memory...16G if you want to enable redundant memory feature...or if you have $7478 to drop on that.

    We've currently got 2 of these running a SQL 2005 cluster on 64bit.

    You could also come close to picking up a couple of PowerEdge 850's:

    -Celeron 2.53GHz / 256K cache / 533MHz FSB
    -1GB RAM
    -PCI Express riser
    -2 x 80GB SATA disks (7200RPM)
    -No OS

    Shitty thing about this is no raid...unless you went software and I'm not a big fan of that.
  8. For the mix of stuff I am doing, especially over multiple virtual machines, etc, the aggregate memory performance of the Opteron really comes into play.

    For the same dollar I could get roughly +25% (approx., maybe more) performance on an equally priced AMD Opteron vs. Intel Xeon system.
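    A quick sketch of that value comparison (the normalised scores are illustrative only, not benchmark data for any particular Opteron or Xeon SKU):

```python
# Performance-per-dollar comparison; scores are illustrative only,
# not measured results for any particular Opteron or Xeon system.

def perf_per_dollar(relative_perf, price_aud):
    """Normalised performance score divided by system price in AU$."""
    return relative_perf / price_aud

xeon_value = perf_per_dollar(1.00, 5000)     # baseline Xeon system
opteron_value = perf_per_dollar(1.25, 5000)  # ~+25% performance, same price
print(round(opteron_value / xeon_value, 2))  # 1.25
```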

    However, I am likely to wait for the Xeon 5100 series 'Woodcrest' core and AMD Opteron 'K8L' core before playing my hand, but am still looking at all my options. (There are just too many to filter and pick; by the time I select something it could be out of date :P - or not... I tend to keep at least a few months ahead on hardware.)
  9. Where I work we are getting Dell 2850's... I don't like them, and for the cost I dunno if they are worth it. (Maybe Dell in America are heaps better than Dell in Australia,... I just don't trust them). (Where I work we have heaps of them, and yeah they are 'OK', but I wouldn't call them fantastic).

    Also, the performance of the Xeon 2800 isn't going to cut it for many of the things I am looking at above. (Imagine running multiple virtual machines with fairly moderate to high load, all sharing a 6.4 GB/sec FSB :P - they just won't scale well for my requirements - see above for the general idea.)
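    For reference, the 6.4 GB/sec figure is just the bus arithmetic (a sketch; the four-VM split below is an assumed example, and real contention is worse than an even split):

```python
# Peak FSB bandwidth: transfer rate (MT/s) x bus width (bytes).
# An 800 MT/s FSB with a 64-bit (8-byte) data bus peaks at 6.4 GB/s,
# and every core and VM on the box contends for that single bus.

def fsb_peak_gbs(mega_transfers_per_sec, bus_width_bytes=8):
    """Peak front-side-bus bandwidth in GB/s."""
    return mega_transfers_per_sec * bus_width_bytes / 1000

peak = fsb_peak_gbs(800)
print(peak)      # 6.4
print(peak / 4)  # 1.6 -- naive even split across four busy VMs
```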

    The 'oldest' Xeon I am willing to touch is 'Sossaman' (Core Duo based); the NetBurst-based ones, and their associated chipsets, do not interest me in any way. (They wouldn't cope with the load.)

    If Dell come out with a kick ass Xeon 'Woodcrest' system, then yeah, I'll at least look at it.

    AU$5000 should get me a system with at least 4 GB of Registered ECC PC3200 DDR SDRAM (128x4 layout, 4 x 1024 MB DIMMs) down here already. (See my current sig for my 'home' workstation - my 'home' PC can house 32 GB RAM using 4 GB DIMMs, not 'officially', but it can :P - officially it only supports up to 16 GB :D )
  10. I don't fault you. :D

    Truth is I haven't dealt with anything other than Dell here at work. They work well for what we do...maybe not so much for what you do.

    I haven't kept up much with the trends in the industry...until it comes time to build a new home PC...then I do my research. Worrying about how much better one processor is from the next doesn't impact me much work-wise...though I am a big fan of AMD and I'd love to see some AMD boxes here.
  11. duplicate posting. I apologize.