I'm not particularly well-versed with computers, so forgive me if I'm asking redundant questions. It has recently come to my attention that my electricity consumption may have skyrocketed because of my new hardware.
Some background information: I'm a small-business owner who runs a server business out of several warehouse properties I own. I bought 20x HP DL140 servers and currently run them 24 hours a day.
It's been a little over 3 months since I began, and the hydro company only sends statements every quarter.
I'd like to know how to go about calculating the operating costs of the servers so I can set aside a portion of revenue for the power bill.
I know the ad states a 650W power supply, but my servers didn't come with power supplies; they came with individual power cables. I took a close look at the cables today and found the ratings printed on them: 10A, 125V. Multiplying those to find watts gives 1250W, which is significantly higher than what the ad claims.
I apologize if this sounds like a very basic question, but can anyone point me in the right direction?
It depends on which generation of DL140 you have and its configuration. According to HP's official power consumption chart, combined with the information from the eBay page, a single server has the following requirements:
Meaning that 20 servers at 100% load require the following:
The calculator chart can be found here if you're interested.
The cord on the server specifies 10A, 125V, which would be 1250W (total wattage?)
Is it possible for just a cable, without an AC brick, to dynamically allocate only as much power as the hardware requires?
The 10A/125V on the cable is the rated capacity of the cable - not how much power will be consumed.
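To make the distinction concrete, here's a tiny Python sketch of the numbers from this thread (the 650W figure is the PSU rating from the ad, not a measured draw):

```python
# The cable's rating is a ceiling set by the cord's gauge, not a measurement
# of consumption. Actual draw is determined by the hardware plugged into it.
cable_rating_w = 10 * 125    # 10 A at 125 V: the most this cord can safely carry
server_max_draw_w = 650      # PSU rating from the ad: worst case for one server

# The cord just needs headroom above the load; the gap is normal.
assert server_max_draw_w <= cable_rating_w
print(f"Cord can carry up to {cable_rating_w} W; "
      f"server draws at most {server_max_draw_w} W")
```

So the 1250W figure is a safety limit for the wire, not something the server will ever pull.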
Let's take the 100% load figure from sk1939: 5720W. Running continuously, that's 5720Wh of electricity consumed each hour with all servers at 100% load, or 137,280Wh per day and 50,107,200Wh per year. Electricity in the US is billed per kilowatt-hour (kWh), and rates vary widely by area (7-17 cents per kWh). At 100% load, your farm would use 50,107.2 kWh of electricity per year. Take that number and multiply it by the cost of a kWh in your area and you'll have a high estimate of the servers' power cost. At the US average of 11 cents per kWh, that works out to about $5511/year, or $459/month.
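If you want to rerun that arithmetic with your own numbers, here's a small Python sketch; the 5720W load and 11-cent rate are just the figures from this thread, so substitute your own:

```python
# Estimate annual and monthly electricity cost for the server farm,
# assuming constant load 24/7. Figures are the ones quoted in this thread.
total_load_w = 5720          # 20 servers at 100% load, in watts
hours_per_year = 24 * 365    # running around the clock

annual_kwh = total_load_w * hours_per_year / 1000   # watt-hours -> kWh
rate_per_kwh = 0.11                                  # US average, $/kWh

annual_cost = annual_kwh * rate_per_kwh
print(f"Annual energy: {annual_kwh:.1f} kWh")
print(f"Annual cost:   ${annual_cost:.2f}")
print(f"Monthly cost:  ${annual_cost / 12:.2f}")
```

Swapping in your provider's actual rate for `rate_per_kwh` gives the localized estimate described below.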
Be aware that the $5511 figure I provided was based on the US average cost per kWh. To get a more accurate estimate for your location, contact your electricity provider for the actual price you will pay per kWh. Multiply that cost by the annual estimate of 50,107.2 kWh and you will have your number. Where in Canada are you located?
Nice! That works out to around $3106/year. Remember that this estimate covers only power for your servers; any other electricity use will add to your bill. I don't have the right expertise to recommend a power meter to monitor/log your consumption, but you might try contacting the bubbas at this site for advice if no one else chimes in on that topic: http://www.powermeterstore.com/?gclid=CPSTnsndzq8CFWYJR...
I've gone ahead and ordered an electricity monitor, so within a few weeks I'll have the full picture. But I feel the information everyone's contributed is a fairly accurate match for what I should expect.
Thanks to everyone for your prompt, insightful responses.
You're welcome...I think you'll find that the farm's actual power consumption will be a little lower than the estimates we gave, since they assumed all 20 servers running at 100% load 24/7 every day of the year, and I doubt that will actually happen. Good luck and God Bless!