How many workstations in a room?

Archived from groups: comp.dcom.lans.ethernet

Hi,
I am looking for a norm specifying the space between computers in a single
room. A room (let's say 16 square meters) is supposed to have the maximum
number of workstations.
Is there any RFC about this issue?

Thank you.
 
Archived from groups: comp.dcom.lans.ethernet

Begin <cr1spk$4k7$1@news.onet.pl>
On 2004-12-30, Lukasz Ponski <lponski@wpk.p.lodz.pl> wrote:
> I am looking for a norm specifying the space between computers in a single
> room. A room (let's say 16 square meters) is supposed to have the maximum
> number of workstations.

Technically, and seeing as this is comp.dcom.lans.ethernet, there
used to be _cable_ related minima and maxima, although nowadays the
minima have gone away. There is no spatial requirement set in the 802.3
standards as far as I'm aware. You will, however, want to keep an eye
on heat and power dissipation caused by the equipment, and make sure
there's enough active cooling to keep the equipment from spontaneously
combusting.


> Is there any RFC about this issue?

RFCs deal with computing issues, and not generally with the human aspect
of labour. What you're looking for is your national norms and standards
institute[1] or, if that's separate, the ministry or department that
sets norms for and/or keeps an eye on working conditions and work-related
hazards.


[1] You appear to be posting from Poland, and a quick google says you
might want to check out http://www.pkn.pl . ICBW, YMMV, &c.

--
j p d (at) d s b (dot) t u d e l f t (dot) n l .
 
Archived from groups: comp.dcom.lans.ethernet

In article <cr1spk$4k7$1@news.onet.pl>,
Lukasz Ponski <lponski@wpk.p.lodz.pl> wrote:
:I am looking for a norm specifying the space between computers in a single
:room. A room (let's say 16 square meters) is supposed to have the maximum
:number of workstations.
:Is there any RFC about this issue?

Are you talking in ethernet terms, or in cooling terms, or in
fire regulation terms, in terms of EMF limits, or in psychological
terms of how closely you can pack people together without them feeling
"crowded" ?

There is no meaningful minimum distance between systems in modern
ethernet (unlike the old 10Base2 days.) I seem to recall that there may
still be a theoretical limit of 1023 stations per segment, but I don't
know if anyone bothers to enforce that, especially considering that if
you are using a modern fully-switched full-duplex topology, there are
only ever at most 2 stations per segment.

The limits to your cooling capacity depend on what you can afford. If
you go to the bother of putting in full raised flooring and a Liebert
unit capable of raising a small tornado, or if you go to the trouble of
water-cooling everything, you can get a lot more units into a small
area than if you have the relatively stagnant air conditions of a
typical bedroom. Manufacturers often give guidance about how much
space to allow around a system, based upon the assumption that the
device will be used in its typical environment (e.g., a home PC in a
convection-cooled home, a business PC in an air-conditioned office
designed to keep -people- comfortable.) The space recommendations for
each model differ. If you have a high-end graphics board or a 3 GHz
CPU, then you are going to need more cooling space than if you have an
older generation of equipment. A CRT usually needs more cooling than a
flat panel display.
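
To put rough numbers on that: essentially every watt the equipment
draws ends up as heat in the room, and the people add their own share.
A quick Python sketch (all the wattage and per-person figures are
illustrative guesses, not vendor data):

WATTS_PER_BTU_HR = 0.293   # 1 BTU/hr is about 0.293 W

def room_heat_btu_per_hr(stations, watts_each, people=0):
    # A seated adult gives off very roughly 350 BTU/hr (an assumption).
    equipment = stations * watts_each / WATTS_PER_BTU_HR
    return equipment + people * 350

# Ten PC-plus-CRT seats (say 300 W each) vs. ten laptop seats (say 50 W):
print(round(room_heat_btu_per_hr(10, 300, people=10)))   # ~13700 BTU/hr
print(round(room_heat_btu_per_hr(10, 50, people=10)))    # ~5200 BTU/hr

One "ton" of air conditioning is 12,000 BTU/hr, so the CRT-equipped
room is already past a ton before you add anything else.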

Fire regulations vary greatly from area to area, and the most directly
applicable fire regulations might prove to be those governing
electrical circuit densities; that is going to depend upon the local
by-laws and upon the kind of wiring used. Household-quality aluminium
wiring isn't going to stand up to as much heat as industry-quality
copper wiring.
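
The circuit-density side is simple arithmetic once you know the breaker
rating. A sketch in the same spirit (the 80% continuous-load derating
is a common rule of thumb and 300 W per station is a guess; your local
code governs):

def stations_per_circuit(breaker_amps, volts, watts_each, derate=0.8):
    usable_watts = breaker_amps * volts * derate
    return int(usable_watts // watts_each)

print(stations_per_circuit(15, 120, 300))   # 4 on a 15 A / 120 V circuit
print(stations_per_circuit(16, 230, 300))   # 9 on a 16 A / 230 V circuit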

EMF regulations... you'd have to look at the EU directives for that.
Canada [my location] has no regulation corresponding to the EU limits
on low-level EMF. The standard that we use at work is that we only
allow authorized/trained people past the 5 gauss line. That limit
isn't because of the effect of magnetic fields on the human body: the
limit is for safety, providing a generous margin to be sure that
nothing the person happens to be carrying is dragged into the magnetic
field. It is my understanding that the EU does have limits that
are grounded in concerns about the possible effects of low-level EMF
on the human body.
--
Can a statement be self-referential without knowing it?
 
Archived from groups: comp.dcom.lans.ethernet

Walter Roberson wrote:

> In article <cr1spk$4k7$1@news.onet.pl>,
> Lukasz Ponski <lponski@wpk.p.lodz.pl> wrote:
> :I am looking for a norm specifying the space between computers in a
> :single room. A room (let's say 16 square meters) is supposed to have the
> :maximum number of workstations.
> :Is there any RFC about this issue?
>
> Are you talking in ethernet terms, or in cooling terms, or in
> fire regulation terms, in terms of EMF limits, or in psychological
> terms of how closely you can pack people together without them feeling
> "crowded" ?
> [snip]

And then there is the question of what constitutes a "workstation". If
you're talking CPUs then you can get a rather large number of blade servers
in a single rack. If you're talking terminal or thin client then it's
really more a matter of how many chairs and tables you can squeeze
together. If you're talking standalone PC then it may depend on what
specific hardware you're using--you should be able to get more workers with
laptops into a room than with dual-processor engineering workstations
with multiple displays, for example.
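
A floor-space-only sketch makes the point; the per-seat footprints
below are guesses for the sake of illustration:

FOOTPRINT_M2 = {                      # desk and chair included
    "laptop": 1.5,
    "thin client": 2.0,
    "engineering workstation": 3.5,   # big tower plus multiple displays
}

def seats(area_m2, profile):
    return int(area_m2 // FOOTPRINT_M2[profile])

for profile in FOOTPRINT_M2:
    print(profile, seats(16, profile))
# laptop 10, thin client 8, engineering workstation 4 in a 16 m^2 room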

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.dcom.lans.ethernet

On Thu, 30 Dec 2004 22:27:06 +0000, Walter Roberson wrote:

> If
> you go to the bother of putting in full raised flooring and a Liebert
> unit capable of raising a small tornado

Does anyone still use raised floor? Sounds pretty retro!
 
Archived from groups: comp.dcom.lans.ethernet

In article <pan.2004.12.31.02.16.27.560937@pobox.com>,
Erik Freitag <erik.freitag@pobox.com> wrote:
:On Thu, 30 Dec 2004 22:27:06 +0000, Walter Roberson wrote:

:> If
:> you go to the bother of putting in full raised flooring and a Liebert
:> unit capable of raising a small tornado

:Does anyone still use raised floor? Sounds pretty retro!

When we took over control of our building, there were three
full raised-floor computer rooms. One of those was decommissioned
not all that long thereafter, and the other two fell into disuse
for years after the big Vaxen went out of service. We didn't keep up
the UPSes in either of them -- battery maintenance costs were above
US$15,000 per year. The air conditioner on one of them was allowed to
die, and the facilities manager is talking about converting that room
into offices.

On the basis of that information, you might be led to think that
controlled environment server rooms were becoming passé, but that
would be a false impression triggered by my selective presentation
of facts. In actuality, the server rooms are becoming *more* important
to us over time.

I mentioned one of the rooms as possibly being converted to offices;
one of our building tenants is very much hoping that that's not true.
The tenant's business as an ISP running server farms is increasing
noticeably, to the point where they cannot get enough air flow to their
existing office. They would love to be able to move their equipment into
the controlled environment room.

The other room (the one with a functioning air handling unit) is
getting more and more densely packed with servers. Yes, the old
Liebert UPS is long gone, but we installed a PowerWare unit a couple
of years ago whose -total- cost was less than what a year of maintenance
on the old Liebert would have been. We put in an 8 kVA unit to handle
our then-measured load of 4.2 kVA, and it has become increasingly
clear that that was a good investment. There's no mainframe or midi
or even "mini" in the room, but we have several self-contained
"mini-rackmount" servers, and we have shelves full of PCs acting
as cluster nodes, and we are in the process of installing another
big set of shelves for our new cluster. The room is getting busier,
and the newer PCs and cluster nodes run hot, so the air handling
is getting a noticeably increased workout. (Which reminds me that
I should investigate to ensure that the air handling unit is going
to continue to be up to the task.) If we didn't have the raised
floors with forced air, then the devices would overheat.
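
For what it's worth, the sizing logic is easy to sketch. Only the 4.2
and 8 kVA figures are ours; the growth rate and the 80% loading ceiling
are assumptions for illustration:

def years_of_headroom(ups_kva, load_kva, growth=0.15, ceiling=0.8):
    # Count full years the load stays under the loading ceiling,
    # compounding at the assumed annual growth rate.
    years = 0
    while load_kva * (1 + growth) <= ups_kva * ceiling:
        load_kva *= 1 + growth
        years += 1
    return years

print(years_of_headroom(8.0, 4.2))   # 3 -- three years at 15%/yr growth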

Our computational capacity is not really very big, but if we didn't
have a real server room, we would be forced into something like a
blade architecture at noticeably increased per-flop expense, just
for heat reasons.
--
I predict that you will not trust this prediction.
 
Archived from groups: comp.dcom.lans.ethernet

Erik Freitag wrote:

> Does anyone still use raised floor? Sounds pretty retro!

Where else are you supposed to keep your beer cool? ;-)
 
Archived from groups: comp.dcom.lans.ethernet

>Erik Freitag wrote:
>> Does anyone still use raised floor? Sounds pretty retro!


I was told that one of the data centers we're building has a
five-foot-deep raised floor.



hsb


"Somehow I imagined this experience would be more rewarding" Calvin
********************************************************************
Due to the volume of email that I receive, I may not be able to
reply to emails sent to my account. Please post a followup instead.
********************************************************************
 
Archived from groups: comp.dcom.lans.ethernet

Hi Lukasz,

the only ideas I have that could limit the number of WS in a room are
(a rough combined sketch follows the list):

1. The space each WS uses
2. Current consumption
3. BTUs (British Thermal Units), i.e., heat
4. A place for the operator, too
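
Taking the four together, the limit for the original 16-square-meter
room is just the smallest of the individual limits. In this sketch
every figure (seat area, circuit size, per-station draw, cooling
capacity) is an illustrative assumption:

def max_workstations(area_m2, m2_per_seat, circuit_watts, watts_each,
                     cooling_watts):
    by_space   = area_m2 // m2_per_seat        # ideas 1 and 4
    by_current = circuit_watts // watts_each   # idea 2
    by_heat    = cooling_watts // watts_each   # idea 3
    return int(min(by_space, by_current, by_heat))

# 16 m^2, ~2.5 m^2 per seated operator, one 16 A / 230 V circuit at 80%,
# 300 W per station, ~3.5 kW (about 12000 BTU/hr) of cooling:
print(max_workstations(16, 2.5, 16 * 230 * 0.8, 300, 3500))   # 6

Here floor space, not power or heat, is the binding constraint.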

Happy New Year to you.


Heinz

Lukasz Ponski wrote:

> Hi,
> I am looking for a norm specifying the space between computers in a single
> room. A room (let's say 16 square meters) is supposed to have the maximum
> number of workstations.
> Is there any RFC about this issue?
>
> Thank you.
>
>
 
Archived from groups: comp.dcom.lans.ethernet

On Fri, 31 Dec 2004 03:22:26 +0000, Walter Roberson wrote:

> In article <pan.2004.12.31.02.16.27.560937@pobox.com>, Erik Freitag
> <erik.freitag@pobox.com> wrote: :On Thu, 30 Dec 2004 22:27:06 +0000,
> Walter Roberson wrote:
>
> :> If
> :> you go to the bother of putting in full raised flooring and a Liebert
> :> unit capable of raising a small tornado
>
> :Does anyone still use raised floor? Sounds pretty retro!
> [snip]

There's a big difference between a controlled environment and a raised
floor. My experience is that raised floor either requires
obsessive-compulsive facilities management (with regard to dust, water
monitoring and cable runs), or quickly devolves into a place to hide
years-old unused cable and equipment. In a short floor (1 foot or less),
you can sometimes pull up a tile and not be able to put it back down
because there's so much junk cable. Older, improperly maintained raised
floor can become a safety hazard because the tile edging comes loose and
is easy to trip over. One facility I worked with after the Loma Prieta
earthquake had to send an operator out on his knees to replace 1/3 of an
acre of popped tile so that the floor was safe to use.

My preference is for telco-style rooms with VCT on the floor, N+1 power
and cooling, earthquake-hardened relay-type racks for easy access to
equipment and cable, overhead cable runs and a "no one enters without a
change control sheet or trouble ticket" policy. This makes the environment
much easier to manage and change and much less subject to accidental or
intentional outages. Sorry if it doesn't please the decorators in the
executive office.
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 02:24:07 +0000, Hansang Bae wrote:

>>Erik Freitag wrote:
>>> Does anyone still use raised floor? Sounds pretty retro!
>
>
> I was told that one of the data centers we're building has 5 feet
> depth raised floor.

I've worked in one facility that had a 3 foot space. Fortunately, they
kept it very clean. After an all-nighter, I sometimes had visions of H.R.
Giger aliens crawling toward me, hanging from the tile frame.
 
Archived from groups: comp.dcom.lans.ethernet

Erik Freitag wrote:

> [snip]
> My preference is for telco-style rooms with VCT on the floor, N+1 power
> and cooling, earthquake-hardened relay-type racks for easy access to
> equipment and cable, overhead cable runs and a "no one enters without a
> change control sheet or trouble ticket" policy. This makes the environment
> much easier to manage and change and much less subject to accidental or
> intentional outages. Sorry if it doesn't please the decorators in the
> executive office.

One place I worked just hung a sign on the door--"Danger, inert-gas flood
fire extinguishing system in use--enter this room without breathing
apparatus and you will surely die." They left off the "if it fires while
you're in there". Cut down on the casual visitors quite a bit, especially
since they had to convince the safety engineers that they had a legitimate
need to be in there in order to get issued breathing apparatus.
Unfortunately the safety engineers decided to "fix" it and made them take
the sign down.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 17:26:41 -0500, J. Clarke wrote:

> Erik Freitag wrote:
>>
>> My preference is for telco-style rooms with VCT on the floor, N+1 power
>> and cooling, earthquake-hardened relay-type racks for easy access to
>> equipment and cable, overhead cable runs and a "no one enters without a
>> change control sheet or trouble ticket" policy. This makes the environment
>> much easier to manage and change and much less subject to accidental or
>> intentional outages. Sorry if it doesn't please the decorators in the
>> executive office.
>
> One place I worked just hung a sign on the door--"Danger, inert-gas flood
> fire extinguishing system in use--enter this room without breathing
> apparatus and you will surely die." They left off the "if it fires while
> you're in there". Cut down on the casual visitors quite a bit, especially
> since they had to convince the safety engineers that they had a legitimate
> need to be in there in order to get issued breathing apparatus.
> Unfortunately the safety engineers decided to "fix" it and made them take
> the sign down.

Great story, but I think the best way to keep out casual visitors is a
lock (preferably a card-key) and a monitored video camera. I think the
regular staff catches on pretty quickly to the "danger - dynamite" signs.
 
Archived from groups: comp.dcom.lans.ethernet

In article <pan.2005.01.01.21.32.39.152391@pobox.com>,
Erik Freitag <erik.freitag@pobox.com> wrote:
:My preference is for telco-style rooms with VCT on the floor,

Would that be Vinyl Composite Tile? When I was searching for the
definition of VCT, I found a page that claimed that VCT is not
suitable for computer rooms as it does not dissipate static anywhere
near quickly enough (i.e., if you manage to accumulate a static charge,
then VCT is a strong enough insulator that you are going to continue
to carry that charge until you discharge it into the nearest equipment.)
The page said that HPL, High Pressure Laminate, had been developed
specifically to have the right dissipation range for access flooring.
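
Access flooring of this kind is usually specified by resistance to
ground. The bands below are the commonly quoted ones; exact limits
vary by standard, so treat this as a sketch:

def esd_class(ohms_to_ground):
    if ohms_to_ground < 1e6:
        return "conductive"
    if ohms_to_ground <= 1e9:
        return "static-dissipative"
    return "insulative"   # the charge stays on you, as described above

print(esd_class(2.5e7))   # static-dissipative: the computer-room target
print(esd_class(1e12))    # insulative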


:N+1 power
:and cooling, earthquake-hardened relay-type racks for easy access to
:equipment and cable,

Not much point in earthquake hardening around here -- we are in
the lowest earthquake hazard zone on the continent. Our greatest
natural hazard risk is flood: the last big one (1997) was apparently
wide enough to qualify as an inland sea -- bigger than any of the
Great Lakes. They are saying this year's might be worse.
--
Are we *there* yet??
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 22:48:28 +0000, Walter Roberson wrote:

> In article <pan.2005.01.01.21.32.39.152391@pobox.com>,
> Erik Freitag <erik.freitag@pobox.com> wrote:
> :My preference is for telco-style rooms with VCT on the floor,
>
> Would that be Vinyl Composite Tile? When I was searching for the
> definition of VCT, I found a page that claimed that VCT is not
> suitable for computer rooms as it does not dissipate static anywhere
> near quickly enough (i.e., if you manage to accumulate a static charge,
> then VCT is a strong enough insulator that you are going to continue
> to carry that charge until you discharge it into the nearest equipment.)
> The page said that HPL, High Pressure Laminate, had been developed
> specifically to have the right dissipation range for access flooring.

No, I meant Vinyl Conductive Tile - excellent anti-static properties. My
apologies for using an ambiguous abbreviation. Many exciting details on
the 3M version at

http://makeashorterlink.com/?I2D13242A


> Not much point in earthquake hardening around here -- we are in
> the lowest earthquake hazard zone on the continent. Our greatest
> natural hazard risk is flood: the last big one (1997) was apparently
> wide enough to qualify as an inland sea -- bigger than any of the
> Great Lakes. They are saying this year's might be worse.

Well, you're lucky you don't have to worry about earthquakes. I also like
my equipment rooms on the highest possible floor. Doesn't work for
MPOEs (minimum points of entry), though.
 
Archived from groups: comp.dcom.lans.ethernet

In article <pan.2005.01.02.01.29.36.2287@pobox.com>,
Erik Freitag <erik.freitag@pobox.com> wrote:
>[snip]
>Well, you're lucky you don't have to worry about earthquakes. I also like
>my equipment rooms on the highest possible floor. Doesn't work for MPOEs,
>though.
>


Data Center .....24th floor... Power Failure.... 20 Minute UPS.....

--

a d y k e s @ p a n i x . c o m

Don't blame me. I voted for Gore.
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 20:33:28 -0500, Al Dykes wrote:

> In article <pan.2005.01.02.01.29.36.2287@pobox.com>,
> Erik Freitag <erik.freitag@pobox.com> wrote:
>>On Sat, 01 Jan 2005 22:48:28 +0000, Walter Roberson wrote:
>>

>>Well, you're lucky you don't have to worry about earthquakes. I also like
>>my equipment rooms on the highest possible floor. Doesn't work for MPOEs,
>>though.
>>
>
>
> Data Center .....24th floor... Power Failure.... 20 Minute UPS.....

OK, OK. My spec for equipment rooms runs about 20 pages - I didn't try to
put the whole thing here. If the application was mission-critical, why
didn't you have a diesel generator (tested 4x a year) and spare water
for the HVAC on the roof?
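
And size the fuel tank, not just the generator. A crude autonomy
sketch with made-up figures (diesel sets burn very roughly 0.3 liters
per kWh delivered):

def autonomy_hours(tank_litres, load_kw, litres_per_kwh=0.3):
    return tank_litres / (load_kw * litres_per_kwh)

print(autonomy_hours(1000, 50))   # ~66 hours on a 1000 L tank at 50 kW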
 
Archived from groups: comp.dcom.lans.ethernet

Erik Freitag wrote:

> On Sat, 01 Jan 2005 20:33:28 -0500, Al Dykes wrote:
>
>> In article <pan.2005.01.02.01.29.36.2287@pobox.com>,
>> Erik Freitag <erik.freitag@pobox.com> wrote:
>>>On Sat, 01 Jan 2005 22:48:28 +0000, Walter Roberson wrote:
>>>
>
>>>Well, you're lucky you don't have to worry about earthquakes. I also like
>>>my equipment rooms on the highest possible floor. Doesn't work for MPOEs,
>>>though.
>>>
>>
>>
>> Data Center .....24th floor... Power Failure.... 20 Minute UPS.....
>
> OK, OK. My spec for equipment rooms runs about 20 pages - I didn't try to
> put the whole thing here. If the application was mission-critical, why
> didn't you have a diesel generator (tested 4x a year) and spare water
> for the HVAC on the roof?

You mean the roof 40 stories above? Not that it does a whole lot of good
when someone flies an airliner into your datacenter . . .

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 22:48:49 -0500, J. Clarke wrote:

> [snip]
>> OK, OK. My spec for equipment rooms runs about 20 pages - I didn't try to
>> put the whole thing here. If the application was mission-critical, why
>> didn't you have a diesel generator (tested 4x a year) and spare water
>> for the HVAC on the roof?
>
> You mean the roof 40 stories above? Not that it does a whole lot of good
> when someone flies an airliner into your datacenter . . .

Yes, that one. You might want to consider moving to an easier location -
maybe an old missile silo or a telco bunker.
 
Archived from groups: comp.dcom.lans.ethernet

On Sat, 01 Jan 2005 22:48:49 -0500, J. Clarke wrote:

>>> Data Center .....24th floor... Power Failure.... 20 Minute UPS.....
>>
>> OK, OK. My spec for equipment rooms runs about 20 pages - I didn't try to
>> put the whole thing here. If the application was mission-critical, why
>> didn't you have a diesel generator (tested 4x a year) and spare water
>> for the HVAC on the roof?
>
> You mean the roof 40 stories above? Not that it does a whole lot of good
> when someone flies an airliner into your datacenter . . .

So without being flippant, this is an interesting problem. When I said
"high as possible" I was thinking Silicon Valley altitudes - 5th or 6th
floor - above the flood line.

Can you tell us anything about your parameters? Sounds like you are in a
building that is 64 stories high (or more) that has been hit by an
airliner. I don't think this could have been one of the WTC buildings. Are
you the only equipment room in the building? Does the building provide any
kind of power/HVAC protection?

Maybe you just meant you had to plan for an airliner hit? That, or a big
earthquake, or a war, or sabotage of some other kind sound like a great
reason to build a business continuity/disaster recovery site. If you have
a DR site, having backup power and cooling may not be a big issue,
assuming you're sure it will work.
 
Archived from groups: comp.dcom.lans.ethernet

Erik Freitag wrote:

> On Sat, 01 Jan 2005 22:48:49 -0500, J. Clarke wrote:
>
> [snip]
>> You mean the roof 40 stories above? Not that it does a whole lot of good
>> when someone flies an airliner into your datacenter . . .
>
> So without being flippant, this is an interesting problem. When I said
> "high as possible" I was thinking Silicon Valley altitudes - 5th or 6th
> floor - above the flood line.
>
> Can you tell us anything about your parameters? Sounds like you are in a
> building that is 64 stories high (or more) that has been hit by an
> airliner. I don't think this could have been one of the WTC buildings. Are
> you the only equipment room in the building? Does the building provide any
> kind of power/HVAC protection?
>
> Maybe you just meant you had to plan for an airliner hit? That, or a big
> earthquake, or a war, or sabotage of some other kind sound like a great
> reason to build a business continuity/disaster recovery site. If you have
> a DR site, having backup power and cooling may not be a big issue,
> assuming you're sure it will work.

Actually, I was pointing out the absurdity of generalizing without knowing
the details. If you're on the 26th floor of a 27-story building, some
things are doable that are not doable when you're on the 26th floor of a
66-story building.

And you can't plan for everything.

--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
 
Archived from groups: comp.dcom.lans.ethernet

On Sun, 02 Jan 2005 06:50:45 -0500, J. Clarke wrote:

> Erik Freitag wrote:
>
>
> And you can't plan for everything.

You're so right, but try telling that to some of the banks I've worked
with. We can always invent a disaster we can't cope with, so you try to
plan around those you can cope with. My feeling is that there are some
events so damaging that you don't care if the network or the business
survives.
 
