I visited an Equinix data center in San Jose and it was an incredible experience. The redundancies and failsafes they have in place are mind-blowing. No two doors to the same room can be open at the same time, and all doors use handprint readers.
They have racks of big batteries filling a huge room that kick in automatically in case of a power failure. Those last about an hour, which is plenty of time for a 10,000 HP generator on the roof to start up; it can power the data center for about a week.
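For a rough sense of scale, here is a back-of-envelope sketch of what a week of fuel for a 10,000 HP generator implies. The consumption rate is my assumption, not a figure from the facility:

    # Back-of-envelope: what does "a week of fuel" mean for a 10,000 HP generator?
    # The consumption rate is an assumption (~0.05 gal of diesel per HP-hour at
    # full load); real numbers depend on the genset and the actual load.
    HP = 10_000
    KW_PER_HP = 0.746
    GAL_PER_HP_HOUR = 0.05        # assumed full-load diesel consumption
    HOURS_PER_WEEK = 7 * 24

    capacity_kw = HP * KW_PER_HP                            # ~7,500 kW
    fuel_gallons = HP * GAL_PER_HP_HOUR * HOURS_PER_WEEK    # ~84,000 gallons

    print(f"Capacity: ~{capacity_kw:,.0f} kW")
    print(f"Diesel for a week at full load: ~{fuel_gallons:,.0f} gallons")

In other words, "a week of fuel" means a very large on-site tank, which is why the refueling agreements mentioned below matter so much.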
Typically the operators have refueling agreements in place in case an outage outlasts the fuel they keep on premises, so in theory they could run indefinitely on generator power.
An outage that requires generator power for longer than a week may also mean fuel can't be transported at all. I just want to bring that up in case people really did think you can run "indefinitely" on generator power when the roads, bridges, and fuel supply network are out, or during a state of emergency, war, or terrorism.
In one-week-plus territory, it starts to look something like Katrina, where the roads were all blocked by floodwaters and fuel resupply missions required the protection of the National Guard.
I mean, you can charter helicopters to bring in fuel, but in those cases there are more urgent things to power than a datacentre: hospitals and shelters for emergency humanitarian aid, for example.
> At the 1wk+ territory, it will start to look something like Katrina ...
This reminds me of a blog that was live-updated during Katrina by a data center operator: [1]. They managed to keep everything running through the whole storm, IIRC, but it was pretty dicey -- moving barrels of diesel by hand, staying on-site and surviving with water rationing, losing all but one upstream fiber, etc. Fascinating reading in hindsight.
Where I used to work, we had refuelling agreements with multiple vendors. Yes, hospitals and critical infrastructure are prioritized first, but we ran on generators for an extended spell during the Northeast Blackout of 2003[1] (the outage affecting our facility only lasted a couple of days, but because of the utilities' appeal to reduce load on the grid until it was fully functional, we decided to run on generators for over a week).
Edit: One of my fond memories of that time was actually greeting the refuelling truck and directing the driver to the point where he could plug a hose into a pipe on our building and start pumping.
If terrorism is the concern, what stops someone from renting cabinets, bringing in "servers" loaded with explosives, and blowing the place up? Or even a Molotov cocktail thrown at some of those big fibre entrances. Sorting that out, I'd imagine, would take a while.
Sometimes it feels like the opposite. Any day that has the slightest hint of holiday to it is "markets closed! remember no work tomorrow! hooray for national silly lala day!"
These days, I'd hope they can fail over to standby hosts in "IL#" or "VA#" or another datacenter in a nearby region.
The DR/BC space has become pretty robust, and with the right vendor, a datacenter loss can be recovered from simply by powering on fully-replicated VMs in another datacenter.
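As a rough sketch of what that failover amounts to (the site codes, VM names, and functions below are placeholders I made up, not any particular vendor's API):

    # Hypothetical DR failover runbook, boiled down to its essence.
    # Assumes VMs are already replicated to the secondary site; none of
    # these names or functions correspond to a real product's API.
    PRIMARY = "IL1"                    # placeholder site codes
    SECONDARY = "VA3"
    REPLICATED_VMS = ["web-01", "web-02", "db-01"]

    def primary_is_down() -> bool:
        # In reality: health checks, out-of-band confirmation, a human decision.
        return True

    def power_on_replica(vm: str, site: str) -> None:
        print(f"powering on replica of {vm} in {site}")

    def repoint_traffic(site: str) -> None:
        print(f"updating DNS / load balancers to point at {site}")

    if primary_is_down():
        for vm in REPLICATED_VMS:
            power_on_replica(vm, SECONDARY)
        repoint_traffic(SECONDARY)

The hard parts in practice are deciding when the primary is really down and keeping replication lag within your recovery point objective; the "power on the replicas" step is the easy bit.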
Heh - I remember back in '01 or '02 being let into the PAIX DC and left alone for around 90 minutes with a backpack full of hard drives and a toolkit.
It was the cheapest/fastest way for us to get our "big" database from Sydney to the US - hop on a plane with about 9kg of hard drives as carry-on. Never underestimate the bandwidth of a 747 full of 340 Megabyte hard disks ;-)
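For fun, the arithmetic behind that (the drive weight and flight time below are my assumptions; the 9 kg of carry-on and 340 MB per drive are from the story):

    # Back-of-envelope sneakernet bandwidth for the Sydney -> US hop.
    # Drive weight and flight time are assumptions; the 9 kg carry-on
    # and 340 MB per drive come from the anecdote above.
    CARRY_ON_KG = 9
    MB_PER_DRIVE = 340
    KG_PER_DRIVE = 0.5        # assumed weight of one drive
    FLIGHT_HOURS = 14         # assumed Sydney -> US west coast flight time

    drives = int(CARRY_ON_KG / KG_PER_DRIVE)           # ~18 drives
    total_gb = drives * MB_PER_DRIVE / 1000            # ~6 GB
    effective_mbps = total_gb * 8000 / (FLIGHT_HOURS * 3600)

    print(f"~{drives} drives, ~{total_gb:.1f} GB")
    print(f"Effective bandwidth: ~{effective_mbps:.1f} Mbit/s")

Roughly a megabit per second sustained, which back then was more than we could affordably get on a transpacific link.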