A few days ago, a couple of people in my company (myself included) met to discuss what systems we need, and when, should we experience an emergency. The person who led the meeting made a critical mistake: he thought the conversation would take a logical tack.
There's a Dwight Schrute-ish guy who was included in the meeting, and he kind of took us off course. The Leader* asked us what systems we would need, figuring that we would discuss when we'd need them as we outlined what constituted an emergency. So, when he posed the "What do your departments need" question, Dwight spoke up.
"Well," he said, "first I think we need to assume that if there's a nuclear attack in Washington, then no one's coming here. We'll probably need a site that's geographically far away."
I kid you not. He casually mentioned the possibility of nuclear holocaust. Me? I was thinking about a multi-day power outage a la 2003. Not World War III. I can promise you, emphatically promise you, that if a nuclear bomb is dropped on the nation's capital, I'm not going into work, and I'm really, really not caring about how long it takes us to get our systems back online.
Moments later, Dwight took it down a notch. He moved on from the Baltimore/Washington Metro Area suffering from nuclear fallout to our building crumbling down around us.
"I think it's naive of us," he said, "to assume that, if this building collapses, it'll be while none of us is here. I think we should realize that everyone on the 5th floor is just gone, and we're probably gone too."
We were all slightly horrified. We were talking about the disaster recovery of systems. It should be a cut-and-dried, mechanical kind of thing, not a grisly decision tree: "If X and Y die in a car accident together, what do we need to do to keep the business running?"
Yeah, so, if my part of the country is wiped off the map, never fear, customers o'mine! We've got it covered.