The button that isn’t (nuclearsecrecy.com)
120 points by danso on Jan 20, 2015 | 62 comments



The Soviet setup for launch control is similar to the US one: decentralised, with manual actions required all over the place to make it happen.

Visited a Soviet silo just outside of Pervomais'k, Ukraine (thoroughly recommend a visit if you find yourself in that neck of the woods; you can sit atop a Satan missile and wave your cowboy hat until a burly Ukrainian tells you to get off before he shoots you) - it's a two-key system, with two buttons, on opposite sides of, well, a really tiny room. While one person could reach both buttons and keys simultaneously, the idea was that the keys would be carried by separate people - however, in practice they both hung from a peg on the back of the chair, meaning that someone could theoretically have initiated a launch singlehandedly. Theoretically, because the other fun fact revealed was that most of the time they were nothing like launch ready, and perennially struggled with equipment failure.

Oh, and don't forget that the super-secret launch code for Minuteman missiles across the US was... uh, 00000000, for several decades.


>Oh, and don't forget that the super-secret launch code for Minuteman missiles across the US was... uh, 00000000, for several decades.

I believe this was the combination to the locks on the launch bunker access doors, not the actual launch codes.

Either way-- laughable security for the military's crown jewels.


> Either way-- laughable security for the military's crown jewels.

The military would disagree. The all-zeros combination was chosen on purpose because it was felt that having a code at all was a reduction in security, since the code-revelation procedure became a single point of failure. These aspects of the nuclear security system were designed by politicians, and weren't exactly the best of ideas, so the military did what they needed to do to comply with the law while keeping the nuclear arsenal secure and available.


>These aspects of the nuclear security system were designed by politicians,

This is false. I suggest reading Command and Control; it is illuminating and terrifying.

The setup you described evolved from a tug of war between competing bureaucratic interests combined with ineffective oversight. The various military organizations fought to keep control and funding of prestigious nuclear weapons; the political arm dealt with the paradox inherent in having a system that was ready to be deployed at a moment's notice, but impossible to detonate accidentally.

The compromise that emerged was these broken systems.

To quote from Command and Control:

>After the accident at Thule, the Pentagon had ordered SAC [Strategic Air Command] to impose some form of use control. Instead of relying on PALs, during the early 1970s the Air Force put a coded switch in the cockpit of every bomber that carried nuclear weapons. The switch permitted an arming signal to be sent to the bomb bay when the right code was entered. The lock had been placed on the bomber, not inside the bombs—and a stolen weapon could still be detonated with a simple DC signal. SAC was far more worried about its weapons being rendered inoperable during wartime than about someone stealing them or using them without proper authorization. During the late 1970s, a coded switch was finally placed in the control center of every SAC ballistic missile. It unlocked the missile, not the warhead. And as a final act of defiance, SAC demonstrated the importance of code management to the usefulness of any coded switch. The combination necessary to launch the missiles was the same at every Minuteman site: 00000000.


Indeed, the military felt strongly that it was reasonable to increase the chance of causing Armageddon by accident, in order to hold sacrosanct their ability to cause Armageddon on purpose, and resented attempts to reverse that equation.

It's weird how you say that like it was wise instead of horrifically insane, though.


They generally had great trust in their people, and didn't think the risk of someone deliberately disobeying orders and using a nuclear weapon was high. Meanwhile, they believed that the Soviets were looking for an opportunity to strike, and that the best and only realistic way to prevent a Soviet strike was to present them with certain devastating retaliation. They never planned to cause Armageddon, only have the capability to respond to the other guy causing it in such a way that they'd never want to cause it.

The scary thing about the Cold War was that the participants generally had the best of intentions (aside from a few people like Curtis LeMay, and they never really got presented with the opportunity to express their evil side) but disaster could have easily happened anyway, purely by accident.

And I really shouldn't put this in the past tense. All of this stuff is still there. We all seem to have just decided to collectively pretend that MAD went away when the USSR dissolved, even though there are still thousands of missiles and nuclear warheads ready to wreck civilization at a moment's notice.


> And I really shouldn't put this in the past tense. All of this stuff is still there. We all seem to have just decided to collectively pretend that MAD went away when the USSR dissolved, even though there are still thousands of missiles and nuclear warheads ready to wreck civilization at a moment's notice.

Fully agreed. There seems to be this weird mode of thinking for most people: 'I choose to ignore the bad stuff so that I am happier.' But the end result here might be exactly the opposite, the worst case.

For example, it is scary how lightheartedly people talk about 'war with Russia' in the context of the current Ukraine crisis. Regardless of what political side one is on, avoiding direct confrontation between nuclear powers should be the main concern.

Rather than all the talk and fuss and whatnot about more-or-less meaningful social justice causes, I would much rather see a movement to put nuclear weapons into 'cold storage', along with a verified international protocol to continuously check that they stay there.

Given that full disarmament is unrealistic, this would at least greatly reduce the risk.


> They never planned to cause Armageddon, only have the capability to respond to the other guy causing it in such a way that they'd never want to cause it.

In that case, they shouldn't have been fussed about an extremely small chance of failure. The Soviets would never dare bank on such a chance, so it wouldn't change the effectiveness of the deterrent.

The only reason to act as they did is that they were worried about not being able to get revenge if the Soviets launched first.

I vaguely recall a short story in which the Soviets launch their missiles, for whatever reason, and the U.S. President refuses to give the launch order. MAD has failed and America is doomed anyway; there is no value in adding hundreds of millions to the death toll simply to follow through on a threat.


> They never planned to cause Armageddon, only have the capability to respond to the other guy causing it in such a way that they'd never want to cause it.

This was one school of thought within the command and control structure, but it was, from my understanding, both a minority opinion and one held more by political-oversight members than by the actual generals holding the keys.

Leslie Groves specifically felt that a nuclear exchange was inevitable, and that the US was in a position to win.


It always seemed to me that the military, specifically the Strategic Air Command, appreciated that security and ease of use are trade-offs, and consistently chose solutions very far to the "ease of use" end of that spectrum. SAC was considerably more worried that bomb security measures would inhibit swift retaliation than that poor security would allow an unauthorized detonation.

More civilized individuals might call that insane as opposed to laughable.


According to this article, the all-zeroes code was on the Permissive Action Links, which are the locks on the bombs themselves:

http://web.archive.org/web/20120511191600/http://www.cdi.org...


You are indeed correct, I'll leave my comment to stand as a testament to my wrongness.

The bunker doors were "fixed" with metal plates after it was shown they could be easily shimmed open with a credit card.


It's understandable, there are some things the mind simply doesn't want to believe, and an all-zeroes password on a nuclear bomb easily qualifies.

Do you happen to have a link that discusses the bunker doors? I don't think I'd heard of that one.


This is all discussed in "Command and Control" by Eric Schlosser. I'm not sure if that specific story is available for free online somewhere, but it's hardly an isolated incident of terrible security within the nuclear arms community. The book bounces between two narratives: the first is an in-depth look at a specific accident where a rocket exploded in its silo; the second is a broader overview of the command and control structure around nuclear tech, spanning from the plutonium spheres used by scientists in the Manhattan Project all the way up to '80s-era MAD policy and ICBMs on 24/7 alert.

Another one of my favorites from that book was the EOD tech who put a ready-to-rumble Teller-Ulam thermonuclear device into the bed of his pickup truck, drove it off the base, dismantled it in order to show off to a girl, then reassembled and returned it. Although, I can't personally verify the source of that story.


Ever since reading Snow Crash, I've wanted my own personal nuclear umbrella. Not enough to actually do anything about it, of course. It just seems like the ultimate in "an armed society is a polite society."


Thanks. I keep seeing that book mentioned and I'll have to check it out. The story of the guy trying to impress a girl sounds worth it all by itself.


Another interesting bit: "Perl and Nuclear Weapons Don't Mix"

   If fired, they would have missed their destination by thousands of miles.
http://www.foo.be/docs/tpj/issues/vol2_1/tpj0201-0004.html

Note: a maths user error, not a "Perl sucks" rant.


Reminds me of a large particle physics experiment I worked on. One of the detectors had a critical flaw: IIRC, the output voltage of the detector modules was proportional to the number of particles passing through them. At every readout, it was reset to 0. Now, if you didn't read out the modules often enough (every few seconds), they would fry themselves. The problem wasn't noticed until the readout system crashed and a couple of modules were destroyed.

When the bug was presented in a meeting, there was one guy who insisted that we install a big red button to shut down the system. There was even discussion on whether it should have a little lid or not. When that guy left the meeting early, they briefly considered installing a fake button.

In the end, what happened is that they built a "heartbeat" system that just zeroes out the modules every second or so. The information in the modules is lost, but since there are millions of collisions each second, this had virtually no impact on data-taking. But it prevented damage when the readout system crashed again.
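
For illustration, a minimal sketch of that kind of heartbeat loop (Python; the module API, reset interval, and all names are assumptions, not the experiment's actual code):

    import threading
    import time

    RESET_INTERVAL = 1.0  # seconds; an assumed margin, well under the "every few seconds" limit

    class DetectorModule:
        """Stand-in for a real module whose output keeps growing until it is zeroed."""
        def reset(self):
            pass  # real hardware would clear the accumulated charge here

    def heartbeat(modules, stop_event):
        # Runs independently of the readout chain, so a readout crash can no longer fry the modules.
        while not stop_event.is_set():
            for m in modules:
                m.reset()
            stop_event.wait(RESET_INTERVAL)

    if __name__ == "__main__":
        modules = [DetectorModule() for _ in range(8)]
        stop = threading.Event()
        threading.Thread(target=heartbeat, args=(modules, stop), daemon=True).start()
        time.sleep(5)   # stand-in for the readout system doing (or crashing at) its job
        stop.set()

The point being that the reset runs on its own clock rather than being coupled to the readout.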

They also made a little GUI in the control room showing a beating heart, probably just to calm people down :-).


Interesting. I've observed at a number of telescopes and they all have a big red STOP button that will shut down the telescope immediately. I've only had to use it once. The telescope was poorly designed and it was possible to run it into its hard limit (you could basically run it into the ground). If you do that you have to stop it because the tracking motors will continue to try to drive the telescope into the ground.

The only exception was the Large Binocular Telescope, which is one of the largest telescopes around (even the largest, depending on how you define it). But there a lot more thought was put into the design so you wouldn't be able to run the telescope into the ground even if you tried. (It also helps that they have a dedicated night assistant whose job it is to point the telescope and make sure that the observers don't do anything stupid.)


I am reminded of the opening scene in WarGames, when the two officers both need to perform a specific series of actions together to launch the missile. It gave an excellent perspective not only on what was probably the actual process, but also on the ethical dilemma faced by one of the officers.

And then there is the heretical "G.I. Joe: Retaliation", where all the world leaders are in one room with their briefcases containing big red buttons.


Funny you mention WarGames. That scene was probably the only realistic scene in the movie. They got that suspiciously good. The one before it, where he arrives at the LCF and it's a disguised farmhouse, was idiotic.


In England we have defense facilities disguised as farmhouses. We don't have land-based missiles but this [0] is/was a government bunker from which launch orders could be given. I wondered if you meant idiotic in the sense of unrealistic for America, or as a bad idea tactically?

[0] http://www.secretnuclearbunker.com/


Sorry, I meant the US ones are nothing like that. It was totally inaccurate. In the USA, all the farmers/ranchers know where the stuff is.

"Excuse me, we're looking for Sierra 7." "Yeah, it's down this road for a mile, then left on 127."

And the Soviets wouldn't have been fooled either. So they look like military installations.


It's not about hiding the facilities from locals, it's about hiding them from satellites. Or at least providing enough uncertainty that the enemy doesn't know for certain how many or which installations are where.

More morbidly, in a first-strike scenario you want the enemy to waste nukes on actual farmhouses it couldn't tell for certain were launch facilities or not. Then even if none of your facilities are missed (though of course you'd be hoping some were), that's still fewer nukes to be dropped on cities and conventional military installations.


Whatever the reasoning, the USAF didn't do the farmhouse thing. We did the big fence, nasty signs, and heavily armed guards.


There is no point dropping a nuke on a single tactical target. Nukes have a blast range of miles, and a regular missile can take out a facility just fine.


That may be true in some but certainly not all cases.

http://www.nytimes.com/2015/01/15/us/roswell-new-mexico-miss...


That's a decommissioned silo. When it was operational, it would have had the fence and the usual such around it. The Northern Tier facilities are mostly underground, too. The actual silos (Launch Facilities) look like basically nothing from the top, were it not for the fence and nasty signage stating "Use of deadly force authorized". Also, the Southern Tier silos had the launch control facilities integrated with them.


Why is the farmhouse disguise idiotic? I'm pretty sure that sort of thing actually happened -- to disguise the location from satellite surveillance, of course.


Not in the USA, where the movie takes place. Funny enough, I was doing that for a living when the movie came out. I still enjoyed it, although, like the rest of Hollywood's masterpieces, it wasn't remotely realistic.


> suspiciously good

They just copied some footage helpfully provided by the Air Force. (7:50) There must have been a "director's cut" of the same, because we see it in "The Day After" where they're turning the keys.

https://www.youtube.com/watch?v=jlPEBROvR9w


Another misconception is that it's impossible to launch nukes without the nuclear launch codes the U.S. president carries with him.

The president's emergency satchel (the nuclear football) and the Gold Codes (a plastic card called the biscuit) contain codes to identify the president and __authorize__ a nuclear attack. They are codes for authorization; they don't contain the codes that are needed to execute the attack. There is also a two-man rule in place: the Secretary of Defense must confirm the authorization.
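
Purely as a toy illustration of that distinction (every name, code, and check below is invented, and nothing here reflects the real procedure): identification, authorization, and the two-man confirmation are separate from anything that actually executes a launch.

    from dataclasses import dataclass

    # Toy sketch only -- all codes and checks are made up for illustration.
    # The point: the football/biscuit material identifies and authorizes;
    # execution happens further down the chain with entirely separate machinery.

    @dataclass
    class LaunchOrder:
        identity_code: str       # authenticates the order-giver (the "biscuit" role)
        secdef_confirmed: bool   # the two-man confirmation described above

    VALID_IDENTITY_CODE = "EXAMPLE-0000"  # invented placeholder

    def authorized(order: LaunchOrder) -> bool:
        """Authorization: authenticate the president and require the second confirmation."""
        return order.identity_code == VALID_IDENTITY_CODE and order.secdef_confirmed

    def handle(order: LaunchOrder) -> str:
        if not authorized(order):
            return "rejected"
        # Nothing here can execute anything; the order is only passed along.
        return "passed to the operational chain of command"

    print(handle(LaunchOrder("EXAMPLE-0000", secdef_confirmed=True)))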


If you're into this kind of thing, then I cannot recommend the group blog, "Arms Control Wonk" enough!

http://armscontrolwonk.com/

It contains some seriously in-depth analysis of both current and historic nuclear weapons development, with a strong bent toward the diplomatic and political side of things.


The linked blog usually has great in-depth content, but I think this one could be TLDR'ed as "No, for obvious reasons, there isn't actually a big red button that launches a mass nuclear strike. There's a massive infrastructure devoted to making sure they can be launched when they're needed, but won't ever be launched accidentally."

We could at least make an attempt to get into how making a strike would require planning exactly what targets to attack, what weapons to attack them with, and exactly when to launch which weapons, all depending on the political situation and what we're actually trying to accomplish. The idea of a big button on the President's desk to launch a strike is then pretty absurd.


Note that until 1962, there was a single nuclear attack plan that was to be followed in the event of all-out war, with no variations available. That meant that, for example, China would get a full nuclear strike no matter what the political situation was at the time, and if China had declared neutrality, the only options were to bomb the crap out of them regardless, or come up with a completely new plan on the fly, which was just about impossible.

Since 1962, the plans had built-in variations, so that e.g. it would be possible to call for an all-out strike with a hold on China, and all the participants knew what to do in that case.

But still, all the planning was done in advance. US nuclear war planning assumed that we'd be retaliating for a strike on us, and thus time was of the essence. Strategic Air Command had a goal of getting all of their strike bombers airborne within 15 minutes of a warning, and they could do it. There was, of course, no actual big red button, but conceptually it was pretty close. Ordering an attack would have been a matter of authenticating the order-giver, and saying, launch the pre-planned attack.

Edit: I should probably link to Wikipedia's treatment of said plan: http://en.wikipedia.org/wiki/Single_Integrated_Operational_P...


Thanks, that's actually some interesting info. I was under the impression that the SIOP was a list of pre-planned strike patterns/plans, and whoever had the authority to order a strike would select one based on the political circumstances. Is that not the case?

It's also a little surprising, though believable, that there was only one dedicated emergency attack plan up to 1962.


Wikipedia says that the post-1962 version had five options:

1. Soviet nuclear missile sites, bomber airfields, and submarine tenders.

2. Other military sites away from cities, such as air defenses.

3. Military sites near cities.

4. Command-and-control centers.

5. Full-scale "spasm" attack.

So, sounds more or less as you describe. The first one apparently didn't have anything like that.


That tl;dr is sort of wrong though - it's better tl;dr-ed as:

tl;dr - an excuse to talk about the history of "button as a metaphor"


I thought there was some Russian commander who avoided a direct command to launch missiles, and it turned out the command to launch was a mistake made by some other Russian entity relying on faulty data. Seems as close to a "button" scenario as one could've had.

edit - point being that, at least in this case, in 1983, it contradicts the claim in the article that these "systems aren't centralized", and "requires a lot more activity, spread out across a vast geographical area". I'm sure it's better in current times.

*edit - oh, I thought he had the power to launch himself - I guess it changes slightly if his job was only "to report" up the chain of command.


There was also the B-59 incident during the Cuban Missile Crisis[1], when a Soviet sub -- its officers exhausted, out of contact with Moscow, and uncertain if war had broken out -- almost launched a nuclear torpedo against US forces.

XO Vasily Arkhipov, who had also been XO on the infamous K-19, was the only officer to oppose launching the sub's nuclear payload, and since launch required positive agreement from the Captain, XO and political officer, Arkhipov literally stopped World War Three by himself.

[1](http://www2.gwu.edu/~nsarchiv/NSAEBB/NSAEBB75/)


I wonder, are there any stories where the "Political Officer" is actually the hero rather than the villain?


Well, I'm sure the Soviets printed quite a few. :) Actually, the role of the Soviet political officer (zampolit) was by then subordinated to the commanding officer -- political officers were expected to function as field officers (pilots, artillery officers, and so on), in addition to political education, morale-raising, and individual counseling. They also had specific authority over Communist Party members within units, who basically operated something like senior NCOs. In practice, I think, the zampolit system operated relatively smoothly -- the Red Army was never able to get rid of the political officers (Zhukov tried, and even he couldn't do it!), but they were able to effectively defang the position by being orthodox Communists themselves, and by insisting that political education not compromise practical competency.


>not compromise practical competency.

There's a quote from The Last Sentry[1] regarding this:

"On major combatant vessels, the political officer was third in command, following the captain and his starpom (short for stariy pomoshnik, or senior assistant), who was the equivalent of an executive officer in the U.S. Navy. The zampolit was required to qualify as an underway watch officer like any other officer on the ship and so had to have some operational competence."

I'm pretty sure that training and working with the regular crew would weaken the dedication to "The Party" over time.

[1]: https://books.google.com.au/books?id=pThlAgAAQBAJ&lpg=PT15&o...


IIRC, in the past there were multiple occasions on which a "button" being pressed in error might have started nuclear war. But I think the guy you are talking about is Stanislav Petrov[1], who is known for preventing nuclear war in 1983[2].

[1]: http://en.wikipedia.org/wiki/Stanislav_Petrov [2]: http://en.wikipedia.org/wiki/1983_Soviet_nuclear_false_alarm...


It's not entirely clear that Petrov prevented a nuclear war. His job (which he chose not to do) was to report detected missiles up the chain of command. In theory, Soviet Union policy was to require multiple-source warnings before retaliating, in which case they still would not have launched anything. In practice, they very well might have neglected to follow this procedure and proceeded to retaliate.


I remember an interview (will try to find a source) where they told an anecdotal story about Petrov.

Some years after the incident, Petrov said he believed that the system at his station was showing a false alarm and wanted to wait for confirmation from other stations. He is quoted as not wanting to poke the beehive.

After being shown the information available at the other stations (most if not all were seeing the same false alarm), he told the interviewer that if he had had this "information" at the time of the incident, he'd probably have decided otherwise.


That's the Petrov story, which has legs, but doesn't seem very important to me. Essentially, he was using some new equipment and saw 5 incoming missiles. He decided against acting upon that information as he saw it, rightfully, as a bug. He may have gotten into some trouble with his superiors, but I imagine that has a lot more to do with the face-saving bureaucracy of the USSR than a readiness to launch with such limited information.

There must be a mountain of research into false positives for launches, and more false positives than we know. After the fall of the USSR we found out that this happened practically regularly. I believe the Soviets saw the Able Archer exercises in 1983 as a potential attack and even detected missile launches. We also found out that Castro was ready to start nuclear war until his Soviet masters explained to him what that really meant.

This is also why nuclear launch approval tends to be from the head of state only, instead of an automated process or via lower level delegates.


Did anyone ever actually believe that there was a fire-all-ze-missiles button? Even without specific knowledge, a reasonable person must realize that such things have to consist of complicated logistics and multiple levels of redundant failsafes both human and machine.


>Did anyone ever actually believe that there was a fire-all-ze-missiles button?

Well, a lot of us believed that there were only two possibilities for nuclear war: either no missiles got fired, or they all did -- either in an escalating pattern of strike and counterstrike, or in one big "let's knock them out before they get us" attack.

So I guess a "big red button" is as good a metaphor as any for the sort of on/off, all-or-nothing situation we were afraid of.


A strategy that nuclear strategist Herman Kahn (http://en.wikipedia.org/wiki/Herman_Kahn) memorably described as a "wargasm."



Where's this from?


Land of Confusion by Genesis.

https://www.youtube.com/watch?v=1pkVLqSaahk


There is also rampant cheating by the people being trained to be in charge of nuclear weapons, both Air Force and Navy:

https://www.google.com/search?q=nuclear+cheating+scandal

Nice warm comforting thought.

Even better is how they solved cheating - they stopped giving them grades:

http://www.npr.org/2014/07/28/334501037/to-stop-cheating-nuc...

Yes, those are the people in charge of nuclear weapons, how safe do you feel now?


Unchanged.

The job required to fire a missile is somewhat prestigious but relatively simple - the command comes in, and you execute a pre-planned sequence of events that has been drilled into you by that point.

At that point it's no different from following the assembly steps for a hamburger at a fast food restaurant.. though with much greater consequences than a stomachache afterwards :)


If they don't have the morals not to cheat in the MILITARY, what makes you think they are trustworthy to consider the consequences of pushing the button on a nuclear missile?

Or are they just going to belittle the "enemy" in their minds, like they are taught, and then an hour later twice the number of nuclear warheads come crashing down on us from overseas in return?

Not their problem because they were "just obeying orders" and they are under a hardened site.

Kind of like how armed drone operators don't think about what they are doing until months later, when they get PTSD after finding out they killed innocent people.


The problem is that none of those people in the silos expect they'll ever be called upon to actually launch those missiles. Ever since the Soviet Union collapsed, they've been a weapon without a plausible mission.

This has two pernicious effects. First, it leads the people in the silos to think to themselves: if none of this really matters, why not cheat? It's all theater anyway, right? Why put your career prospects at risk over something that's pure theater?

And second, it means that the best and the brightest Air Force servicemembers, the ones who want to wear stars on their shoulders someday, don't want to serve in the silos to begin with. Promotions in the military flow from service in real wars, not imaginary ones. So the people who end up in the silos are the ones who can't get stationed somewhere (anywhere!) else -- and that's exactly the type of person who's going to be tempted to cheat to get ahead in the first place.


>If they don't have the morals not to cheat in the MILITARY, what makes you think they are trustworthy to consider the consequences of pushing the button on a nuclear missile?

The military is not interested in your morals, they are interested in your following the orders that you are given. Teamwork breaks down when people question orders, and timing matters during operations.

Yes, there are a small handful of cases throughout history where a lone soldier disobeyed / questioned orders and came out ahead. I can't know for certain, but I'm willing to bet that many, many more soldiers who've done this either got their squadmates killed, lost their ass afterwards, or both.


Meta: automate things that are frequent, not things that are risky; hence: no WOPR, Wing Attack Plan R, or "Big Red Button."


Am I the only one who immediately remembered Radiolab's Buttons Not Buttons episode [1]? They had a nice bit about the "cut the heart out" part and how it contrasts with how easy it is today to start a nuclear attack.

[1] http://www.radiolab.org/story/buttons-not-buttons/


It's mentioned in like the third paragraph of the article...


Why did the title get changed? It used to be the article title "The button that isn’t"



