Hacker News

My experience in the Navy made me lose all respect for the "leadership" there. It's a place where careers are established by doing everything except that which involves any actual thought or problem solving.

The culture is risk-averse to a fault, but at the same time has nearly zero capability to accurately assess risk or fix problems beyond punishing people when things inevitably go wrong.

Official "risk assessments" are performed before every dangerous undertaking, with numbers that are pulled out of someone's ass, along with additional "risk mitigation" numbers that are also pulled out of someone's ass so that leaders have cover when an accident happens. "We performed all the necessary risk assessments and were still a go."

Let me give you an example. Underway replenishment is when one ship will pull alongside a supply ship and transfer fuel and supplies from the supply ship. It's very dangerous because the ships are within 300 feet of each other for hours on end.

A risk assessment is done and briefed the night before. It's a PowerPoint presentation with a grid: things that can go wrong on the Y-axis, and two columns, "risk" and "mitigation." In line with each disaster is a single word that conflates the probability with the severity of it happening. Not actual numbers, just words like "severe," "marginal," "likely," etc. Those are color-coded between red and green.

One of those things is always "crew preparedness" or similar. The risk of having an unprepared crew that didn't get a lot of sleep is shown as, for example, "severe" and colored red. The briefer will say, "Well, this is a severe risk, so we're making an announcement that everyone needs to get plenty of sleep tonight, and that mitigates the risk down to 'tolerable.'"
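The grid being described is essentially a severity-by-likelihood lookup. Here's a toy sketch of that logic; the word scales, the color cut-offs, and the "mitigation" step are illustrative assumptions, not the Navy's actual ORM matrix:

```python
# Toy severity x likelihood grid. Word lists and thresholds are
# made up for illustration -- this is not official Navy ORM.
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
LIKELIHOOD = ["unlikely", "possible", "likely", "frequent"]

def risk_color(severity: str, likelihood: str) -> str:
    """Map two words to a color by adding their ranks: no actual
    probabilities, just index arithmetic on adjectives."""
    score = SEVERITY.index(severity) + LIKELIHOOD.index(likelihood)
    if score <= 1:
        return "green"
    if score <= 3:
        return "yellow"
    return "red"

before = risk_color("critical", "likely")
# "Mitigation" = announce that everyone should sleep, then re-label
# the likelihood one notch lower:
after = risk_color("critical", "possible")
print(before, after)  # red yellow
```

The point the sketch makes is that the "mitigation" changes nothing about the underlying risk; it only moves the lookup to a friendlier cell.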

Honestly, that's more rigorous than what actually happens most of the time, which is that you'll hear things like, "we've mitigated the yellow risk down to a green." The entire ship's company of NCOs and officers will watch that presentation and nod their heads along. Not one of these briefs ever results in a decision not to do something.

And there you have it. I've taken you through the Navy's entire risk mitigation strategy, which plays out nearly every day.

Does anyone know the actual probabilities of disaster? If so, they're not saying. Are they changing existing procedures or spending additional money to fix problems based on actual numbers? Hell no.

Really severe problems are dealt with in the following manner: a "zero tolerance" policy for that problem's existence is created, and when disaster strikes, it's a career-ending event for whatever scapegoat is standing there. Problem solved!

By the way, don't join the Navy.

EDIT: Also, the "too much technology" portion of the article is laughable. Part of the problem is that technology that the rest of the world uses to avoid these sorts of things is seen as anathema to the luddite SWO community, and rather than embrace anything that could help automate error-prone tasks, they pay lip service to it and continue doing extraordinarily stupid things for tradition's sake.

Aircraft carriers are driven by issuing voice commands to (mostly undertrained) helmsmen in an environment where any additional noise could result in a misunderstood command and disaster. Or the helmsman gets tired of standing for a five-hour watch every day and accidentally turns left instead of right like she's told. Imagine driving a car by telling the driver what to do from the passenger seat. Not just a "turn left up there," but "turn the wheel 15 degrees to the left, ok, now turn the wheel back to center, ok, now turn it back three degrees to the right because you oversteered..." It's insane.




> Aircraft carriers are driven by issuing voice commands to (mostly undertrained) helmsmen in an environment where any additional noise could result in a misunderstood command and disaster. ... Imagine driving a car by telling the driver what to do from the passenger seat. Not just a "turn left up there," but "turn the wheel 15 degrees to the left, ok, now turn the wheel back to center, ok, now turn it back three degrees to the right because you oversteered..." It's insane.

Former aircraft carrier OOD (and nuke SWO) here. I agree with your comment about fatigue, but did you get much bridge time? At least back in the day, the conning officer's voice commands weren't so micro-managey. They were along the lines of, "Right 15 degrees rudder [0], steady on course 090," which the helmsman would repeat back to make sure the command was correctly understood. The helmsman would then turn the wheel as needed to make the turn and steady up on the ordered course.

[0] The conning officer would specify the rudder angle in part to control how much the ship would heel in the turn. That can be important when you have armed jet aircraft driving around on the flight deck.


I spent nine months as the conning officer on an aircraft carrier and qualified as OOD. On numerous occasions we narrowly averted disaster because the helmsman misunderstood the command, or repeated back the correct command but did the opposite.

In many cases those helmsmen were undertrained or tired, as a direct result of cuts in manning/funding and additional requirements.

The "old" people in the Navy, including many of the admirals at the time (ten years ago), held many of the same opinions as those expressed in the article (they need to stop looking at radar!!!).

What those people didn't seem to realize is that we were doing the same things they were doing 20 years prior with about half the crew they had.


Spot on. In the Coast Guard we have a similar operational risk assessment model called General Assessment of Risk (GAR)[0], which uses six categories and a score of 1-10 in each to get a cumulative score, which corresponds to either Green (low risk), Amber (moderate risk), or Red (high risk). In a best-case scenario, all involved in the operation will conduct a GAR brief as a group, and the person doing the brief will solicit the group for what number they would give for each category, and the highest number anyone calls out is the one the whole group goes with. Typically, if somebody calls out a 5 or higher, they are asked to explain why they feel the risk in that category is so high. Once the reasons are identified, the group then "mitigates" the risk by discussing the identified reasons so that everyone understands the risks. The score is tallied, and the color (risk level) is identified.
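The GAR scoring flow described above is simple enough to sketch. The six category names and the Green/Amber/Red cut-offs below are the commonly cited ones; treat them as assumptions rather than a restatement of Coast Guard policy:

```python
# Sketch of a GAR brief: take the highest number anyone calls out
# per category, sum the six categories, map the total to a color.
# Category names and thresholds are assumptions for illustration.

def gar_score(votes: dict) -> tuple:
    """votes maps each category to the numbers (1-10) called out by
    the group; the highest voice in each category wins."""
    total = sum(max(called) for called in votes.values())
    if total <= 23:
        color = "Green"   # low risk
    elif total <= 44:
        color = "Amber"   # moderate risk
    else:
        color = "Red"     # high risk
    return total, color

# Example brief: each list is what the group called out.
votes = {
    "Supervision": [2, 3],
    "Planning": [4, 2],
    "Team Selection": [3],
    "Team Fitness": [6, 4],   # a 5+ should trigger a discussion
    "Environment": [5, 3],
    "Event Complexity": [4],
}
total, color = gar_score(votes)
print(total, color)  # 25 Amber
```

Note that the only non-arbitrary part is the summation; everything feeding it is a gut call, which is exactly the critique being made.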

That is the best-case, textbook way to run the model. It's still very subjective pseudo-science masquerading as objective risk management, but at least it has something of a method to it.

In practice, the repetitiveness of the GAR model results in many crews blowing it off, and giving a vague, arbitrary cumulative score without any discussion around how they got there. This sounds bad, and by policy it is bad, but in practice I have observed no discernible difference in how crews approach risky missions and operations when they conduct a full GAR brief or just give a somewhat random score and move on. In other words, the GAR model does not seem to provide any tangible risk management benefit, and largely seems to serve as a bureaucratic CYA solution.

GAR was introduced in an attempt to reduce the number of mishaps occurring due to what was deemed to be excessive risk-taking. The statistics may demonstrate that it had that effect, though I would argue from my perspective that other training programs introduced to address problems related to risk assessment have far more deckplate-level impact and effectiveness. The problem with GAR is that it tries to objectively standardize something which is, by its very nature, dynamic and subjective. No two people, in the exact same situation and having the exact same capabilities and experience, assess the risk the same way. Assigning numbers to a series of broad categories and giving a color-coded risk level to the situation does not inform anyone of anything very useful. Discussion of risk factors is more helpful, but due to the way the system is structured, is a step frequently skipped.

What matters far more is focusing on continuous, dynamic training and education of those in billets for whom risk management is a critical part of the safe completion of their mission, and emphasizing clear communication unfettered by rank or positional authority to ensure that everyone has full situational awareness. Be respectful, but make sure that information can move freely between all involved.

[0] https://www.uscg.mil/hq/nsfweb/foscr/ASTFOSCRSeminar/Present...


Shout out to a fellow Coastie.

I think what I learned from my superiors is that it doesn't matter so much what system is used for risk assessment, but whether a conversation about risk was had in a meaningful way. Sometimes quantification helps, where complex systems can be analyzed and the consequences assessed; usually, if you are going to do that, it has to happen well in advance of an operation. Other times the desire to get a number leads to a real ham-fisted attempt to "quantify" things like how fatigued the crew is on a scale of 1-10, or to rate the environmental conditions. The GAR model (which stands for Green - Amber - Red) worked when it was used to facilitate an honest conversation about how people felt about an operation, as opposed to just checking a box so that it could be put in the logs. When I saw it used the first way it absolutely made things safer, but that was very dependent on the attitude of those participating.


Agreed, and a more succinct way of saying what I was getting at. Attempts to quantify inherently subjective assessments aren't very productive in their own right. What matters far more is having a substantive conversation about risk, and working to promote a culture of open and honest communication.

Semper P


If you want to know how tired your crew is, wouldn't it suffice to pluck a random few off the deck and ask them if they're tired as fuck?


How would you make them answer truthfully? Admitting to being tired either means you're showing personal weakness, or you're calling your superiors incompetent at scheduling. Neither possibility sounds good for your career. I think it would be better to measure reaction times, e.g. by making them catch a falling ruler immediately after it's dropped and seeing how far it falls.
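The falling-ruler test works because the drop distance encodes the reaction time via free fall: d = ½gt², so t = √(2d/g). A minimal sketch, assuming standard gravity:

```python
import math

def reaction_time(drop_cm: float) -> float:
    """Convert how far the ruler fell (cm) into a reaction time (s)
    using the free-fall relation d = (1/2) g t^2 => t = sqrt(2d/g)."""
    g = 9.81                 # m/s^2, standard gravity
    d = drop_cm / 100.0      # cm -> m
    return math.sqrt(2 * d / g)

# A 20 cm drop corresponds to roughly a 0.2 s reaction time; a
# fatigued catcher lets the ruler fall noticeably farther.
print(round(reaction_time(20), 3))  # 0.202
```

A well-rested adult typically catches the ruler within about 15-20 cm, so anything much beyond that is a red flag worth asking about.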


This is why good leaders create an environment where you can be honest about such things. You are basically describing an inhumane system led by fear. Then again, we are talking about the Navy here...


This is part of the reason why they institute fatigue standards: to try to keep people from working while overtired, which has led to people being hurt.


The hard part is that no one wants to admit that they are tired. Everyone wants to be "that guy" or "that girl" who is always ready to go. After all, most people I've met in the military have a hard time putting themselves to bed before their next watch when it's 0200 and we're pulling migrants off rafts or running helicopter ops to interdict $100 million worth of cocaine. During lots of operations you will have much of the crew stood up to run things, so balancing crew fatigue with surge operations is no trivial task. The SWO community could take a lot of pointers from the aviation folks; they are way better about making sure people get mandatory rest.


In your opinion, are high-level commanders well educated? Field experienced?


Well-educated...it depends. Many have impressive sounding degrees. However, an extreme anti-intellectual bent is very common.

Do you know the people who studied for and aced all exams through brute force memorization, promptly forgot all the material or never understood how it would be applied, but now have the certificate that says they know X? That was 90-95% of Naval officers.

I had a one-week class in digital communication as part of a nine-week introductory class for new officers. It went over basic information theory as it applies to naval communication systems and encryption. The valedictorian of that class, who had just aced the test on digital communications the week before, raised her hand the following week to ask a clarifying question about a follow-on topic: "When you say everything is just ones and zeros...what does that even mean?"

That sums up naval training. I realized then that this person, despite all appearances of knowing the material and being an excellent student, had NO clue what anyone was talking about and hadn't for the entire class despite having the highest grade of anyone.

So you'd have an aircraft carrier CO with a Master's degree in chemistry who didn't really understand the concept of vector addition as it applied to calculating relative wind. He'd be yelling at subordinates for not being able to calculate wind vectors and distances the (incorrect) way he wanted. The subordinates were clearly confused by this, but felt that the CO, being highly educated and the CO of the ship, must know what he was talking about. If you asked the CO to explain what he meant, he'd become irate and tell you you were stupid.
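The relative-wind calculation the CO struggled with really is just vector addition: the wind felt over the deck is the true wind plus the wind the ship makes by moving. A minimal sketch, assuming the usual conventions (compass degrees, wind named by the direction it comes FROM):

```python
import math

def relative_wind(true_from_deg, true_kts, course_deg, speed_kts):
    """Relative (apparent) wind over the deck = vector sum of the
    true wind and the ship-made wind, which blows FROM dead ahead
    (i.e. from the ship's course) at the ship's speed."""
    def air_velocity(from_deg, speed):
        # East/north components of air moving FROM the given bearing.
        rad = math.radians(from_deg)
        return (-speed * math.sin(rad), -speed * math.cos(rad))

    te, tn = air_velocity(true_from_deg, true_kts)
    se, sn = air_velocity(course_deg, speed_kts)
    e, n = te + se, tn + sn
    rel_speed = math.hypot(e, n)
    rel_from = math.degrees(math.atan2(-e, -n)) % 360
    return rel_from, rel_speed

# True wind from 090 at 10 kts, ship steaming 000 at 20 kts:
d, s = relative_wind(90, 10, 0, 20)
print(round(d), round(s, 1))  # 27 22.4 -- from off the starboard bow
```

Which is the whole reason rudder orders care about this: you steer to put the relative wind down the angled deck, and that's pure trigonometry, not mystique.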

The nuclear officers/enlisted knew their shit, and those were some of the smartest and best people I've ever worked with. They'd nod their heads at the aforementioned CO and then ask, "he really doesn't understand vector math, does he?" when he was out of earshot. The rest of the crew would assure the nuclear guys that they must be wrong.


> The valedictorian of that class, who had just aced the test on digital communications the week before, raised her hand the following week to ask a clarifying question about a follow-on topic: "When you say everything is just ones and zeros...what does that even mean?" That sums up naval training.

That sums up a big chunk of computer science. What answer did she get from the instructor?

It's a good question, coming from a non-technical person, and I'd definitely give a dirty look to anyone who laughed at it.


> I'd definitely give a dirty look to anyone who laughed at it.

Perhaps that would be appropriate before said person had taken a digital communications class and aced a test on it. After that, however, I think a wry chuckle is very much appropriate.


It's not a good question from someone who aced a class in digital communications. What did she think was being communicated?


So she accumulated all the pieces of the puzzle before she could assemble them? It's not efficient, but it works as long as you don't get laughed down for finally grounding your facts in something. Sometimes you just have to keep making forward progress and put things together later.

She just unpacked the black boxes in a different order.


not OP but gonna answer from my experience...

1) ...at what? Many of them are very bright and well trained, but seriously, at what? People who go into leadership are often strong managers, but that doesn't make them good captains or good seamen. They are incredibly skilled, just at the wrong thing. You gotta remember that at the end of the day it is likely someone under the age of 22 with their hands on the wheel, and people of similar age running all the major systems. The promotion path to captain is not exactly hands-on.

2) ...oh yeah. They have been serving on ships for a long time. But in the Navy, serving on ships does not in any way equate to seamanship. The Navy (and this is obviously IMHO) has a bit of a chip on its shoulder about 'blowing shit up' that likely comes from feeling underutilized in our last two endless wars, and this is especially prevalent in the surface fleet. They tend to prioritize warfare officers and sometimes even ex-pilots for promotion toward captain ranks. Those people are absolutely deployed, but doing things totally unrelated to navigation and seamanship as a core mission. They should have impeccable seamanship skills, but often don't have the time or space to develop them.



