What can doctors learn from pilots and cyclists? (bbc.co.uk)
61 points by Silhouette on April 1, 2016 | 48 comments



> [Aviation] professionals are given every reason to cooperate, because they know that investigations are not about finding scapegoats, but learning lessons. Indeed, professionals are given a legal safe space so they speak up honestly (...)

This is probably the single most important thing that aviation got right and I don't think people understand that well enough, in general.

The attitude in the air industry, when something goes wrong, is 1) understand what happened; 2) figure out what needs to be changed so it doesn't happen again. There is no step 3. There especially isn't a step of "punish those responsible" or some such nonsense.

Only the final goal of improving outcomes matters, and indulging the emotional human demand for punishment does not necessarily help that goal, it can even seriously hinder it. If only we could apply the same rationality to, say, the penal system.


This is very true. The FAA really takes this seriously. So much so that, since one of its duties is enforcement, including potentially revoking licenses for violations, it has arranged for NASA, a completely separate agency, to manage the Aviation Safety Reporting System (ASRS).

When we, as pilots, submit ASRS reports of mistakes or dangerous situations, we know that the information we submit on those reports cannot and will not be used against us in an FAA violation proceeding. Furthermore, submission of an ASRS report will often reduce or eliminate any penalties for an infraction. This has created a cover-your-ass culture that actually promotes safety: we will often report mistakes and dangerous situations early and aggressively, knowing that we are protected from FAA action and, for commercial pilots, from airline disciplinary processes.


This system is working great. But what about bad pilots? How are they dealt with?


I'm not a pilot, but I'd imagine that it's a combination of recognizing and learning from each mistake and the ruthless punishment of severe mistakes.

If you have a culture of reporting each and every mistake, you can receive feedback on what you could have done better in each of those situations. That leads to bad pilots constantly becoming better pilots. Additionally, unlike with the doctors in the story, severe mistakes in aviation will often result in the death of the pilot, so, like those startups that always fire the bottom percentage of their workforce, the really bad pilots will naturally eliminate themselves.

Rock climbing is similar to flying in its compulsion to study fatal accidents. Climbing magazines will publish fairly in-depth discussions, not out of morbid curiosity but as a campaign to build awareness among climbers of mistakes that can kill them. This has led to safer gear and practices in the sport and has undoubtedly saved lives.


By definition, bad pilots have crashed or will soon crash their aircraft, removing either themselves or at least their equipment (and unfortunately maybe their passengers) from circulation.

I say this as a bad pilot myself, but one who is aware enough of that fact to always fly with dual controls.


There is a separate FAA process for willful violations. Airlines also have their own internal processes.


It's also fair to point out that this is not a universal system. I'm pretty sure Italy by default treats aircraft accidents as crime scenes where responsibility is to be assigned.


The FAA has also recently said that it's going to shift its focus from enforcement through punishment toward reducing violations through education and training.


We (try to) have a similar attitude about mistakes at our manufacturing facility. Even if the problem is that an employee did not follow instructions, the real problem is that the instructions were complicated, or they were only given once, or we don't have a check in place to verify the instructions were followed. Blaming people doesn't fix things.


That's a great example. The goal is to modify behavior, and to do so you need to understand the conditions that lead to the undesirable behavior and change those conditions. Barring outright intentional malice, there's usually a good reason someone did (or did not do) something.


I'd phrase it as "to modify future behavior", to emphasize that absolutely nothing you do can change what happened.


Yes, exactly! I think that's sort of the key thing about the aviation approach. You can't change what happened so there's not much point in blaming people. Assigning blame and punishment is one way of attempting to modify future behavior, though not always effective.


I agree completely that we tend to focus too much on punishment rather than preventing future harm. That said, one thing that makes the airline industry unique is the binary nature of problems. If something goes wrong and the plane doesn't crash, then there's no harm to passengers so it's easier to forgo punishment. If something goes wrong and the plane crashes, there's no one left alive to punish. Add to that the fact that there is no possible nefarious reason for a pilot to accidentally crash a plane. A pilot can crash a plane on purpose, but if they do something stupid and the plane doesn't crash, they're likely to be believed when they say it was just a stupid mistake. Compare to a doctor who doesn't suffer when a patient suffers (at least not physically) and in fact may even save money from certain oversights. It's much easier to assume bad intent where mistakes are profitable rather than fatal.


You're right that there's a natural alignment of incentives between pilots and their passengers, but as wnevets points out above there are other critical participants in the system that aren't travelling in the plane.

Your "binary nature of problems" comment is wrong though. First common misconception: statistically the most probable outcome for someone involved in an airplane crash is survival, not death [0]. Second misconception: investigations routinely happen for incidents not resulting in crashes. In fact the whole point of promoting self-reporting of incidents by pilots is so that such investigations can be conducted for incidents that would otherwise never be known.

[0] https://app.ntsb.gov/doclib/safetystudies/SR0101.pdf


That's an excellent point about other critical participants.

I appreciate the additional information! I didn't mean to sound like I thought that people die all the time in airplane crashes, merely that the median mistake is way less likely to affect a passenger, which is a major criterion for deciding to punish in other professions. Every mistake a lawyer or doctor makes can hurt clients or patients, respectively, whereas for most mistakes a pilot makes, no one will even know they made it (except if they self-report, which they do a lot, as you pointed out). And then I added the ill-advised part about dying, by which I was really trying to communicate that the really bad outcomes are so bad that punishment is the last thing on people's minds.


> punishment is the last thing on people's minds

I wish. Just off the top of my head, there was the Swiss air traffic controller who was murdered by a relative of one of the victims of a mid-air collision [0], and the Brazilian government trying to put the surviving crew of another mid-air collision on trial [1]. Someone with a better memory could surely think of more examples.

[0] https://en.wikipedia.org/wiki/%C3%9Cberlingen_mid-air_collis...

[1] https://en.wikipedia.org/wiki/Gol_Transportes_A%C3%A9reos_Fl...


Pilots aren't the only ones who can make a deadly mistake in aviation. If a traffic controller or maintenance personnel makes a mistake, are they punished?


Sometimes. It depends on who notices the mistake and why.

Consider: a safety-first culture can easily penalize someone for showing up to work drunk IF someone notices before a problem shows up. But if nobody notices until after the fact, then they should be free to admit it without repercussions, because there were two issues: they were drunk AND nobody noticed. You can't fix the second issue if you don't know it's a problem.


Absolutely, especially in the case of ATC. Now, the punishment is often "merely" decertification from working their position and re-training (with pay) until they're fully certified to work it again, but it's still a pretty big deal to the ATCers I know.

Maintenance personnel can also have their certificates suspended, though that tends to be more rare than being sued. (ATC is prevented from being sued in almost all circumstances, so that doesn't apply to them.) You can be sure that most fatal crashes result in at least an investigation of logbooks, who signed off the aircraft and for what, and some utterly absurd civil cases, many of which are settled without regard to the facts of the case but based on the perceived depth of pocket of the defendant.


Not exactly an aviation example, but applicable: the Space Shuttle Challenger tragedy was not caused by pilot error.


Yes, and Columbia.


One thing I've learned in my career: whenever there is a failure, it's almost always caused by more than one mistake. The root cause could be someone not paying attention, but if that mistake can be disastrous, you shouldn't be relying on that one point of failure. If "finding someone to blame" is the solution to a problem, your org has failed in more ways than one.

I've had mixed experiences, but I find the orgs that set up roadblocks to prevent single mistakes from causing issues are much better off.
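
One cheap roadblock pattern is requiring a second, independent sign-off for dangerous operations, so a single person's lapse of attention can't be the whole failure by itself. A minimal sketch in Python (purely illustrative, not any particular org's tooling):

    def drop_table(table, confirmations):
        # Require two distinct approvers so one inattentive
        # person can't cause the disaster alone.
        approvers = {c["approver"] for c in confirmations
                     if c["table"] == table}
        if len(approvers) < 2:
            raise PermissionError(
                f"Dropping {table} needs 2 distinct approvers, got {len(approvers)}")
        print(f"DROP TABLE {table};")  # the dangerous bit

    drop_table("users", [
        {"approver": "alice", "table": "users"},
        {"approver": "bob",   "table": "users"},
    ])

The point isn't the specific mechanism; it's that the system, not individual vigilance, enforces the second check.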


> Only the final goal of improving outcomes matters, and indulging the emotional human demand for punishment does not necessarily help that goal, it can even seriously hinder it. If only we could apply the same rationality to, say, the penal system.

I completely agree. Humanity still has to grow up and realize that helping people is a lot more effective than punishing them.

I like the quote: "Emphasis on punishment is the sign of an obedience frame."


A very good, in-depth look at this is here:

http://www.newstatesman.com/2014/05/how-mistakes-can-save-li...

It's the story of how a pilot's wife died in surgery and how he has campaigned to have doctors follow the example of airline crash investigation.

Particularly interesting/infuriating are the bits about how often someone in the room knew that someone was going to die, or get the wrong leg amputated, but status games prevented them from intervening.


> Only through proper investigation was it discovered that a key factor was clinicians failing to put sterile dressings over the catheter site; the medical equivalent of not using antibacterial hand gel.

> The introduction of a five-point checklist - a marginal change - saved 1,500 lives over 18 months in the state of Michigan alone.

If you find this kind of thing interesting I can recommend The Checklist Manifesto. The book describes many other situations where the use of checklists has had similar benefits (particularly in health and aviation).


> The book describes many other situations where the use of checklists has had similar benefits (particularly in health and aviation).

However, and I feel that this is a point often missed about this book: a large part of the improvements in process had little to do with checklists. "Wha-wha-wha-whaaaat? 'Checklist' is in the title!" Checklists don't do you any good if you're dealing with doctors who are way too full of themselves and don't feel the need to listen to the little people like nurses.

No, the key element of that book was that nurses were empowered to knock doctors off their high horses if the checklists weren't followed. The book isn't about checklists. The book is about:

1. Develop a framework of practices for your tasks at hand. Checklists are often used for this. For software, think release management. Tests pass, UA signed off, etc.

2. #1 won't make a damned bit of difference until, from top to bottom, anyone can call out anyone else if shortcuts are taken. You're the director of the hospital? I don't care, you still need to wash your hands before touching that patient. The VP of marketing wants us to skip those manual tests to save a day? Screw you, it's on the list and we're doing it.

The book isn't about checklists, it's about empowerment to make improvements.
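
To make that concrete in software terms, here's a minimal sketch of #1 plus #2 as a release gate (the names, like ReleaseChecklist, are mine and purely illustrative, not from the book): the release only proceeds when every item is green, and anyone on the team, whatever their rank, can block an item.

    from dataclasses import dataclass, field

    @dataclass
    class ChecklistItem:
        description: str
        done: bool = False
        blocked_by: str = ""  # anyone, any rank, can set this

    @dataclass
    class ReleaseChecklist:
        items: list = field(default_factory=list)

        def complete(self, index):
            self.items[index].done = True

        def block(self, index, who, reason):
            # Point #2: any team member can stop the release.
            item = self.items[index]
            item.done = False
            item.blocked_by = f"{who}: {reason}"

        def may_release(self):
            # Point #1: release only when every item passes.
            return all(item.done for item in self.items)

    checklist = ReleaseChecklist([
        ChecklistItem("All tests pass"),
        ChecklistItem("UA signed off"),
        ChecklistItem("Manual tests run"),
    ])
    checklist.complete(0)
    checklist.complete(1)
    checklist.block(2, "qa", "VP of marketing wants to skip manual tests")
    assert not checklist.may_release()  # no release until the list is green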


I haven't read the book, but what you're describing is what's called Crew Resource Management [1] in aviation. It's especially valuable in cultures where second-guessing your superior is career suicide. CRM stresses that the captain and first officer are a team and prescribes how they should communicate.

[1] https://en.wikipedia.org/wiki/Crew_resource_management


It's been a few years since I read the book, but if it didn't specifically mention this in relation to aviation, I recall that aviation was at least mentioned as an example.


I see your point, but I think the point of the checklist is that only a system that requires the checklist to be followed properly actually qualifies. Without the checklist as the connection between the nurses and the doctors, we would still have problems with consultants standing on their dignity and the nurses unable to do anything about it.


The author of the Checklist Manifesto also wrote an article about this in the New Yorker: http://www.newyorker.com/magazine/2007/12/10/the-checklist


Here is an article about medical checklists that is really eye-opening:

http://www.newyorker.com/magazine/2007/12/10/the-checklist


The most eye-opening part of that piece was a letter to the editor, from some bigwig doctor-administrator at a major hospital, that appeared an issue or two later. He thought that the whole idea of checklists was ridiculous, because all that is needed is for everyone to just do what they're supposed to do.

Seriously.

I'll try to find the letter. (I'm a subscriber and can access back issues online.) To me, his attitude illustrates the biggest problem of the medical profession. How could someone, who we have to suppose is an intelligent human being, miss the entire point of the article? In my opinion, only arrogance explains it.


Okay, the letter to the editor is from the January 21 issue, a few weeks later. Here's the part I take issue with:

"While a retrospective checklist has great value […] a prospective system of protocols, memorized by all the staff and made part of ongoing care, is more likely to influence patient outcomes."

You see that part about "memorized by all the staff"? One of the major points of the checklist piece was that simply going by memorized protocols, whether you're going to pilot a plane or perform surgery, just does not cut it. The human animal is prone to failing to follow complicated, extensive protocols from memory.

The doctor who wrote the letter is a professor emeritus at a medical school. A screen shot of the letter is here:

http://i.imgur.com/0DqhvY0.png


The phenomenon you're talking about is called an "interrupted checklist" in aviation. Humans are actually quite good at following memorized lists unless they get interrupted mid-task. When that happens, the risk of a mistake goes up significantly, because your mind has already "checked off" the task that you were in the process of completing when you got interrupted.

See for example: http://flighttraining.aopa.org/magazine/2002/October/200210_...

The psychology behind this is also the reason for the "sterile cockpit" procedures during critical phases of flight. https://en.wikipedia.org/wiki/Sterile_Cockpit_Rule
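
The software fix for the same failure mode is to externalize the state instead of trusting memory. A toy sketch (purely illustrative; real aviation checklists obviously aren't implemented this way): record each item as it's completed, so after an interruption you resume from the recorded state rather than from where your mind thinks you were.

    import json
    import os

    CHECKLIST = ["Fuel selector on fullest tank",
                 "Mixture rich",
                 "Flaps set for takeoff"]
    STATE_FILE = "checklist_state.json"

    def load_state():
        # Resume from the recorded state, not from memory
        # of where you think you were.
        if os.path.exists(STATE_FILE):
            with open(STATE_FILE) as f:
                return json.load(f)
        return {item: False for item in CHECKLIST}

    def complete(item):
        state = load_state()
        state[item] = True
        with open(STATE_FILE, "w") as f:
            json.dump(state, f)

    def remaining():
        return [item for item, done in load_state().items() if not done]

    complete("Fuel selector on fullest tank")
    # ... phone rings, you walk away, come back ...
    print(remaining())  # interrupted items are still listed as open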


They have implemented a form of "sterile cockpit" in some places when nurses prepare medications, by having them wear special vests that signal nobody should interrupt them and (I think) doing it in a special quiet room.


I would have to guess that interruptions happen most often when things don't proceed exactly according to plan; but, honestly, I'm sure that never happens during surgery.

(In the interest of full disclosure, I checked the box marked "sarcasm" after writing my reply.)


In a similar vein, Andrew Godwin gave an excellent talk at PyCon last year called "What can programmers learn from pilots?" https://www.youtube.com/watch?v=we4G_X91e5w


I suspect a lot of software developers would benefit from a similar approach when it comes to improving quality and, in particular, improving security practices. There seems to be a striking dichotomy in our industry today between those who welcome responsible disclosure, or even actively encourage it with things like bounty programs, and those who consider anyone knowing a potential vulnerability in their systems a danger and will treat even a genuine attempt to help through responsible disclosure as some sort of threat to be handled by lawyers or police.

I'm fairly sure which group generally makes more secure software. However, it would be a welcome cultural shift if not only support for responsible disclosure but also publicly acknowledging and discussing security issues were more common in our industry, so others could learn from the same mistakes.


Somewhat off topic, but as someone with an interest in aviation I enjoy reading the aviation safety reports, like the AAIB reports [0], Airprox reports [1], or the CHIRP newsletters [2]. (All UK-based.)

I try to find a similar 'tone of voice' when conducting or participating in RCAs (root-cause analyses) for incidents during my day job.

[0] https://www.gov.uk/aaib-reports

[1] https://www.airproxboard.org.uk/Reports-and-analysis/Monthly...

[2] https://www.chirp.co.uk/


> I try to find a similar 'tone of voice' when conducting or participating in RCAs for incidents during my day job.

Me as well, though as a pilot I find that in about 10% of cases the NTSB is a little too free in assigning blame to the flight crew as the primary cause. Yes, most crashes are crew-caused, but I've read many reports where the crew gets the blame as primary rather than secondary.

In my day job, I've also brought in the concepts of FAR 91.3 and have explicitly given that level of authority to the team responsible for running our production systems.

FAR 91.3 reads:

Responsibility and authority of the pilot in command.

(a) The pilot in command of an aircraft is directly responsible for, and is the final authority as to, the operation of that aircraft.

(b) In an in-flight emergency requiring immediate action, the pilot in command may deviate from any rule of this part to the extent required to meet that emergency.

(c) Each pilot in command who deviates from a rule under paragraph (b) of this section shall, upon the request of the Administrator, send a written report of that deviation to the Administrator.

The intent is to make clear to the team that they have the final say and that, while I may make inquiries later as to why they made a certain deviation from our norms, SOPs, or policies, they do have the right to do so, and all I'm able to ask for is a report. Couple this with demonstrated blameless treatment of post-mortems and you get a pretty good outcome out of good people. We do want to know exactly WHO did WHAT, WHEN, HOW, and WHY, but we use it to introspect and make the future better, not to punish.
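
For anyone who wants to steal the idea, the mechanics are simple to encode. A rough sketch of the 91.3(b)/(c) pattern in a deploy script (the function and flag names here are hypothetical, not from any real tool): deviation is always available in an emergency, but it automatically files the written report.

    import datetime
    import getpass

    def run_deploy(checks_passed, emergency=False, justification=""):
        if checks_passed:
            return deploy()
        if emergency:
            # 91.3(b): the operator may deviate from normal policy
            # to the extent required to meet the emergency...
            file_deviation_report(justification)
            return deploy()
        raise RuntimeError("Checks failed; use emergency=True to override.")

    def file_deviation_report(justification):
        # 91.3(c): ...but a written report of the deviation is required.
        with open("deviation_reports.log", "a") as f:
            f.write(f"{datetime.datetime.now().isoformat()} "
                    f"{getpass.getuser()}: {justification}\n")

    def deploy():
        print("deploying...")

    run_deploy(checks_passed=True)  # normal path
    run_deploy(checks_passed=False, emergency=True,
               justification="prod down; hotfix bypasses staging soak")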


You might also enjoy the Rail Accident Investigation Branch's reports. They are written in a similar style.

https://www.gov.uk/raib-reports


There's a book, "The Checklist Manifesto", which describes this practice. A great, fast read. http://amzn.to/1SsqgLu


From the article:

> Today, aviation is arguably the safest form of transportation. Last year the accident rate had dropped to a low of only four fatal crashes from a total of 37.6 million flights.

I'm curious, does anyone know how/from where that was derived? That sounds pretty incredible.
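
I can't say where the figures come from, but the per-flight rate they imply is easy to back out:

    fatal_crashes = 4
    flights = 37.6e6
    print(fatal_crashes / flights)  # ~1.06e-07 fatal crashes per flight
    print(flights / fatal_crashes)  # ~1 fatal crash per 9.4 million flights

That's roughly one fatal crash per 9.4 million flights, which does sound incredible.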


Presumably the FAA would have records of all flights as you have to submit flight plans to ATC. Then they just see how many crashes there were that resulted in loss of life.

I mean, I don't know where I would pull those numbers from myself, but it seems trivial for an agency to report on.


Those aren't the "aviation" numbers, but more likely the "scheduled airlines" numbers.

The number of fatal accidents for all of aviation was well over 4 last year, and flight plans aren't filed for even half of general aviation flights.

There were 5 aviation fatals last month in the US: http://www.ntsb.gov/_layouts/ntsb.aviation/month.aspx

For scheduled service, I only see one fatal accident last year in the US, and that was a Cessna 207 (a single-engine piston aircraft) that crashed in Alaska, hardly relevant to scheduled airline service in transport-category twin-jets.

http://www.ntsb.gov/_layouts/ntsb.aviation/brief.aspx?ev_id=... I suspect that most people would not consider this airline service, though it was operating as scheduled service.

Outside the US, 2015 had 4 fatal accidents in scheduled airline service, which is presumably the numerator the article is citing:

GermanWings 9525 (suicidal pilot): https://en.wikipedia.org/wiki/Germanwings_Flight_9525

TransAsia 235 (pilot error subsequent to mechanical failure): https://en.wikipedia.org/wiki/TransAsia_Airways_Flight_235

Trigana 257 (poor weather, unclear pilot contribution): https://en.wikipedia.org/wiki/Trigana_Air_Service_Flight_257

Metrojet 9268 (bomb): https://en.wikipedia.org/wiki/Metrojet_Flight_9268


A checklist is a useful form of technological assistance, albeit an ancient one. Maybe some sort of AI technique will make it easier in the future: a sort of assistant to guarantee conformance.


This is directly applicable to production outage processes in our industry, as well.


That red is the safest light.



