> In practice, much of their thinking is profoundly rational and bayesian.
Right, this was the third option I mentioned; I'm certainly not leaping all the way to the conclusion that one shouldn't listen to a doctor about the best course of action after mammogram results[1]. If their explicit understanding of mammography's false positive rate is so incredibly flawed, there is presumably an implicit counterbalance in the calculus, built on experience (both their own and their mentors'/institution's), or else an _order-of-magnitude_ error would show up in patient outcomes. I'd guess that this and the other instances of critical thinking failure that plague medical culture have their rough edges sanded down over time, through decades of what is effectively guess-and-check iterative evolution, combined with institutional transmission of the resulting folk wisdom.
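(To make the _order-of-magnitude_ point concrete, here's a quick sketch using the standard textbook numbers for this problem; these are illustrative assumptions rather than current screening statistics: roughly 1% prevalence, 80% sensitivity, and a ~10% false positive rate. The posterior probability of cancer given a single positive screen comes out under 10%, about an order of magnitude below the intuitive "the test is pretty accurate, so probably cancer" answer.)

```python
# Illustrative textbook numbers; real screening statistics vary by age group and study.
prevalence = 0.01        # P(cancer) in the screened population
sensitivity = 0.80       # P(positive | cancer)
false_positive = 0.096   # P(positive | no cancer)

# Bayes' theorem: P(cancer | positive) = P(+|cancer) P(cancer) / P(+)
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
posterior = sensitivity * prevalence / p_positive
print(f"P(cancer | positive) = {posterior:.1%}")  # ~7.8%
```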
Though I disagree that this deserves to be called "profoundly rational": IMO, leaving explicit reasoning tools on the table instead of intentionally combining them with intuition/experience/epistemic humility is far from optimal. Iterative evolution is not an efficient process, and explicitly trying to model the functions you're learning can be a powerful tool. It's very difficult for me to imagine that a doctor who explicitly internalized the basics of Bayesian reasoning wouldn't become at least marginally more effective in terms of patient outcomes, etc. Medical history is full of blind alleys built on medical hubris like your comment's "doctors know A Deeper Truth in their soul that defies explicit logical articulation". (Though I should note I don't claim to have a pat answer to this problem: one can theorize about improving a specific doctor's effectiveness, but scaling that to the whole culture is another matter, and can bump into everything from supply problems to downstream cultural impacts with negative consequences.)
[1] Though with knowledge of flaws in such basic reasoning skills in one subpart of the total calculus, a patient can't rationally escape updating in the direction of checking the doctor's reasoning more thoroughly. Medicine is a collaborative endeavor between patient and doctor; both bring flaws in their reasoning to the table, and strong evidence of a huge flaw in one party's reasoning should lower confidence in the overall decision (though by a much smaller magnitude, for the reasons we both describe here). This is the same logic doctors use to rationally discount patients' opinions when they perceive them as coming from, e.g., overly emotional reasoning.