What seems surprisingly ignored by most of the medical community is that many microorganisms harmful to humans can thrive only in environments that have been artificially stripped of the normally dominant microorganisms that humans are mostly well adapted to. Instead of fighting an ever-escalating arms race against resistant microorganisms in some imaginary “war on germs,” maybe we ought to stop screwing with the normal microbial environment and start learning from it instead.
I mean, is it really so surprising that when you salt sliced cabbage and let it sit in a crock for a few weeks – no preservatives, no antibiotics – that what you end up with, after the ambient microorganisms have had their way, is not only perfectly preserved cabbage but also delicious? That fermented foods naturally arise when you let traditional foods “go bad” and that they taste so good ought to suggest something. It’s almost as if millions of years of natural selection in an environment devoid of modern preservatives and antimicrobial agents has made humans well adapted to (and even crave) the microbiota that naturally dominate the places where humans have traditionally lived and the foods that humans have traditionally eaten.
Taking us out of those environments makes our bodies and foods and hospitals competition-free zones for modern “superbugs.” Seems like a really bad idea.
It's all well and good until you're immunocompromised. S. aureus itself is commensal. Ditto E. faecium, K. pneumoniae, and Enterobacter. Meanwhile, A. baumannii and P. aeruginosa are widespread in soil. Together these represent all of the ESKAPE pathogens, which are the primary superbugs: Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter [all Enterobacter]. That is to say: the six bacterial strains [excluding Mycobacterium tuberculosis, guess what it causes] carrying the most severe antibiotic resistance and responsible for the worst infections are the very same strains present all around us naturally. However, in every case except S. aureus, infection is basically unheard of when an individual has a functioning immune system; they are all opportunistic pathogens. This is part of the problem: a healthy individual can carry a dangerous strain around with no idea of the risk they present to the immunodeficient.
So what is to be done with the immunocompromised? There are some heroic-medicine case studies of an "immune system transplant," notably one that led to the remission of HIV, and perhaps this method could be made more viable.
Don't you suspect that a large portion of "a functioning immune system" is the naturally occurring human microbiome (especially in the gut)? So isn't any rational response to the question "What is to be done with the immunocompromised?", in part, to stop compromising people's immune systems by screwing up their microbiomes?
Yes, people become immunocompromised in other ways, but I'd wager that a large portion of compromised immune function in modern society is caused by our screwing around with antibiotics and other antimicrobial agents on a large scale without understanding what the hell we're doing. The "germs are bad" model of medicine is overly simplistic, gets important things wrong, and, as a result, is harmful in the long term.
Until mainstream practice catches up with the reality that our environment, including our microbiota, is part of the properly functioning human organism, we're going to cause a lot of new problems for ourselves by trying to fix the "old" problems.
Errm, you are aware that people got these infections, and died in droves from them, long, long before the age of antibiotics, or before anyone knew that germs exist, let alone that they're "bad"?
Who is claiming that microorganism-involved disease was nonexistent before the "war on germs"? My claim is that in light of what is now known about the human microbiome and its role in immune function, many common practices for "fighting germs" can be predicted to have harmful effects, particularly in the long run.
Do you think these practices are that effective? One of scythe's points is that the ESKAPE bacteria are ubiquitous, in people and the environment.
Sure, taking an antibiotic deranges your microbiome, but doesn't it seem to readjust most of the time?
My specific point is: exactly what observed phenomena are you claiming this explains? Until you have that, or a solid, specific hypothesis predicting specific harms, I suspect the rest of us will continue to use antibiotics when we get otherwise likely lethal infections. I brought up "microorganism-involved disease" because I'm not aware of any significant changes in how such diseases work, play out in the human population, etc., besides the obvious.
Certainly, if you have an "otherwise likely lethal infection," take the antibiotics. If they screw up your immune response down the road, that's still better than dying now. To be clear, I'm not talking about this specific case but about the general practice of fighting what are believed to be "bad germs" using techniques that can be expected to screw up the human microbiome and, as a result, weaken the body's immune function, making it more susceptible to exploitation by opportunistic pathogens.
As scythe wrote,
> That is to say: the six bacterial strains ... carrying the most severe antibiotic resistance and responsible for the worst infections are the very same strains present all around us naturally. However, in every case except S. aureus, infection is basically unheard of when an individual has a functioning immune system; they are all opportunistic pathogens. [Emphasis mine]
Right now, the emphasis in medical practice is on attacking the "bad germs." What is mostly ignored is cultivating robust immune function, which, as scythe points out, would mostly prevent opportunistic pathogens from doing harm in the first place. That's probably because, until recently, nobody knew much about improving immune function. But now, with a better understanding of the human microbiome, it's becoming hard to ignore that the techniques used to fight the bad germs (e.g., broad-spectrum antibiotics) probably also have harmful effects on the human microbiome and, as a consequence, on immune function. So the short-term fix is likely to cause long-term problems as well.
To give you an example of what I'm concerned about, let me share what happened to my wife a few months ago. She had a rash on her lower leg, near her ankle, and went to see her general practitioner. Her GP didn't know what the rash was but nevertheless prescribed an anti-itch cream and -- surprise! -- a course of antibiotics, "just in case." (It turned out that the rash was mild poison ivy, so the antibiotics would have done nothing to help.) Many practicing doctors use broad-spectrum antibiotics like this, as a general prophylactic for all manner of minor ailments. Lots of people are having their microbiomes decimated for very little benefit.
> Sure, taking an antibiotic deranges your microbiome, but doesn't it seem to readjust most of the time?
By what mechanism would eliminated strains be repopulated? And if they weren't repopulated and your microbiome didn't "readjust," how could you tell? After all, even if such derangements were permanent, we would not expect to observe them until they were challenged and exploited by opportunistic pathogens. It could be, therefore, that such derangements have an invisible cumulative effect: the more screwed up your microbiome becomes, the more at risk you are, yet you remain unaware of it until you get sick.
The medical community is only a small part of the problem. Patients demand antibiotics, people don't finish their prescribed courses of drugs or take them incorrectly, and TVs yell at you to sterilize things, buy harsher drugs, etc. What are the first things people do when they go to see a relative in a hospital? Give them a kiss, pat them on the arm, touch stuff, etc. It's not so much a medical problem as a human-nature problem. I disagree that we have made the world less safe; do you have a source showing that human deaths from infection are rising or are higher than pre-antibiotics? MRSA death numbers (but not death rates) have been falling, according to the link below.
http://m.guardian.co.uk/news/datablog/2012/aug/22/mrsa-relat...