Antibiotic resistance: The last resort (nature.com)
200 points by dn2k on July 27, 2013 | 123 comments



Things like this make me increasingly sad for our priorities as a society.

I realize that not everyone can be a biochemist and do antibiotics research, but we've reached a place where our most technologically savvy people are frittering their talents on food delivery services and cat videos. I have an advanced degree in biochemistry, and it's still basically impossible to get work doing antibiotics development. There's no money in it. I work on websites, because that's where I need to be to earn a living. If I could get venture capital to do speculative work developing new classes of antibiotics, I'd do that in a heartbeat.

What's the point? I don't really know. Rome is burning, I guess. Bring on the bread and the circuses.

(Postscript: $200M is considered a big initiative in this space. We spend BILLIONS on niche diseases: http://online.wsj.com/article/SB1000142412788732397500457849...)


Yeah, there is something that is regularly overlooked about developing antibiotics: new ones end up on a shelf as the new last resort. If you've got a timeframe (imposed by patent lifetimes) in which you have to recoup your investment, that is a shitty situation for a pharma company to put itself in. "It's for the public good" is not the kind of due diligence many shareholders like...


You can consider this problem in two ways. Either you see the need for increasing the patent lifetime - at least for certain drugs - or you consider this a weakness of a privatized pharmaceutical industry, and take it as an argument in favor of placing the government in charge of pharmaceutical developments.


I believe it shows the general insanity of the whole health and food-production systems.

As I understand the situation, antibiotic resistance isn't a generally useful adaptation - bacteria tend to lose it "in the wild". So the prevalence of antibiotic-resistant bacteria, in diseases that aren't transmitted human-to-human, like Staph infections, is a fairly direct result of maintaining a long-term, antibiotic-filled environment.


Yes and no. We do overuse (and mis-use) antibiotics, but as long as we use them at all, bacteria will have an evolutionary pressure to adapt to their presence. So while we can mitigate antibiotic resistance by reducing use, we'll only slow down the problem.

Antibiotics have changed the way we live, but the price of that change is that we have to invest in the arms race. We invest in all the other arms races, so why not this one?


"We do overuse (and mis-use) antibiotics, but as long as we use them at all, bacteria will have an evolutionary pressure to adapt to their presence."

Not necessarily a decisive pressure, however. For bacteria which primarily "prey" on human beings this would be true. For bacteria which are primarily endemic to the environment, the loss of fitness involved in antibiotic vulnerability is something of a drop in the bucket. Remember, this resistance has a cost in physiological adaptation too. Bacteria like Staph might be "happy" to subsist by resisting the common chemicals in a barnyard - unless we humans produce an environment where tolerating antibiotics gives the organism a natural bridge to spread through. Gigantic hospitals filled with humans pumped full of antibiotics, or huge feed lots filled with similarly dosed animals, come to mind.


When very bad outbreaks start to occur because antibiotics' effectiveness starts dropping precipitously, I think there will be progress. It will take a few years to get ramped up, but given modern technology I think there will be some interesting partial solutions fairly soon after that.

The Russians were particularly interested in this sort of thing during the latter part of the Cold War, as I'm sure the West was, and still is. They were looking at enhancing the immune system as an alternative to antibiotics or vaccines for agents that could not necessarily be fought with antibiotics or vaccines (i.e. biological terrorism). They also looked at using immune response as a bioterror target (e.g. something like the reaction to TGN1412 but using a transmissible bioagent instead of a non-biological agent.) They were keenly aware that the two problems were linked.

Obviously, immune response is bad at all times when there's no active threatening infection to fight, but needs to be maximized at least along certain paths when there's a threatening infection. That was a difficult problem with 1980's technology. I'm optimistic that it's not quite so intractable a problem, at least in cases of common bacterial infections typically treated with antibiotics today. More narrowly targeted immune system boosting seems plausible now that there's a somewhat better understanding of immune system interactions...


"enhancing the immune system"

Note that one possible flip side of this is auto-immune diseases.

"immune response is bad at all times when there's no active threatening infection to fight"

I don't think that's at all true. Our immune systems tend to work wonderfully most of the time, given that we're constantly bathed in potentially pathogenic bacteria. In this discussion scythe says of "the ESKAPE pathogens, which are the primary superbugs" that "in every case except S aureus, infection is basically unheard of when an individual has a functioning immune system; they are all opportunistic pathogens". I take that to mean that for a lot of people, temporary immune system dysfunction, e.g. due to other insults to the body, is a big factor.

I noticed on Wikipedia that of one of the ESKAPE bacteria, Acinetobacter baumannii (http://en.wikipedia.org/wiki/Acinetobacter_baumannii), it says: "Colloquially, A. baumannii is referred to as 'Iraqibacter' due to its seemingly sudden emergence in military treatment facilities during the Iraq War. It has continued to be an issue for veterans and soldiers who serve in Iraq and Afghanistan. Multidrug resistant (MDR) A. baumannii has spread to civilian hospitals in part due to the transport of infected soldiers through multiple medical facilities."

That, I assume, is largely due to gross trauma (and lots more soldiers surviving because of much better body armor (which even helps in non-combat accidents, I've read) and ever better treatment) ... and e.g. car crashes in the US tell us we're not soon going to be rid of that problem....


I realize that auto-immune disease is a consequence of an overactive immune system, and I thought I alluded to that in the last paragraph. I guess I don't believe that the only way to enhance the immune system is to make it more active in general, or in ways that would necessarily lead to increased instances of autoimmune disease.

I should have said "wherever" rather than "at all times when". I don't try to discuss this a lot, so I guess I'm not as precise at organizing my thoughts as I'd like to be. Thanks. Yeah, "superbugs" are disproportionately found at hospitals, and are garden-variety bacterial species; they cause problems because patients tend to have wounds or compromised immune systems, and because the frequency of antibiotic use at hospitals encourages thriving populations of resistant strains.


Ah, yes, I can see that now, "[m]ore narrowly targeted immune system boosting" is definitely an option. Although ... do we want everyone in the country/world taking the same "ImmuneBoost(TM)"...? (Not a particularly useful criticism, but still something to also think about.)

Especially if we can find ways sufficiently weighted towards bacteria. The major thing that saves us with them is their significant low-level differences from eukaryotic cells, e.g. those smaller and different ribosomes that provide a handy target for various antibiotic families (they're the organelles that synthesize proteins). Or their plant-style cell walls.

(I stopped paying attention to the details of immune system in the late '70s when I noted how fast we were learning about it, i.e. for me it's something to learn/review "on demand", so I can't make any really specific suggestions as to what to do or judge the feasibility.)


I'm a fan of distributed separately-evolving defenses, and identical ImmuneBoost(TM) given to everyone, that has the same effects in everyone, could I suppose create susceptibility to some HIV-style pathogen that triggers and/or hijacks the specific boosted aspects of the immune system. HIV is very careful (i.e. has evolved) to stay dormant for a long time... it probably didn't start out that way, and suffered as a result. Any sort of pathogen that targets the immune system has to be similarly careful; otherwise it suffers the same fate of all extremely lethal pathogens: killing the host population faster than it can spread.

Another promising research avenue is bacteriophages. I don't know why it's being pursued more in the former Soviet Union than it is in the West. I suppose even with targeted bacteriophages there's some risk that they could mutate and start targeting beneficial bacteria, which would be a problem.

A cute CS-inspired theoretical approach would be to throw variable layers at the problem. With genetic engineering, create bacteria that function as a supplemental immune system, but in a very narrow way; the bacteria would, possibly in collaboration with the host immune system, generate specific bacteriophages for any pathogenic bacteria they come into contact with. Obviously it wouldn't work against pathogenic viruses, but other than the (perhaps impossible) difficulty of having a cell able to generate novel bacteriophages, it doesn't seem so outlandish. If we get to the point where we can write code for a wide variety of biological activity and get it translated into DNA, then such a bacteriophage-generating bacterium might be possible.

I realize there are pathologies imaginable with any possible solution. The question is how likely is it that natural or engineered pathogens will target whatever new targets are available. Bacteriophage-generating bacteria could be hijacked by viruses, mutated by gene transfer with harmful bacteria to do bad things, or even give their bacteriophage-generating properties to pathological bacteria which might then gain mutations to generate human-targeting viruses... wouldn't that be fun.


> When very bad outbreaks start to occur because antibiotics' effectiveness starts dropping precipitously, I think there will be progress.

Hospital acquired infection is a significant cause of death.

> In the United States, the Centers for Disease Control and Prevention estimated roughly 1.7 million hospital-associated infections, from all types of microorganisms, including bacteria, combined, cause or contribute to 99,000 deaths each year.

> In Europe, where hospital surveys have been conducted, the category of Gram-negative infections are estimated to account for two-thirds of the 25,000 deaths each year.

(http://en.wikipedia.org/wiki/Hospital-acquired_infection)

These are just nosocomial (hospital acquired) and not necessarily antibiotic resistant.


Resistant infections in hospitals must not yet be a significant enough problem to get lots of smart people working on it. Or, rather, they're working on trying to contain infections and reduce transmission rather than on how to treat the infections, right? The other shoe has to drop before the other half of the problem gets serious attention.


Interesting. I had a chronic bacterial infection lately and was thinking of going into biochemistry, as there was no cure on the market for my infection. Surprisingly, honey did a lot more than any class of antibiotics that I tried.

Can I reach you by email?


I use honey along with something called the specific-carbohydrate diet to control (but not eradicate) an infection in my gut.

Certain honeys -- and not particularly the more expensive ones -- are much more antibacterial in this regard than others.


I also thought of natural antibiotics like honey or garlic as substitutes, but I also thought that they would only help with gut infections (e.g. no use for an infected hip joint after a transplant) - is that not so?


I'm using it on my eyes and it's helping my blepharitis more than steroids and antibiotics did. I don't have deep biochemistry knowledge, but I guess they already tried to find out the antibiotic properties of honey/garlic and it didn't help much in real scenarios.


> I guess they already tried to find out the antibiotic properties of honey/garlic and it didn't help much in real scenarios.

Speaking more generally, there is limited interest from big pharma in the use of natural or herbal products because they will have to spend a ton of money on clinical trials to prove effectiveness, but they won't get the patent protection necessary to recoup the cost of R&D, like they would with synthetic drugs.


I've dreamed about doing an app for crowd-sourced clinical trials. Obviously something like this wouldn't have the level of control you get from having participants under direct supervision, but it could be close to free if done right.


They just change one atom in the big ole organic molecule that doesn't change the effectiveness, and boom: new and patentable drug.


Presumably you'd then have many more possible patentable, innocuously "different" molecule patterns, ruining the competitive edge.


Just follow the money. There are plenty of research dollars for cures for diseases that afflict rich people. Just claim that you're targeting cancer in some way and cash the checks.

(At least that's the way it was when I was still in the field.)


Allow me to assure you that saying "I just program websites" is selling yourself short.

At the VERY least I urge you to design novel experiments and post them publicly.


> Davies, the United Kingdom's chief medical officer, described CREs as a risk as serious as terrorism

This is a descriptive strategy people need to use more often to put things into perspective for the average person and taxpayer. Many diseases pose risks vastly more serious than terrorism, but receive far less attention.

However, if you're aware of the actual risk of terrorism, something that is "as serious" as terrorism is really not very threatening.


That's part of it. But the more important part is that "black swan" events can have serious repercussions on people's perception of risk. If we had the 9/11-equivalent of an antibiotic-related problem - unexpected, sudden and with thousands of deaths - then the entire country would start pouring billions into making sure it doesn't happen again.

edit: It is my belief that most problems don't get treated as "real" in people's minds unless they are two degrees of separation or less from someone who is affected.


Yes. When I read that I thought "so, over-hyped and not really a problem, then".


The predicted 1/6 death rate for routine surgeries made a big impression on me. A literal roll of the dice. Think of everyone you know who's had a surgery, and it's obviously a radically bigger problem than terrorism. I'd like to see a comparison to death rates from common cancers or heart disease that some people actually worry about.


That might be, shall we say, a somewhat short-term problem as the population is culled of less hardy individuals. When I think of the Mayo brothers using the Murphy Button zillions of times to splice together someone's small intestine, I realize that a lot of dangerous surgery was done with enough success in the pre-antibiotic era. One my parents, born in the mid-30s, grew up in; their attitudes towards sickness and death are markedly different from those of younger people who e.g. wouldn't have survived without antibiotics.


Please, do realize that if resistances were free, we would all be resistant already. We lived with those infections far longer than we did without. TANSTAAFL applies here as well, and even if the evolutionary pressure gets high enough to cause change (which will be far, far later than you'd like it to be, i.e. serious social trouble probably has a significantly lower threshold), it will be at a cost that is hard to predict, though likely it would be lowered resistances to something else, or maybe a higher risk of autoimmune diseases. Or something completely different.


For those like me: There Ain't No Such Thing As A Free Lunch.


Infectious disease has been a huge killer of humans since humans existed. The idea that this would be a "short term problem" makes no sense. If evolutionary pressures could handle it, there would have been no need for antibiotics in the first place.


Errr, I meant to say there would be a high rate of less "hardy" people dying off for "a somewhat short term" until we returned to the pre-antibiotic rate.


I hope your original comment was tongue-in-cheek - the pre-antibiotic world was a pretty bad place by modern standards.


I thought that was the pre-antibiotic rate.


  > That might be, shall we say, a somewhat short term 
  problem as the population is culled of less hardy individuals

You know what the real problem is? The culling will most likely happen in areas where people have no experience fighting bacteria on their own, i.e. the most developed parts of the world; that means that a LOT of progress will be lost. Those who read this site are among those most likely to be culled.

Anyway, silly question: if bacteria are so good at adapting to antibiotics, can't we create some kind of mold that can adapt to the bacteria? E.g. take bacterium A, spread it to N + 1 Petri dishes, put some mold in N of the dishes and select whichever does best. Then take the last remaining dish and create a new set of N + 1 bacteria dishes.
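
For what it's worth, a toy Python sketch of the serial-passage loop you're describing (pure illustration: the numeric "traits" and the kill_rate scoring are made-up stand-ins for whatever lab assay you'd actually run, not real biology):

    # Toy sketch of the serial-passage idea above: each round, challenge the
    # current bacterial strain with N mutated variants of the current mold,
    # keep the variant that suppresses the bacteria best, and let the bacteria
    # from the mold-free (N+1th) dish keep evolving for the next round.
    import random

    N, ROUNDS = 8, 20

    def mutate(trait, step=0.1):
        return trait + random.uniform(-step, step)

    def kill_rate(mold, bacteria):
        # Made-up scoring: the mold does best when its trait "tracks" the
        # bacteria's current resistance trait.
        return -abs(mold - bacteria)

    mold, bacteria = 0.0, 0.0
    for r in range(ROUNDS):
        variants = [mutate(mold) for _ in range(N)]                # N dishes with mold
        mold = max(variants, key=lambda m: kill_rate(m, bacteria)) # pick the best variant
        bacteria = mutate(bacteria)                                # the mold-free dish
        print("round %2d  mold=%+.2f  bacteria=%+.2f" % (r, mold, bacteria))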


Bacteriophages are pretty good at this. And used in some countries.


The problem with bacteriophages is that they require identifying the specific infection, and then identifying a bacteriophage that is suitable for just that infection.

In comparison, part of the reason we're in this situation in the first place is that doctors can't even be bothered to figure out which antibiotic is suitable, or whether or not you actually have a bacterial infection, but often just prescribe first-line antibiotics and tell you to come back if you're still ill afterwards. That means 5-10 minutes per patient in many places.

If we were to try to switch to bacteriophages, on the other hand, it'd mean blood tests and lab work to identify the specific infection and strain for every patient, with resulting massive cost increases.

In its current state, it's a last way out, but it's far from being a viable large-scale replacement for antibiotics.


In plenty of cases doing lab work is impractical, and doing it the old-fashioned way that makes sure (a culture) takes more than a day; plenty of times doctors don't have that long. Plus there's the competitive pressure: people fire doctors who don't give them the antibiotics they demand.

WRT the first problem, starting in 1995 I got recurrent sinus infections (which are now mostly prevented with a couple of nasal sprays). A sinus biopsy is not practical, so instead my doctor, knowing what locally worked, gave me one particular antibiotic. Later, a standard formulation of that plus an anti-β-lactamase drug. Then the one that still works (from a different β-lactam family), but I need 30 days of it.

When I was unemployed and only had catastrophic medical coverage, I got a full course from samples of a new, lipophilic drug that builds up a concentration in your fat, so you don't need to take it for as long, and the constant dose no doubt helps. But that one failed for the next sinus infection and I was back to the 30-day one....

Errm, when it's pretty clear the patient has a bacterial infection, why isn't it often a best practice to give a first line drug while doing a culture? Strep throat is the classic example I can think of from when I and my siblings got that in the '70s ... of course, although I don't know if it was realized by then, giving an antibiotic for it is mostly prophylactic, to avoid rheumatic fever. I have a very good friend who's about a decade older than me; when she got that, her mother completely freaked out, because prior to right then it was a death sentence. Fortunately penicillin was just making it out into civilian use....


Oh, absolutely, I didn't mean to suggest it's practical to always do lab tests, though I can see how it'd seem that way. It's an unfortunate and at least partially necessary tradeoff, not just because it's impractical to do lab work for every patient, but also because it's prohibitively expensive.

I meant it largely to contrast it with bacteriophages, where we don't get the choice. Antibiotics "won out" to a large extent because alternatives like bacteriophages are tremendously more expensive and complicated compared to the practice of handing out antibiotics without blood work. If we took all available precautions to prevent resistance, then we'd face much reduced risks of antibiotics resistance, but the cost gap would reduce accordingly as well.

Antibiotics are so important exactly for the reasons that give us resistance: It is cheap, easy and fast for all but the unfortunate cases that run into resistance.

> Errm, when it's pretty clear the patient has a bacterial infection, why isn't it often a best practice to give a first line drug while doing a culture?

It often may be. The problem is the cases where antibiotics are given when it isn't even clear if it's a bacterial infection - in the UK, for example, doctors will often give antibiotics "just in case". The second problem is that this means antibiotics will quite often also be given in situations where the bacteria in question will not respond to the treatment, or will survive in sufficient numbers to cause the infection to rebound and leave the patient walking around potentially spreading an infection with increased resistance to the first-line antibiotics.

Again, it is a tradeoff that sometimes is necessary, and very often the best recourse for that specific patient. But that doesn't mean it doesn't contribute to increasing overall levels of antibiotics resistance and that some of these decisions negatively affect society as a whole.

Incidentally, if/when we can find technologies to speed up precise tests for specific strains, bacteriophages might become suitable for more situations and we might be able to limit the use of antibiotics to the situations where we can't risk waiting.


Don't they have to be stealthy enough to avoid immune detection, but not so stealthy that they mutate into a virus that bypasses the human immune system?


Surgical death rates are already a radically bigger problem than terrorism, particularly in so-called western countries. Without looking it up I'd say with confidence that most causes of death were statistically more likely than death from a terrorist act.


More graphically, imagine you took a game of Russian roulette before every operation.


Something else that's scary is drug-resistant gonorrhea, which has been found in Japan. (http://www.reuters.com/article/2011/07/11/us-gonorrhoea-supe...)

I've posted this clip before, but in case you missed it:

(http://v6.tinypic.com/player.swf?file=24goih4&s=6)

Here's a short snippet from a BBC Television programme (Horizon - 'defeating the superbugs') (http://www.bbc.co.uk/programmes/b01ms5c6).

It shows E. coli developing antibiotic resistance. There's a tray of nutrient jelly. The jelly is divided into sections. It starts with no antibiotic. Then there's a normal dose. Then there's a 10x dose, followed by a 100x dose, followed by a 1000x dose. The limits of solubility are reached - they cannot dissolve any more antibiotic into the jelly.

Then they drop E. coli onto the normal jelly, and use a time-lapse camera to show the growth.

After just two weeks the bacteria are able to live on the 1000x-dosed jelly.

It's a pretty impressive demonstration.

(Apologies for the suboptimal hosting site. YouTube's contentID blocks this video worldwide.)


Thx so much for the video.

I am very confused though ... the documentary implies that the bacteria are developing the immunity through evolution ... over millions of cell divisions. But how can this be happening so fast? If a bacterium were countering the antibiotic via an enzyme, it would take an insane amount of time to develop that enzyme by pure chance. It is more likely that some of the bacteria already had the anti-antibiotic enzyme and we're seeing selection in progress. Can someone please please explain?


When I was doing some molecular genetics on E. coli in the summer of 1977, I was told that for many antibiotics, 1 in a million of them would spontaneously mutate and develop resistance if exposed, and 1 in a billion for streptomycin.

Note that there are many methods of antibiotic resistance, including little pumps that try to keep the level low enough inside the bacterium. They've been developing them for a very long time; the other insight you need into this is the ecological one.

Most antibiotics are from molds, which release them to better compete with bacteria. And a resistant bacterial strain will not necessarily compete well with others lacking its mechanism(s), because those mechanisms are otherwise maladaptive. E.g. it's spending raw materials and energy producing a β-lactamase while its competitors are dividing more rapidly. Only when you add the selection pressure of the antibiotic does it win big.


There's also a strong role for horizontal gene transfer in antibiotic resistance, e.g. by swapping plasmids, which can allow bacteria to swap higher-granularity mechanisms, rather than having to evolve them from scratch with random mutations.


This feature is basically bacterial sex, and it turns their evolution from a slow serial computation into a highly parallel computation, giving an exponential speedup.

I think this effect is an extremely underappreciated benefit of sex/exchanging DNA.
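
A toy Monte Carlo of that intuition, purely as illustration (the population size and the mutation/swap probabilities below are made up, not biological parameters): K different resistance genes have to end up in one cell; without transfer a single lineage has to collect them all itself, while plasmid-style swapping lets genes that arose in different cells be combined.

    # Toy illustration only: how many generations until some cell carries all
    # K resistance genes, with and without horizontal gene transfer.
    import random

    K, POP, P_MUT, P_SWAP = 4, 200, 1e-3, 0.05

    def generations_until_resistant(allow_transfer):
        cells = [set() for _ in range(POP)]
        gen = 0
        while not any(len(c) == K for c in cells):
            gen += 1
            for c in cells:
                for gene in range(K):
                    if random.random() < P_MUT:
                        c.add(gene)                              # rare spontaneous mutation
            if allow_transfer:
                for c in cells:
                    if random.random() < P_SWAP:
                        donor = random.choice(cells)
                        if donor:
                            c.add(random.choice(sorted(donor)))  # plasmid-style swap
        return gen

    print("no transfer:  ", generations_until_resistant(False), "generations")
    print("with transfer:", generations_until_resistant(True), "generations")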


It's a good question. Maybe that is what they are trying to figure out?

I've heard suggestions that the mutation rate can go up in a high-stress environment, perhaps that could be related?

I hope the HN research effect kicks in and someone finds and summarizes the research on these questions.


Amplifying on my answer elsewhere in this subthread (and I now see mjn's comment): it's complicated. Simple mutations only take you so far; they change something that's necessary for the antibiotic to work. E.g. a surface protein that brings it inside gets trashed, or an enzyme or structure that it works against changes enough to be resistant while still functioning well enough for survival.

The nastier stuff, like β-lactamases---enzymes that destroy the β-lactam ring that is the "active ingredient" of so many antibiotics (and the fact that it's used by so many families that humans can tolerate is telling)---are obviously a lot more sophisticated. Bacteria developed them to compete with the molds that produce β-lactam antibiotics, and they spread from one species to another, especially in plasmids, as the article notes.

A great deal of this is not new mechanisms of resistance being developed, but long-existing ones becoming prevalent. Much as in the good old days a drug company could make its money back bringing an antibiotic to market and would scour the globe for molds that produced them, today improper use of antibiotics (e.g. not switching to one with a different mechanism quickly enough, especially likely in places like India where they're in practice available over the counter) plus widespread global travel is allowing the best resistance mechanisms to survive and sort of thrive.

Not thrive in most environments of course, just in humans who are under the selection pressures of antibiotics.


>That means, say infectious-disease experts, that their best tools for defending patients remain those that depend on the performance of health personnel: handwashing, the use of gloves and gowns, and aggressive environmental cleaning. Yet even research that could improve best practices has been short-changed, says Eli Perencevich, an infectious-diseases physician and epidemiologist at the University of Iowa in Iowa City who studies how resistant bacteria move around hospitals. “We haven't invested in research in how to optimize even standard infection-control practices. We just blame the health-care workers when they go wrong.”

It seems that a possible positive outcome of this could be cleaner hospitals. No infection is better than a treatable infection. Even if CREs are controlled, even if new antibiotics are developed, these outbreaks will keep happening and resistance will keep developing. Developing effective yet practical hygiene procedures is the only way to solve the problem once and for all.


It's harder than just "cleaner". The bacteria killed in disinfection procedures tend to leave space for more horrifying stuff that's also more resistant – you won't really find http://en.wikipedia.org/wiki/Staphylococcus_aureus on toilet seats outside of hospitals, unless you went and used Vircon on them.


As a surgeon I know says, "the last place you want to be if you're sick is in a hospital."


Would introducing probiotics after disinfection do anything?


It's complicated, and quickly gets outside of my knowledge (I was only mildly involved in hospital stuff), but one big problem is that the same bacteria can be beneficial or lots of trouble depending on where they are, and you may be reintroducing exactly the same thing you just killed (think E. coli).


The best type of "cleaner" here is soap. If bugs do become "resistant" to washing off, then you still have your disinfectants and antibiotics.


I think that pure NaOH works better. It is almost impossible to have a cell wall not made of some kind of fatty acids. I doubt anything can survive such a strong base.


> Initially, most individuals carrying bacteria with the new resistance factor had some link to clinics in India

One big problem is that in India, lack (or poor enforcement) of regulations results in abuse of antibiotics, which leads to the development of resistance. People are also much more susceptible to infectious disease, due to poor public health policy (lack of clean water, etc.).


This is a big problem in SE Asia too. Here in Vietnam I don't think I've ever seen anyone come back from a visit to the doctor without antibiotics, and pharmacies dole them out like candy without prescriptions. What's worse is that they seem to always provide only a day or two worth of pills, which in most cases isn't enough to do anything.


OMG, that's terrible! In Belgium you are told to consume the entire package, even if you have been cured. And it is not easily prescribed unless you can't seem to beat the infection on your own. (At least with my doctor.)


How on Earth can this happen? Doesn't Vietnam have trained doctors? This practice is ridiculously dangerous. Why?


I was on a Greek island a few years ago (Korfu). A friend of mine walked into a drugstore, told them he had a cold, and walked out with some broad-spectrum antibiotic. No questions asked.


There are competent doctors here but my impression is that the typical level of training is pretty low.


A quick check with Google (indian pharmacy antibiotics) confirmed my understanding that they're basically unregulated (http://www.nature.com/news/india-moves-to-tackle-antibiotic-...):

"[...] antibiotics are readily available over the counter at pharmacies...."

"Last week, physicians moved a step closer to their goal when India’s drug regulators announced a plan that would put tight restrictions on the sale of antibiotics. Carbapenems and many other antibiotics are already on a list of 536 drugs in India that require a prescription. But studies have shown that such drugs are easy to purchase at retail pharmacies without a physician’s signature...."

This article from a bit less than a year ago then says the new measures will be adding red labels and surprise inspections....

Lots more hits where that came from, including plenty from India.


So under what circumstances does it make sense to embargo antibiotics to health systems that don't use them properly?


If they're major producers of pharmaceuticals, like India, I don't see how you can. You'd have to engage in some really serious and smart economic warfare (or worse), and it would be hard to avoid stomping on other drugs. And it doesn't take all that much to produce many basic antibiotics, they are after all originally from molds. Lots are "semi-synthetic", modified from the base version to have different characteristics.

And the usual suspects would scream bloody murder ... it's a difficult question of more people dying today to prevent even more people dying tomorrow, something people are generally bad at dealing with until the 11th hour, when it's often too late.


Many years ago, I remember seeing a BBC documentary about phages that were (and are) used in Russia and former Soviet states in place of antibiotics.

What are phages? "Phages are naturally occurring viruses that kill bacteria. Once they get into bacterial cells the phages' DNA replicates until it kills the host.

Doctors in Georgia, and in other countries that were in the former USSR, have been using this therapy for 90 years. But medics and drug regulatory bodies in most places in the developed world have been reluctant to accept that it works."

From: http://www.bbc.co.uk/news/health-21799534


Too bad they were largely abandoned because antibiotics were cheaper and easier to produce (a situation similar to electric vehicles in the early 1900's, in fact)...


It wasn't just that. The problem with phages is it's impossible to make enough for a therapeutic dose without introducing mutations, so you're never completely sure what you're giving the patient. They're just not as safe as conventional antibiotics.

We'll probably end up using phages because we won't have any other choice. But that's not a step forward for patients.


I'm not sure this analysis is accurate. At any rate, I don't share your concerns.

The big problems with phage are:

   1) Narrow spectrum
   2) Readily cleared by the immune system
   3) Bad at killing bacteria in comparison to antibiotics.

I worked with a strain of Phi X174 that was optimized for a particular host by serial passages. It did a number on Petri dishes, but I'm skeptical that it could have done much to clear wild-type E. coli. Antibiotics are good and bad because they indiscriminately kill everything. If phage are ever used in a widespread manner, I imagine they'd get deployed in a cocktail to broaden the spectrum.

It's easy to make highly pure phage. Colonies are clonal and form plaques when plated. If you did serial passages to ensure that the phage was near some local fitness maximum, it's unlikely that selective effects would move it away from that maximum. In Phi X174, which admittedly has a short genome (5386 bp), I would have been surprised to see any clones with mutations. I was doing site-directed mutagenesis to change single base pairs, so I thought about this a lot. Worrying over a couple of base flips when mass-producing phage is simply a hallucinatory fear.

At least one phage is dangerous, but I'm not aware of any other examples. CTX is a temperate phage that makes cholera produce toxin. There is a risk of immune response. The immune system tends to get pissy when you inject foreign matter. I've seen studies that compared oral dosing to intramuscular phage and IM seems to be the way to go for higher blood titers. In one study, serial passages were performed in rabbits. The phage acquired mutations in its capsid proteins and over several generations blood titers increased by something like two orders of magnitude. (I don't recall if they used the same rabbits or naive rabbits.) Makes you wonder if we could recover super fit phage from people receiving phage treatment?

Finally, with regard to safety, phage are currently in use as an antimicrobial against Listeria.


Is "highly pure" really pure enough for the FDA? I knew a guy who worked on phages. The problem they ran into is when they submitted their virus to the FDA there was no way they could guarantee the absence of dangerous mutations in a given batch.

EDIT: I should say I didn't follow his work that closely, so it's possible there were efficacy problems as well.


The bar has certainly been met with regards to phage as a food additive: http://www.fda.gov/OHRMS/DOCKETS/98fr/cf0559.pdf

The error rate of an E. coli DNA polymerase in vitro is <9x10^-6. Lambda phage, for example, has a genome that's about 48 kilobases in length, which gives a mean error rate of 0.432 bases per replication. Ballpark estimate that the phage goes through 30 replications to produce a plaque, so we're talking ~13 mutations (it's a little different than that due to the founder effect, etc).
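
Spelling out that back-of-the-envelope arithmetic (same ballpark figures as above, nothing measured):

    error_rate = 9e-6      # errors per base per replication (the in vitro upper bound cited)
    genome_len = 48000     # lambda phage genome, roughly 48 kilobases
    replications = 30      # rough guess at replications needed to form a plaque

    per_replication = error_rate * genome_len   # 0.432 errors per replication
    total = per_replication * replications      # ~13 mutations per plaque
    print(per_replication, total)               # 0.432 12.96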

It isn't like a phage is going to jump kingdoms and start preying on you instead of its preferred host. This simply isn't a realistic risk, while the immunogenicity issues could be.

In terms of my own risk tolerance, I would have been comfortable drinking any of my own samples.


>In terms of my own risk tolerance, I would have been comfortable drinking any of my own samples.

Hmmmm. Maybe so, but would you inject them?


If there was a medical need and they were shown to be efficacious, sure.

I'm currently receiving allergy shots which involves frequent injections of dust mites. Injections are followed by a 30 minute observation period as a caution against anaphylaxis. Staff are on hand to intubate. The frequency of life threatening reactions is around 1 in 10^6.

If phage were ever used clinically, I suspect the protocol would be similar. I mean, heck, some people have life threatening reactions to common antibiotics (I believe sulfa allergies are particularly common), but that doesn't keep them out of use.


The lack of advancement in antibiotics is a stark contrast to the recent improvements in treatments for so many chronic diseases.

Whatever the reason that free-market pressures here are so weak, antibiotic development should consequently be a priority in government grants. I wonder if there could even be intergovernmental cooperation here?


Huge costs and virtually no gain. A good antibiotic is used for only a short term. Furthermore, if things go on as they are today, it will probably be destroyed within years by short-sighted, irresponsible use like fattening cattle. Any pharmaceutical company suicidal enough to start development with those odds would probably have put itself out of business long ago already.

This is an existential threat in many ways. Infections we shrug at today are going back to being life-threatening. And it isn't just infectious medicine that's losing power. Surgery, for example, gets vastly more dangerous, difficult and expensive if there aren't effective antibiotics; in some cases downright impossible.

I think you are right about public funding for the research and expect to see it rise. Where I would really like intergovernmental cooperation is in making sure the new antibiotics are used in a responsible, sustainable way. Routine use as an agricultural food additive should be a felony as far as I'm concerned, and even when they're used for curing serious disease it should be a high priority to make sure courses are completed and the risk of transmitting resistant bacteria is minimal. We have to protect antibiotics effectively if there is to be much point in developing new ones.


"Furthermore if things go on like today it will probably be destroyed within years by short-sighted, irresponsible use like fattening cattle."

Which of course explains the widespread use of vancomycin for that purpose.

Seriously, while this use of antibiotics is problematic, there's no evidence it's a big problem today---if there was it would be written in flaming letters across the sky.

"Where I would really like intergovernmental cooperation is in making sure the new antibiotics are used in a responsible, sustainable way."

How does this work? What happens to a bad actor who doesn't care or just can't get its act together?


> How does this work? What happens to a bad actor who doesn't care or just can't get its act together?

Governments could refuse entry to people who've visited that country.


And do what at the borders, many if not most of which are porous?


The vast majority of people moving between countries pass openly through airports or border checkpoints where they could be turned back. Yes, there are illegal immigrants and drug traffickers who sneak across the desert or dodge the Coast Guard in motorboats, but they represent an insignificant fraction of total human traffic and opportunities for carrying antibiotic-resistant disease.


The FDA is already creating incentives for antibiotic research.

http://www.fda.gov/Drugs/DevelopmentApprovalProcess/Developm...

Based on the seriousness of the infection being targeted, the FDA is creating accelerated approval pathways. There has already been some discussion about guaranteeing a certain amount of profit for the pharmaceutical company once it hits the market.

It's a start.


Markets are quite bad at developing any kind of reserves. So in the case of antibiotics: we have the antibiotics we need at the moment (at least mostly). A new antibiotic would be quite expensive to develop, and then it would just sit on a shelf. Thinking about it, it is quite like the banking crisis in 2008: there was no secondary banking system, because that would have operated at higher cost and therefore been kicked out of the market.


And your single first sentence (this was drafted before you added your second paragraph) hints as to why:

Chronic diseases cannot generally be "cured", just controlled, so patients need a lifetime supply of a drug.

A new antibiotic will only be used in the direst of cases, like vancomycin as mentioned in the article.

It costs so bloody much to get a drug to market that drug companies tend to chase the former, where they can in theory get their money back (that's actually not tending to happen overall), whereas the latter could bankrupt them. Asking the government or charities to play this game is asking for politics to further enter into it, the outcomes are not likely to be good.


I absolutely agree that the financial incentives are mismatched, with private companies less interested in trying to cure or improve underlying medical problems than in treating/controlling symptoms.

"Asking the government or charities to play this game is asking for politics to further enter into it, the outcomes are not likely to be good."

I'm curious where this die-hard free-market willful blindness comes from. I object to the categorical classification of government or charity being bad choices to solve problems.

The nature of the specific market under discussion makes all the difference in the world. The nature of the administrative path from funding to research & development also makes a difference. There can be efficient allocation of funds in government and in private sector charities, just as there can be inefficient allocation of funds due to bad management in private companies devoted to solving the same problems.

Free marketeers have effectively redefined "government run program" to mean inefficient... it may be that most of the time, but there is nothing inherent about government programs that demands inefficiency. There's a similar stigma regarding charity. Those are almost effects of free-market thinking: if everyone is convinced that government and charity are poor ways to solve any problem, then government and charity are left with incompetent administrators, bloated management structure, and poor oversight.

Since this problem is identified as one that the private commercial sector is poorly motivated to solve, the question is not whether public sector or charity should attempt to tackle the problems -- it should -- but how to structure charity or government programs so as not to waste the money they are given to address the problem.


When you talk about efficiency you're going off on a tangent I'm not intending; in this case, I'm much more worried about political choices about which approaches and drugs to pursue, and politics messing with the safety and efficacy testing. Also, anyone thinking there is anything like a "free market" in pharmaceuticals is simply not paying attention.

On the most general principles, I claim politics are a given in organizations of the scale we're talking about. My special claim about governments and charities is that for-profit companies have a governor: if they don't make a profit, they'll suffer and eventually die. WRT my testing point, I also don't want the same entity having a large hand in both developing and testing drugs. I'd also want to look at the current and past roles governments have had in pure drug development; government certainly can do amazing things, like the wartime push for penicillin mass production, but then again that's not the sort of problem we're having today with antibiotic resistance.

"Since this problem is identified as one that the private commercial sector is poorly motivated to solve...."

Which has a whole lot to do with the government's regulation of pretty much everything involved in getting a drug to market. Not everything, certainly; it's hard to develop these sorts of drugs, and I think it's telling how many of the families use β-lactam rings. So I claim that changes in laws and regulations are in theory a very fruitful avenue. In theory, because the demonization of "profit", especially in the context of saving lives, makes promises a government makes today highly suspect: "I am altering the deal. Pray I don't alter it any further."


So, continuing on the meta track, there are several cases when the private sector fails, which I'll try to categorize:

1. Government intervention screwing up what would work in a purely free market.

2. Markets that free enterprise is not equipped to handle. Massive front-end costs combined with unpredictable or insufficient future revenue (for instance, reducing or eliminating externalities in a way that the company can't capitalize on renders something unworkable for the free market).

3. Markets with a pathological condition: demand is inelastic, and there are less-desirable substitute goods, one of which is vastly more profitable.

4. Markets that are monopolies or oligopolies: where there is insufficient competition.

Is it even productive to engage in a discussion focusing on category 1, when pharma has issues in all four categories, and reducing government regulation in any way even if it's merely to reduce unnecessary up-front drug development costs (which I agree is a noble goal) is likely to make some of those other pathologies worse?

edit to reduce clutter: fair enough.


Obviously we have grave disagreements here, enough in my opinion that I don't think it's productive for me to reply further. But I'd add at least a "5": too many monopsonies (http://en.wikipedia.org/wiki/Monopsony), e.g. nation states negotiating directly with drug companies, and insisting on prices barely above the costs of goods (well, I'm sure insisting on less, but settling on that).


>The lack of advancement in antibiotics is a stark contrast to the recent improvements in treatments for so many chronic diseases.

I'm curious to know what diseases you're talking about here. I don't see much improvement in this area at all.


Another reason (which I wish I had known) to treat antibiotics as a last resort is Clostridium difficile (C. diff). In addition to killing the bad bacteria in your body, antibiotics will also kill the good bacteria in your gut. This in combination with exposure to C. Diff (rampant in US hospitals) is a very bad thing.

When the good bacteria in your gut are killed, it frees up real estate, giving any C. diff present an opportunity to overgrow and wreak havoc. If this happens, the road to recovery can be very long. And to add insult to injury, it takes bleach to kill any C. diff you happen to spread to surfaces in your home, and this often results in patients re-infecting themselves repeatedly.


I talked to the sellers at an estate sale, and found out the deceased, a lady in her 60s, had been killed by C. diff. Took about four days.


My Mom was, I guess, sort of a hippie ahead of her time, but she used to always rail against anti-bacterial soaps and the like. She used to resist using the dishwasher and anything anti-bacterial, and would say that it was good for your immune system to be exposed to germs once in a while. The joke in the family was that she just did not want to have to do too much housework, but actually, she may have been on to something.


A microbiologist at our local university was interviewed on our public radio station a couple years ago. He pointed out that antibacterial soaps may kill 99.5% of the bugs but the ones left are the resistant ones, with an adaptation that also makes them resistant to some human antibiotics. Then with the other bugs killed off they'll have a nice open field to spread.

A schoolteacher called in and said "I have to use antibacterial soaps, you have no idea what I have to deal with in my classroom." The guy replied, "You can't get rid of the bacteria. Your choices are a surface covered with normal bacteria, or a surface covered with antibiotic-resistant bacteria."


As laypeople we seem to think about this issue much like programmers think about algorithmic efficiency ... who cares if my code is messy, slow and non-optimal, all I need to do is wait a few years for CPU speed to double and nobody will notice.

i.e. most people think, just wait a few years until the scientists develop a better antibiotic. Cat and mouse.


Just out of curiosity does anyone know of research proving or disproving the viability of using 'safe' bacteria to defend against dangerous bacteria?

I've often wondered, since we've selected for superbugs by killing off the easy-to-kill ones with antibiotics, couldn't we intentionally reintroduce the easy-to-kill ones (or something similar but benign) so that the superbugs have no population advantage? Instead of being the only guys on the playing field, they'd have to compete with the benign bacteria, which would, in my thinking, limit their growth potential.


That's interesting. We also could revive phage therapy which uses bacteriophages (viruses that replicate in bacteria).

http://en.wikipedia.org/wiki/Phage_therapy


Not quite the same thing, but do a Google search for "fecal transplant".

Patients who have undergone antibiotic therapy will sometimes get infected with C. difficile bacteria. It's very hard to treat and can sometimes cause toxic megacolon.

If you take the feces of a healthy individual and transfer it to a C. difficile patient, the good bacteria will crowd out the bad bacteria and cure the patient.

The FDA just released some guidance on the procedure.


Yeah, I know about that, and I'm not talking about intestinal flora, but rather infectious disease. Nevertheless, it is similar logic to what I'm suggesting.


We've been doing the probiotic thing for a long time; I can remember being fed not-in-the-least-tasty bacteria granules stored in foil wrappers in the fridge in the late '60s. But outside of the gut, how could you do this? Skin isn't going to work, not with it being exposed to zillions of bacteria in the wild. Of course, those should be wild-type and not necessarily antibiotic resistant.


Reintroduce them where? Into my abdomen during my appendectomy?


Exactly so. Maybe, maybe not during surgery (depending on how benign we're talking), but perhaps at the first sign of non-responsive infection. Stop the IV antibiotics, and start up the 'good' bacteria drip.

This may of course be totally silly, but I'm just wondering if anyone has done any research on it.


What seems surprisingly ignored by most of the medical community is that many microorganisms harmful to humans can thrive only in environments that have been artificially stripped of the normally dominant microorganisms that humans are mostly well adapted to. Instead of fighting an ever-escalating arms race against resistant microorganisms in some imaginary “war on germs,” maybe we ought to stop screwing with the normal microbial environment and start learning from it instead.

I mean, is it really so surprising that when you salt sliced cabbage and let it sit in a crock for a few weeks – no preservatives, no antibiotics – that what you end up with, after the ambient microorganisms have had their way, is not only perfectly preserved cabbage but also delicious? That fermented foods naturally arise when you let traditional foods “go bad” and that they taste so good ought to suggest something. It’s almost as if millions of years of natural selection in an environment devoid of modern preservatives and antimicrobial agents has made humans well adapted to (and even crave) the microbiota that naturally dominate the places where humans have traditionally lived and the foods that humans have traditionally eaten.

Taking us out of those environments makes our bodies and foods and hospitals competition-free zones for modern “superbugs.” Seems like a really bad idea.


It's all well and good until you're immunocompromised. S. aureus itself is commensal. Ditto E. faecium, K. pneumoniae and Enterobacter. Meanwhile A. baumannii and P. aeruginosa are widespread in soil. Together these represent all of the ESKAPE pathogens, which are the primary superbugs: Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter [all Enterobacter]. That is to say: the six bacterial strains [excluding Mycobacterium tuberculosis, guess what it causes] carrying the most severe antibiotic resistance and responsible for the worst infections are the very same strains present all around us naturally. However, in every case except S. aureus, infection is basically unheard of when an individual has a functioning immune system; they are all opportunistic pathogens. This is part of the problem: a healthy individual can carry a dangerous strain around with no idea of the risk they present to the immunodeficient.

So what is to be done with the immunocompromised? There are some heroic medicine case studies of an "immune system transplant", notably the remission of HIV, and perhaps this method could be made more viable.


Don't you suspect that a large portion of "a functioning immune system" is the naturally occurring human microbiome (e.g., especially in the gut)? So isn't any rational response to the question What is to be done with the immunocompromised?, in part, to stop compromising people's immune systems by screwing up their microbiomes?

Yes, people become immunocompromised in other ways, but I'd wager that a large portion of compromised immune function in modern society is caused by our screwing around with antibiotics and other antimicrobial agents on a large scale without understanding what the hell we're doing. The "germs are bad" model of medicine is overly simplistic and gets important things wrong and, as a result, is long-term harmful.

Until mainstream practice catches up with the reality that our environment, including our microbiota, are part of the properly functioning human organism, we're going to cause a lot new problems for ourselves by trying to fix the "old" problems.


Errm, you are aware that people got these infections, and died in droves from them, long, long before the age of antibiotics, or even the knowledge that "germs exist", let alone that they are "bad"?

This is a grander version of the hygiene hypothesis (https://en.wikipedia.org/wiki/Hygiene_hypothesis), but unlike it I don't see a phenomenon to explain.


Who is claiming that microorganism-involved disease was nonexistent before the "war on germs"? My claim is that in light of what is now known about the human microbiome and its role in immune function, many common practices for "fighting germs" can be predicted to have harmful effects, particularly in the long run.

Do you have any reason to believe otherwise?


Do you think these practices are that effective? One of scythe's points is that the ESKAPE bacteria are ubiquitous, in people and the environment.

Sure, taking an antibiotic deranges your microbiome, but doesn't it seem to readjust most of the time?

My specific point is: exactly what observed phenomena are you claiming this explains? Until you have that, or a solid, specific hypothesis leading to specific bad things, I suspect the rest of us will continue to use antibiotics when we get otherwise likely lethal infections. I brought up "microorganism-involved disease" because I'm not aware of any significant changes in how they work, play out in the human population, etc., besides the obvious.


Certainly, if you have an "otherwise likely lethal infection," take the antibiotics. If they screw up your immune response down the road, it's still better than dying now. I need to be clear that I'm not talking about this specific case but the general practice of fighting what are believed to be "bad germs" using techniques that can be expected to screw up the human microbiome and, as a result, weaken the body's immune function, making it more susceptible to exploitation by opportunistic pathogens.

As scythe wrote,

> That is to say: the six bacterial strains ... carrying the most severe antibiotic resistance and responsible for the worst infections are the very same strains present all around us naturally. However, in every case except S. aureus, infection is basically unheard of when an individual has a functioning immune system; they are all opportunistic pathogens. [Emphasis mine]

Right now, the emphasis in medical practice is on attacking the "bad germs." What is mostly ignored is cultivating robust immune function, which as scythe points out would mostly prevent opportunistic pathogens from doing harm in the first place. That's probably because until recently nobody knew much about improving immune function. But now, with a better understanding of the human microbiome, it's becoming hard to ignore that the techniques used to fight the bad germs (e.g., broad-spectrum antibiotics) are probably also having harmful effects on the human microbiome and, as a consequence, immune function. So the short-term fix is likely to also cause long-term problems.

To give you an example of what I'm concerned about, let me share what happened to my wife a few months ago. She had a rash on her lower leg, near her ankle, and went to see her general practitioner. Her GP didn't know what the rash was but nevertheless prescribed an anti-itch cream and -- surprise! -- a course of antibiotics, "just in case." (It turned out that the rash was mild poison ivy, so the antibiotics would have done nothing to help.) For many practicing doctors, broad-spectrum antibiotics are used like this as a general prophylactic for all manner of minor ailments. Lots of people are having their microbiomes decimated for very little benefit.

> Sure, taking an antibiotic deranges your microbiome, but doesn't it seem to readjust most of the time?

By what mechanism would eliminated strains be repopulated? And if they weren't repopulated and your microbiome didn't "readjust," how could you tell? After all, even if such derangements were permanent, we would not expect to observe them until they were challenged and exploited by opportunistic pathogens. It could be, therefore, that such derangements have an invisible cumulative effect: the more screwed up your microbiome becomes, the more at risk you are, yet you remain unaware of it until you get sick.


The medical community is only a small part of the problem. Patients demand antibiotics, people don't finish their prescribed courses of drugs or take them incorrectly, and TV ads yell at you to sterilize everything, buy harsher drugs, etc. What is the first thing relatives do when they visit someone in a hospital? Give them a kiss, pat them on the arm, touch stuff, etc. It's not so much a medical problem as a human-nature problem.

I disagree that we have made the world less safe; do you have a source showing that human deaths from infection are rising, or higher than they were pre-antibiotics? MRSA death numbers (but not death rates) have been falling, according to the link below. http://m.guardian.co.uk/news/datablog/2012/aug/22/mrsa-relat...


This looks like a viable research route? A new pathway for eliminating bacteria that's viral rather than antibiotic (fungal?) based:

http://www.sciencedaily.com/releases/2012/07/120724104640.ht...

Can anyone speak to the validity or promise of such research?


This is actually akin to an antibiotic: it's a protein, an enzyme that a bacteriophage virus produces to attack bacteria. See a bit more here: http://en.wikipedia.org/wiki/Lysin Also see all the antibiotics that are peptides in the Wikipedia list: http://en.wikipedia.org/wiki/List_of_antibiotics A peptide is basically a small/short protein, by convention fewer than 50 amino acids.

One issue noted in the Wikipedia article is that it's subject to attack by the immune system. Maybe that won't be a big problem, or it can be gimmicked, but it's still a big protein, it would require topical or IV dosing instead of oral, and it's not going to be able to get into the variety of places that the much smaller mold- and other fungus-derived antibiotics can reach.


I'm curious as to why bacteria haven't evolved to resist lysins.


Well, perhaps some have (I haven't looked at this closely). However, since in nature the various types of lysin are each used only by specific bacteriophages, i.e., as I understand it they are, or tend to be, species-specific, using a particular one against other species of bacteria should be fruitful.

Then again, interspecies transfer of resistance through e.g. plasmids happens.


I was relieved the map showed Israel as a hub instead of India (which was cited as the "source"). If you think Asiana scored it for "culturalism", then this here could win the whole game. First disclosure: not an Indian. Second disclosure: before clicking on the link I was expecting India or Africa. You know, some tropical third-world place? Maybe I just have the wrong kind of paranoia. Bill Gates really needs to use his stature in India for this. If anything is an irresistible force meeting an immovable object, this is it.


I follow this issue to some degree, and unfortunately I have to say this article suggests the map is more a product of Britain's infamous and ancient antisemitism than something that reflects reality.

Or more precisely, it's politically necessary not to blame India, since that country's reaction to the discovery of the terrifying New Delhi metallo-β-lactamase (a β-lactamase that incorporates a metal ion; it's really good at what it does) was terribly counterproductive. See http://en.wikipedia.org/wiki/New_Delhi_metallo-beta-lactamas... for the basic story.

And I'm not at all surprised that by sleight of hand they moved as much blame to Israel as they could (what's in the article doesn't really support it).


> Britain's infamous and ancient Antisemitism

I'm British and it's not infamous enough for me to have heard of it.


If you can't see it in your country's current actions, I don't know what to say.

For the historical record, start with e.g. this from 1144: http://en.wikipedia.org/wiki/Blood_libel#Middle_Ages and, for something more recent, read up on how Churchill's efforts to help Jews during WWII were thwarted at almost every turn by those under him.


I don't deny that there is bigotry in Britain, including bigotry against Jews. (Of course there is: there is bigotry everywhere).

What I do question is that there was/is more of it in Britain than in other countries. For example, the part of Britain I live in (Scotland) is the only European country never to have had state persecution of Jews. (https://en.wikipedia.org/wiki/History_of_the_Jews_in_Scotlan...)

> for more recent, read up on how at almost every turn Churchill's efforts to help Jews during WWII were thwarted by those under him.

If someone asked me to name a country that was "infamous" for anti-semitism during WW2, I doubt if Britain is the one that would spring to mind. Nor would it for most historians.


Ah, I know nothing about Scottish antisemitism or the lack thereof, and I wouldn't be surprised to learn it's different from England's.

Your latter point is a bit of a cheap shot; of course we all know which country was "infamous" in WWII. What's interesting is the countries other than that one. E.g., look at the percentage of Jews who survived in the occupied ones, factoring in the differences in the occupations; look at which ones accepted refugees, and to what extent. Then there are the war policies, where to a very great extent the U.K. called the shots in the Western European theater. That's where I'm suggesting one look ... although of course it's not something that can be quantified, seeing as it was the only combatant there after the fall of France.


> Your latter point is a bit of a cheap shot; of course we all know which country was "infamous" in WWII

Indeed. But your phrase was "Britain's infamous antisemitism". I was pointing out that British antisemitism during WW2 was in fact not "infamous". Indeed, if the worst thing that Britain did during WW2 wasn't harming Jews but not helping them as much as some people in the UK government would have liked, then they couldn't really have been very anti-semitic, could they?

In Britain, it is illegal to discriminate against someone because they are Jewish, and has been for several decades. Furthermore, of Britain's two largest political parties, one has a Jewish leader and the other did from 2003-2005; it is unlikely that either party would have done this if it thought a Jewish leader would be an electoral liability, or if their rank-and-file members disliked Jews. Are these the actions of a country "infamous" for antisemitism?


Israel is not the hub.

It's just that Israel is technologically advanced enough to actually track these bugs so there is data to map.

Other countries (e.g., India) are much larger hubs, but they don't check for it, so there is no data to use.


"The [Indian] Union Health Ministry is considering a new National Antibiotics Policy for the country to handle increasing antibiotics resistance in the country."

News Article dated 27 July, 2013: http://www.thehindu.com/sci-tech/health/policy-and-issues/ne...


Once people start to die in droves the obstacles to a cure will vanish. Hopefully the people who can discover and manufacture the cure won't vanish first.


I have come across several doctors who have tried prescribing antibiotics for preventative reasons to me or my family. How sad that the people who understand this threat best are the ones causing it.

I rarely resort to finger wagging and shaming for anything, but for antibiotics misuse, not only do I feel it is appropriate, but I encourage others to do the same. Veterinarians should get the same treatment, IMO.


If it gets bad enough, outpatient antibiotic use will be a thing of the past: If you need antibiotics, you will be confined somewhere and your compliance will be monitored. Laws will be passed to make non-compliance not simply a crime, but effectively impossible. If worst comes to worst, the facilities will be quarantined. There will be incinerators on site.

So, no, I don't worry about 'society'. Society has ways of defending itself. However, when the notion of individual rights above all starts to bring about an existential threat, something has to give.



