
> That might be, shall we say, a somewhat short term problem as the population is culled of less hardy individuals
You know what the real problem is? The culling will most likely happen in areas where people have no experience fighting bacteria on their own, i.e. the most developed parts of the world, which means a LOT of progress will be lost. Those who read this site are among the most likely to be culled.

Anyway, silly question: if bacteria are so good at adapting to antibiotics, can't we create some kind of mold that adapts to the bacteria? E.g. take bacterium A, spread it across N + 1 Petri dishes, put a mold candidate in N of the dishes, and select whichever works best. Then take the winning dish, use it to create a new set of N + 1 bacteria dishes, and repeat.
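What's being described is essentially a directed-evolution / serial-passage loop. Here's a minimal toy simulation in Python, assuming a made-up scalar "kill rate" as the trait being selected for; the parameters and the fitness model are illustrative, not real microbiology:

    import random

    # Toy simulation of the serial-passage selection scheme described
    # above. The scalar "kill rate" and all parameters are assumptions
    # for illustration only.

    N = 10       # dishes seeded with a mold candidate each round
    ROUNDS = 20  # selection rounds
    STEP = 0.05  # per-round random variation in mold effectiveness

    def mutate(kill_rate):
        """Return a slightly varied descendant of a mold strain."""
        return min(1.0, max(0.0, kill_rate + random.gauss(0.0, STEP)))

    best = 0.1  # starting strain clears 10% of the bacteria
    for rnd in range(ROUNDS):
        # Seed N dishes with descendants of the current best strain;
        # the (N + 1)th dish is the mold-free control.
        candidates = [mutate(best) for _ in range(N)]
        # Keep whichever strain clears the most bacteria this round.
        best = max(candidates)
        print("round %d: best kill rate = %.3f" % (rnd + 1, best))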




Bacteriophages are pretty good at this, and they're used in some countries.


The problem with bacteriophages is that they require identifying the specific infection, and then finding a bacteriophage suitable for just that infection.

In comparison, part of the reason we're in this situation in the first place is that doctors can't even be bothered to figure out which antibiotic is suitable, or whether you actually have a bacterial infection at all, but often just prescribe first-line antibiotics and tell you to come back if you're still ill afterwards. That means 5-10 minutes per patient in many places.

If we were to try to switch to bacteriophages, on the other hand, it'd mean blood tests and lab work to identify the specific infection and strain for every patient, with resulting massive cost increases.
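To make the cost contrast concrete, here's a toy sketch of the two workflows; the strain names, the phage library, and the lab step are all invented for illustration, not clinical practice:

    # Toy contrast between the two treatment workflows described above.
    # Everything here is an invented illustration.

    PHAGE_LIBRARY = {
        "E. coli strain X": "phage T4-like",
        "S. aureus strain Y": "phage K-like",
    }

    def culture_and_type(sample):
        """Stand-in for days of per-patient lab work."""
        return sample  # in reality: culturing, typing, susceptibility tests

    def antibiotic_workflow(_patient):
        # Minutes of consultation, no lab work, one-size-fits-most.
        return "first-line broad-spectrum antibiotic"

    def phage_workflow(patient_sample):
        # The strain must be identified before a phage can even be
        # chosen, and treatment fails outright if no match exists.
        strain = culture_and_type(patient_sample)
        return PHAGE_LIBRARY[strain]

    print(antibiotic_workflow(None))
    print(phage_workflow("E. coli strain X"))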

In its current state, it's a last resort, but it's far from being a viable large-scale replacement for antibiotics.


In plenty of cases doing lab work is impractical, and doing it the old-fashioned, definitive way (a culture) takes more than a day; often doctors don't have that long. Plus there's the competitive pressure: people fire doctors who don't give them the antibiotics they demand.

WRT the first problem: starting in 1995 I got recurrent sinus infections (which are now mostly prevented with a couple of nasal sprays). A sinus biopsy is not practical; instead my doctor, knowing what worked locally, gave me one particular antibiotic. Later, a standard formulation of that plus a β-lactamase inhibitor. Then the one that still works (from a different β-lactam family), but I need 30 days of it.

When I was unemployed and only had catastrophic medical coverage, I got a full course from samples of a new, lipophilic drug that builds up a concentration in your fat, so you don't need to take it for as long, and the constant dose no doubt helps. But that one failed for the next sinus infection and I was back to the 30-day one....

Errm, when it's pretty clear the patient has a bacterial infection, why isn't it often best practice to give a first-line drug while doing a culture? Strep throat is the classic example I can think of, from when my siblings and I got it in the '70s ... of course, although I don't know if it was realized by then, giving an antibiotic for it is mostly prophylactic, to avoid rheumatic fever. I have a very good friend about a decade older than me; when she got it, her mother completely freaked out, because until right around then it was a death sentence. Fortunately penicillin was just making it out into civilian use....


Oh, absolutely, I didn't mean to suggest it's practical to always do lab tests, though I can see how it might have seemed that way. It's an unfortunate and at least partially necessary tradeoff, not just because doing lab work for every patient is impractical, but also because it's prohibitively expensive.

I meant it largely as a contrast with bacteriophages, where we don't get the choice. Antibiotics "won out" to a large extent because alternatives like bacteriophages are tremendously more expensive and complicated compared with the practice of handing out antibiotics without blood work. If we took all available precautions to prevent resistance, we'd face a much reduced risk of antibiotic resistance, but the cost gap would shrink accordingly as well.

Antibiotics are so important for exactly the reasons that give us resistance: they are cheap, easy, and fast for all but the unfortunate cases that run into resistance.

> Errm, when it's pretty clear the patient has a bacterial infection, why isn't it often best practice to give a first-line drug while doing a culture?

It often may be. The problem is the cases where antibiotics are given when it isn't even clear whether it's a bacterial infection at all; in the UK, for example, doctors will often give antibiotics "just in case". The second problem is that this means antibiotics will quite often also be given in situations where the bacteria in question will not respond to the treatment, or will survive in sufficient numbers for the infection to rebound, leaving the patient walking around potentially spreading an infection with increased resistance to the first-line antibiotics.

Again, it's a tradeoff that is sometimes necessary, and very often the best recourse for that specific patient. But that doesn't mean it doesn't contribute to rising overall levels of antibiotic resistance, and some of these decisions do negatively affect society as a whole.

Incidentally, if/when we find technologies to speed up precise tests for specific strains, bacteriophages might become suitable for more situations, and we might be able to limit the use of antibiotics to the situations where we can't risk waiting.


Don't they have to be stealthy enough to avoid immune detection, but not so stealthy that they mutate into a virus that bypasses the human immune system?



