
Fun post, but I don't think it supports its own conclusion. Given the tangle of dependencies involved in a properly functioning human, why put focused effort into patching something as easily remediated as the inability to synthesize Vitamin C, given the opportunity for unforeseen side effects?

I think he just doesn't want to eat his applesauce like a good wittle man.



Why would we expect patching vitamin C to be hard or come with consequences? As pointed out, it's trivial enough that almost all animals do it.


We don't know whether any other genes have adapted in the last 10 million years to the lack of the pathway needed to synthesize vitamin C. Reintroducing the proteins for this pathway may cause unintended interactions with existing proteins.

Imagine a large software system that had a small mobile website. New changes to the code must be compatible with this mobile website and can't break it. Now imagine that one day this mobile website was taken out. Development continues but since there's no need to worry about the mobile site numerous incompatible changes creep in over time.

Now suddenly new management comes in and decides that the mobile site should be brought back. Clearly, as soon as they try to put it back in they run into every incompatible change ever made to the code and the whole thing becomes highly unstable.

So that's what (might) have happened in the body. With no worry about whether a new mutation might negatively interact with the vitamin C pathway, that mutation can proliferate freely, especially if it is beneficial. All backward compatibility is lost.
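To make the analogy concrete, here's a toy sketch in Python (all names and numbers are purely hypothetical illustration, not real biology): a new feature evolves while the old pathway is gone, silently assuming its absence, and reintroducing the pathway trips over that hidden assumption.

    # Era 1: the pathway exists and consumes part of a shared resource.
    SHARED_POOL = 100

    def synthesize_vitamin_c():
        # The old pathway: draws on the shared pool.
        global SHARED_POOL
        SHARED_POOL -= 20

    # Era 2: the pathway is deleted. A new feature evolves that assumes
    # the whole pool is free -- nothing tests against the pathway anymore,
    # so this incompatible assumption proliferates unchecked.
    def new_beneficial_feature():
        global SHARED_POOL
        assert SHARED_POOL >= 100, "silently assumed the pathway never ran"
        SHARED_POOL -= 100

    # Era 3: new management reintroduces the pathway.
    synthesize_vitamin_c()
    try:
        new_beneficial_feature()
    except AssertionError as e:
        print("latent incompatibility exposed:", e)

The catch is that in a genome the "assertions" are implicit in protein interactions, so nobody knows where they are until the pathway comes back.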


In theory it might be possible to test this experimentally. Rather than genetically modifying organisms from the start, synthesize the protein in question and try to get it into cell cultures. That should give some idea of what would happen, e.g. whether it kills the cells or causes anything obviously wrong. After that, test it in animals with a similar vitamin C deficiency, and only then in adult humans. It shouldn't be impossible to test.


"Why would we expect patching vitamin C to be hard or come with consequences?"

Because most such things are hard, and come with consequences?

Hell, even erectile dysfunction drugs have consequences ("if erection lasts more than four hours, seek medical treatment"), and we found those by accident.


So we are talking about modifying the human organism at birth. There are a couple of things to consider if you start doing this.

1. If you correct the enzyme deficiency, you will create a new genetic disease caused by a deficiency in that enzyme. You won't be able to successfully correct it in everyone, and even if you do, there will still be children born with the deficiency due to random variation. (Given that the deficiency isn't detrimental in our society, otherwise we would all have scurvy, there is no selection pressure to remove it from the gene pool, so there may actually end up being a fairly large number of people with this deficiency unless you keep gene-editing every child.) Now you have an ethical obligation to find these deficient kids, who would have been 'normal' 5 years ago, and look after them and make sure they eat enough vegetables. This means once you start correcting the gene deficiency, you have to screen every baby to make sure they are corrected.

2. It is very difficult to prove that correcting this enzyme deficiency is not somehow dangerous, especially since nobody gets scurvy apart from the occasional university student. The point here is that there is almost nothing to gain by correcting it in a society which has the money and technology to actually do it. You therefore must go to great lengths to prove that correcting this deficiency is safe. This will take probably two generations of humanity and several hundred thousand people from all ethnicities. Looking at the biochemical pathway someone has drawn on a whiteboard and saying "It should be ok." isn't good enough.

I would also say that I have never heard anyone who actually studies biology complain that the designs are 'dumb'. Filled with wonder, perplexed, humbled, etc. would be more common responses...


Your #1 is possibly the most ridiculous thing I have read today. So by eliminating even the possibility of scurvy from ~100% of society, we would somehow increase the risk?

No. There would be no increase. We would not have to worry about these kids. Why? BECAUSE VITAMIN C IN FOOD WOULD BE UNAFFECTED. In what bizarro world does the addition of vitamin C synthesis magically cause vitamin C to disappear from all meats and plants? None, of course! These oh-so-unfortunate kids that we have an 'ethical obligation' to will be in exactly the same situation the entire current global population is in: getting vitamin C as normal from their diet. Screen every baby? Absurd. This sort of intervention is fire and forget: it's insurance against the rare scurvy-inducing circumstances.

Point 2, on the other hand, is a good point. You should have responded with that alone: because scurvy is so rare, the benefit is small, and even a small risk of the intervention going wrong outweighs it.


No need for the caps, friend.

(First of all, there isn't much vitamin C in meat, unless you eat liver.)

Obviously vitamin C in food would be unaffected, but the way people eat may well be if they think they are vitamin C proofed. Vitamin C is added to everything these days, and this would likely stop happening in a post-patch world when its value as a health gimmick diminishes. Intake of fruit and vegetables is already poor.

So although there would likely be a reduction in scurvy cases initially, there may be a resurgence when dietary habits and food production change in ways that leave unpatched people exposed.

Post-patch this becomes a problem we created, inadvertently or not. I disagree that this situation is ethically OK.

There are numerous examples of paradoxical/unintended trends in public health:

1. Increasing incidence of HIV infection in some communities in developed countries, possibly due to the notion that the disease is now easier to treat.

2. Outbreaks of polio after apparent eradication due to isolated anti-vaccine communities.

3. Increasing numbers of multi-resistant bacteria and even new infections arising from widespread indiscriminate antibiotic use (by doctors and farmers).

4. Patients who develop a false sense of security after a negative test for a mutation predisposing to breast cancer, then don't participate in routine screening (which is why anyone who has such a test should receive professional genetic counselling).

I don't think there are any fire and forget interventions where human health is concerned...


> why put focused effort into patching something as easily remediated as the inability to synthesize Vitamin C, given the opportunity for unforeseen side effects?

I think you're missing the point: Why should there be a huge opportunity for unforeseen side effects?

If you were grading an undergraduate programming project with the property that fixing an obvious bug makes who-knows-what else break, what grade would you assign?

Further, what does that observation do to the idea that "Natural is better than human-made"?

> I think he just doesn't want to eat his applesauce like a good wittle man.

This comes off as really, really condescending. How does it add to the discussion?


"If you were grading an undergraduate programming project with the property that fixing an obvious bug makes who-knows-what else break, what grade would you assign?"

This is not about homework. If you discovered a bug in some basic function that changes a few globals in the software of a nuclear facility that has been running for decades, would you fix it without doing a thorough check of every place those globals get used (or fix it at all)?
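A toy illustration of that worry (hypothetical names, Python for brevity): the "obvious" fix to a global is only safe if no consumer has quietly adapted to the buggy value.

    # The "bug": a global scale factor has been set wrong since day one.
    SCALE = 0.9  # should have been 1.0, but has shipped this way for decades

    def coolant_setpoint(demand):
        # Written years later and tuned against the buggy value: the
        # author compensated by dividing by 0.9. Two wrongs cancel out.
        return (demand / 0.9) * SCALE

    print(coolant_setpoint(50.0))  # ~50.0: behaves correctly today

    SCALE = 1.0                    # the "obvious" one-line bug fix
    print(coolant_setpoint(50.0))  # ~55.6: the fix broke a consumer

The bug's consumers may be load-bearing, which is exactly why you audit every use site first.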



