
> Your exome changes all the time.

I work in DNA analysis (not for 23andme). Citation needed.

> Your gut biome changes all the time.

Your gut biome isn't being tested by a genetics company collecting your saliva or epithelial cells.




How complete are these tests? If I wait another 10, 15 years to get sequenced, would I get more accurate, reliable information? Can advances in sequencing technology make it worthwhile for people to get resampled? Or is it just an issue of better analysis?


The technology itself is constantly improving, for instance the push towards long-read technologies. Beyond the underlying sequencing, the computational pipelines that run on the output from the sequencers are also constantly improving. A huge issue at the moment is in that space: key portions of the alignment and variant-calling pieces of those pipelines are not optimized for a global population.
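
To make the pipeline part concrete, here's a rough sketch of the usual short-read shape: align reads to a reference, sort and index, then call variants. The tool choices (bwa, samtools, bcftools) and file names are illustrative assumptions on my part, not what any particular vendor actually runs.

    # Minimal sketch of a short-read pipeline: align, sort/index, call variants.
    # Assumes bwa/samtools/bcftools are installed and the reference is already
    # bwa-indexed and faidx-indexed. File names are placeholders.
    import subprocess

    REF = "reference.fa"
    READS = ["sample_R1.fastq.gz", "sample_R2.fastq.gz"]

    def run(cmd):
        print("+", cmd)
        subprocess.run(cmd, shell=True, check=True)

    # 1. Alignment: map each read against the reference genome.
    run(f"bwa mem {REF} {READS[0]} {READS[1]} | samtools sort -o sample.bam -")
    run("samtools index sample.bam")

    # 2. Variant calling: compare aligned reads to the reference and emit a VCF.
    #    This is where reference/population bias creeps in -- the calls are only
    #    as good as the reference and the priors baked into the caller.
    run(f"bcftools mpileup -f {REF} sample.bam | bcftools call -mv -o sample.vcf")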


A couple of professors and top researchers have stated that the tests are about as accurate as a horoscope.

I also did a "highly respected test based on thousands of studies" for $650 which, well... I could guess the same stuff by looking at someone for two seconds.


> How complete are these tests?

I can't answer that.

> If I wait another 10, 15 years to get sequenced, would I get more accurate, reliable information?

If you're asking about the DNA data itself? Yes, absolutely. The underlying sequencing technology is advancing every year.

If you're asking about the value-added informational inferences sold to consumers? I wouldn't put much faith in that, but I'm also a cynic when it comes to computers and business relationships with consumers.

@mathnerd314's comment [0] is correct: current DNA analysis products frequently produce conflicting results [1]. I've seen the software side of why that can happen. The technology is extremely complicated and sometimes convoluted, and there are many "standards" (xkcd-style [2]) for reporting even more ad-hoc features. Big companies can't even get basic web services to work right 100% of the time; what makes you think DNA analysis software can?
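
As a toy illustration of how two products can disagree about the "same" DNA, here's a sketch comparing two vendors' reported genotypes. All the rsIDs, genotypes, and the no-call convention are made up for the example.

    # Toy illustration of conflicting reports: two vendors genotype overlapping
    # SNPs but differ in call/no-call behavior and allele-ordering conventions.
    vendor_a = {"rs0000001": "AG", "rs0000002": "CC", "rs0000003": "TT"}
    vendor_b = {"rs0000001": "GA", "rs0000002": "CT", "rs0000003": "--"}  # "--" = no-call

    def normalize(gt):
        # Treat "AG" and "GA" as the same unphased genotype.
        return "".join(sorted(gt))

    shared = vendor_a.keys() & vendor_b.keys()
    mismatches = [rs for rs in sorted(shared)
                  if "-" not in vendor_b[rs]
                  and normalize(vendor_a[rs]) != normalize(vendor_b[rs])]
    print(f"{len(mismatches)} of {len(shared)} shared sites disagree: {mismatches}")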

There's a lot of math which goes into consumer DNA products. The math involves a lot of statistics and population information. The math can be biased towards or against particular people. And it's usually hidden behind closed-source algorithms and proprietary information. Do you trust that?
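In miniature, that math often looks something like the sketch below: a toy polygenic-style score where every weight and rsID is invented for illustration. The arithmetic is trivial; the proprietary (and bias-prone) part is where the weights came from and whether they transfer to your population.

    # Toy polygenic-style risk score. The "weights" stand in for effect sizes
    # estimated in some study population; applying them to someone whose
    # ancestry is poorly represented in that study silently degrades the
    # prediction. All numbers here are invented.
    weights_from_study = {"rs0000010": 0.30, "rs0000011": -0.15, "rs0000012": 0.22}

    def score(dosages, weights):
        # dosages: rsID -> count of effect alleles (0, 1, or 2)
        return sum(weights.get(rs, 0.0) * dose for rs, dose in dosages.items())

    person = {"rs0000010": 2, "rs0000011": 0, "rs0000012": 1}
    print("risk score:", score(person, weights_from_study))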

I strongly think this is an area where international standards organizations should come into play and where laws should be put in place to govern data correctness, responsibility, and culpability. The FDA is doing some work towards that end (e.g., the precisionFDA challenges) but it's doing a lot more science-related work than consumer-related work, IMO. As-is, consumer sequencing analysis products (especially in the United States) are the Wild West.

Without strong standards and laws I don't think you're guaranteed to get more accurate or reliable information in 10 to 15 years. You'll get the same kind of information you can get today, but perhaps cheaper.

> Can advances in sequencing technology make it worthwhile for people to get resampled?

Yes. The sequencing products I have worked on have gone through multiple generations during my (so far) six years in the industry. Each product generation changes the sequencing technology itself, the analysis product, or sometimes both.

> Or is it just an issue of better analysis?

Current technology has a lot of trouble sequencing completely through a full strand of DNA. If you can solve that then analysis will instantly get better. To get around that, there are many different types of sequencing available.

The micro-arrays used by most consumer companies focus on hundreds of thousands of single-basepair locations and do so quite affordably at industrial scale. If you have any sort of unusual variation at one of the targeted locations then micro-array technology will likely have trouble getting data about that location. But en masse, one bad data point isn't enough to throw off your results.
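
A back-of-the-envelope sketch of what that looks like in aggregate; the panel size and no-call rate below are my assumptions, not any vendor's numbers.

    # Rough sketch of what a genotyping micro-array gives you: calls at a fixed
    # panel of single-basepair sites, nothing in between, and the occasional
    # no-call when a probe misbehaves.
    import random

    PANEL_SIZE = 650_000     # consumer arrays typically probe several hundred thousand sites
    NO_CALL_RATE = 0.002     # assumed fraction of probes that fail or hit unexpected variation

    called = sum(1 for _ in range(PANEL_SIZE) if random.random() > NO_CALL_RATE)
    print(f"{called}/{PANEL_SIZE} sites called; "
          f"{PANEL_SIZE - called} no-calls are typically ignored or imputed")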

Another type of sequencing works with chunks of DNA - partial strands - but sequencing quality goes down with the length of the chunk. One chromosome has millions of basepairs split into thousands of chunks, but the maximum chunk length for usable data is measured in hundreds or maybe thousands of basepairs. To help solve that, the sequencing technology makes many hundreds of duplicates of those chunks and sequences all of them simultaneously in a massively parallel operation. Even though the sequencing itself is slow, the parallelism makes it high throughput. The low quality of each individual read is then offset by the high number of copies, and reassembly becomes a (very) computationally expensive task.
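
Here's a minimal sketch of the "many noisy copies beat one perfect read" idea: simulate error-prone reads of the same fragment and recover it by a per-position majority vote. The error rate and coverage are invented, and real assembly also has to figure out where each read belongs in the first place, which is the expensive part.

    # Simulate noisy reads of one fragment and recover it by majority vote.
    import random
    from collections import Counter

    random.seed(0)
    TRUE_FRAGMENT = "".join(random.choice("ACGT") for _ in range(300))
    ERROR_RATE, COVERAGE = 0.05, 40   # 5% per-base error, 40 copies

    def noisy_read(seq):
        return "".join(random.choice("ACGT") if random.random() < ERROR_RATE else base
                       for base in seq)

    reads = [noisy_read(TRUE_FRAGMENT) for _ in range(COVERAGE)]

    # Majority vote at each position across all (already aligned) copies.
    consensus = "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))
    print("consensus matches truth:", consensus == TRUE_FRAGMENT)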

If you can reduce that quality falloff across a DNA fragment, particularly out to lengths of millions of basepairs, then that would be a game-changing breakthrough, especially if you can keep throughput high. I intuitively know it's possible: your own cells copy your DNA every time cell division occurs, and they do it extremely fast compared to current sequencing technology. I'm just a software dude though.

There are other sequencing technologies which I've not worked on and can't comment about.

[0] https://news.ycombinator.com/item?id=22130359

[1] https://www.healthista.com/dna-genetic-test-review-three-dif...

[2] https://xkcd.com/927/



