Hacker News

> To calculate mortality in the three countries, where death registry statistics are unreliable, the researchers employed tens of thousands of community reporters—more than 14,000 of them in Kenya alone—to conduct household surveys of childhood deaths in 79 areas where the RTS,S vaccine was administered and 79 comparator areas where it was not available.

That doesn't sound remotely close to reliable.

> The mortality benefit was documented even in the areas with the lowest RTS,S coverage

The report doesn't provide any actual data, so it's impossible to make sense of this statement.




13% is a large effect size; you'd have to presuppose a systematic bias in the unreliability between the comparator districts. Also, this is a case where the research needs to be conducted but reality makes it hard, and you can't just *not* do the research. Field tests are always messy, and I'd posit that community reporting is probably more reliable than politically motivated reporting from some governments.


> 13% is a large effect size,

You can't just say that. Whether 13% is "large" depends on what you're measuring, how you measure it, and how big the denominator is.
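To make that concrete, here's a rough sketch (all numbers invented, not from the study) of how the same 13% relative reduction can be easy or nearly impossible to detect depending on the baseline mortality rate and the per-arm sample size:

```python
# Illustrative only: invented numbers, not data from the RTS,S study.
# A "13% reduction" is a relative risk of 0.87; whether that is
# detectable depends on the baseline rate and the denominator.

import math

def detectability(baseline_rate, n_per_arm, rrr=0.13):
    """Absolute risk reduction and a rough z-score for a given baseline
    mortality rate, per-arm sample size, and relative risk reduction."""
    treated_rate = baseline_rate * (1 - rrr)
    arr = baseline_rate - treated_rate  # absolute risk reduction
    # Normal-approximation standard error of a difference in proportions
    se = math.sqrt(baseline_rate * (1 - baseline_rate) / n_per_arm
                   + treated_rate * (1 - treated_rate) / n_per_arm)
    return arr, arr / se

for p0, n in [(0.05, 10_000), (0.005, 10_000), (0.005, 100_000)]:
    arr, z = detectability(p0, n)
    print(f"baseline={p0:.3f}, n={n}: ARR={arr:.4f}, z~{z:.1f}")
```

With a 5% baseline and 10k children per arm the difference is borderline significant (z ~ 2.2); drop the baseline to 0.5% and the same relative effect is statistical noise (z ~ 0.7) until the sample grows tenfold.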

The fact that they saw the same effect size in groups that didn't get the vaccine is a reason to doubt the results, regardless of effect size. It was weird/credulous that they called it out as some kind of mysterious woo-woo advantage ("maybe it helps their immune system somehow!"). When you see stuff like that in a paper, it makes you scrutinize the results. When you hear it in a conference talk, it makes you reserve judgment until you see the paper.


> The fact that they saw the same effect size in groups that didn't get the vaccine is a reason to doubt the result

Where does it say that in the article? The closest it gets is "The mortality benefit was documented even in the areas with the lowest RTS,S coverage" but the lowest coverage was 62% (the highest was 75%).


Yep, but then they go on to make the credulous woo-woo argument. None of us know what the actual results are here (since this is a conference talk, translated by a reporter), so I can only work with what they say.

In any well-done study with an effective treatment, you'd expect, a priori, that a reduction in intervention produces a reduction in effect. In other words, you don't benefit if you don't get the drug.
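That a-priori expectation can be sketched as a simple dose-response check (the per-child effect here is hypothetical; only the 62%/75% coverage range comes from the article). Under a proportional model, a community with lower coverage should show a proportionally smaller mortality reduction:

```python
# Hypothetical dose-response sketch; per-child effect size is made up.
# If the per-vaccinated-child effect is a fixed relative reduction,
# the community-level mortality reduction should scale with coverage.

def expected_community_reduction(coverage, per_child_rrr):
    """Expected relative mortality reduction for a community where a
    fraction `coverage` of children received a treatment whose
    individual relative risk reduction is `per_child_rrr`."""
    return coverage * per_child_rrr

# e.g. if the per-child effect were 20%, the 62% vs 75% coverage
# extremes should yield community-level reductions of 12.4% vs 15%.
for cov in (0.62, 0.75):
    print(f"coverage={cov:.0%}: expected reduction "
          f"{expected_community_reduction(cov, 0.20):.1%}")
```

Whether the study had the power to distinguish 12.4% from 15% is exactly the question the article leaves open.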


The article is actually silent on whether the mortality effect was reduced, or whether the study was powerful enough to reliably detect that the mortality effect was reduced in lower-vaccinated communities. It just says the effect "was documented" in those communities, not whether it was exactly as strong.


Randomised placebo-controlled trials are the only way to determine the reliability of a product.

The manufacturers know this. When they do something different, it is not a good sign.



