“Science is a noble endeavor, but it’s also a low-yield endeavor,” he says. “I’m not sure that more than a very small percentage of medical research is ever likely to lead to major improvements in clinical outcomes and quality of life. We should be very comfortable with that fact.”
A competent medicinal chemist can spend a lifetime in drug discovery and never get a candidate molecule out of clinical trials. I learned this from the excellent Derek Lowe, whose blog at http://blogs.sciencemag.org/pipeline/ is on my daily list.
Quote from an interview[1]: "I’ve been doing this for 27 years, and I have never once put a drug into a pharmacy. I tell people: “If you want to know why your prescriptions cost so much, it’s me.” I’ve done nothing but spend money the entire time."
That's a widely criticized study because it ignores the cost of failure. It only looks at the R&D investments of companies that received FDA approval for a drug. It's like studying the odds of winning the lottery by only analyzing people who won the lottery.
Considering that 90% of phase 1 drugs never get approved, that study seems quite biased. It conveniently assumes that all those medicinal chemists, who've worked for decades and never seen a drug approved, simply don't exist.
Interestingly this contributes to why drug companies spend so much money on advertising: it has a predictable measurable return, whereas R&D is virtually a lottery.
Controversially, I think we need to do less research. The publish-or-perish culture has created a perverse incentive to crank out junk papers. Most working scientists will privately admit that most research isn't actually advancing our understanding of nature, it's just a desperate effort to dredge up something sufficiently novel to publish. Conversely, there's a substantial amount of research that is potentially useful to clinicians, but is languishing unread in some third-tier journal. Most research is never published at all because it supports the null hypothesis, but we can't do good meta-analyses based on a cherry-picked set of studies. We're glutted with data, but we have a remarkable paucity of actionable information getting to the people who need it.
The problem isn't that scientists are doing a bad job, but that the funding mechanisms of science incentivise the wrong kind of work. We should be focusing a far larger proportion of our funding - perhaps even a majority - on replication, meta-analysis and dissemination. Primary research is only one small part of the information architecture of science, but it dominates our spending.
In the case of nutrition, we're spending huge amounts of money on figuring out whether coffee increases or decreases your life expectancy by a fraction of a percent, but almost nothing on behavioural research to figure out how to stop people from gorging themselves to death on food they know to be terrible for them. There's a morbidly obese elephant in the room, but we're preoccupied by the micronutrient-rich mice scuttling around the periphery.
I think a lot of information economies suffer from oversupply. I've heard it said that there are too many books, too much music, too many different open source projects trying to solve the same problems, and so on. It causes information overload on the demand side and perhaps paradoxically increases the odds of something genuinely important being neglected.
>but almost nothing on behavioural research to figure out how to stop people from gorging themselves to death on food they know to be terrible for them.
I'd also say the converse is true. Corporations spend a lot on figuring out how to get you to eat their low-cost, high-profit food that is not good for you. Imagine if we treated junk food the way many countries now treat cigarettes: generic white packaging and no advertising.
Junk or not, if someone's daily caloric intake is sky-high, they can overeat on even the least junky of foods: say, a lot of vegetable seeds, super-healthy all-organic bread, some premium straight-from-the-cow dairy, and some fruit, just to get more sugar.
I don't think there's a shortcut. There would have to be a massive change in approach. In my field of expertise, molecular biology, it's common to just run experiments over and over until you get a positive result, throwing away thousands of dollars in gels and other things along the way. This isn't even intellectually honest (it's fishing for significance), but it's what nearly everybody does to get publishable results.
As a response, I work at a company that is automating biology, with a goal of making much more reliable clean data for machine learning, but even then, it's very expensive. A decent robot arm costs well over $50K (I could build an equivalent for $1-2K, but at a cost of $100K of my time) and all the other equipment is often in the $100K+ range. Just to automate what you could hire a human to do for $75-100K a year!
So yeah, some scientists are doing it wrong, but even the folks doing it right are still wasteful. I think it's endemic to the enterprise, but we could still do better.
There's not really a shortcut to the fact that science is difficult.
But more quality science and less garbage science would help. Doing 10 studies, each with a small likelihood of producing a true result, is less helpful than one high-quality study whose result is almost certainly true.
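To put rough numbers on that intuition, here is a minimal sketch (mine, not from any study in the thread) using the positive-predictive-value framing Ioannidis popularized; the prior, power, and alpha values are purely illustrative assumptions:

    # Probability that a "statistically significant" finding reflects a true effect,
    # in the style of an Ioannidis-type PPV calculation. All inputs are assumed.
    def ppv(prior, power, alpha):
        true_pos = prior * power          # true effects correctly detected
        false_pos = (1 - prior) * alpha   # null effects that slip past the p < alpha bar
        return true_pos / (true_pos + false_pos)

    prior = 0.1  # assume 1 in 10 tested hypotheses is actually true

    print(ppv(prior, power=0.2, alpha=0.05))  # ~0.31: underpowered studies, mostly false positives
    print(ppv(prior, power=0.9, alpha=0.05))  # ~0.67: one well-powered study is far more credible

Under those assumed numbers, a pile of underpowered "positive" results mostly misleads, while a single well-powered positive carries real information.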
As someone with a degree in Nutritional Science who has since left the field for actual science (Biochemistry), I couldn't agree more. It was reasons like this that I left.
The field was corrupted early on by the sugar lobby and it took an immense amount of effort to simply make it ok to admit saturated fats weren't bad for you again. Some 50 years of research negated by a deal that went down with a very small subset of people.
The Academy of Nutritionists and Dietetics is a complete and total waste of human effort. I've never encountered a more useless amalgamation of self-serving pretend professionals in one organization. Everything they do is to protect their organization and reduce entry. To even become a dietitian, after graduating from a didactic pathway you must do a 1-2 year paid internship. And by paid, I mean it costs you $20,000-$40,000, you work 40+ hours a week, including unsupervised work, which makes the legality dubious, and in exchange you sometimes (but not always) get to take a few credits of useless grad-level classes that won't get you any further toward a master's degree. And then when you finally get certified, after an $80,000 degree and a $40,000 internship, you get to start a job that pays about $55,000/year at its peak, assuming you aren't some world-renowned specialist.
The people involved are just the worst. They're all play-professionals who obsess over trivial details like professional attire and pointless faux psychology, like learning and motivational theory. They treat everyone like stupid children and design the absolute worst literature and outreach, catering to the lowest possible denominator with patronizing and condescending advice and horrible rap songs designed to get children to eat vegetables. Meanwhile, this has left a vacuum of knowledge for anyone with two brain cells to rub together, so any reasonably intelligent adult instead gets sucked in by even worse pseudo-science blogs, a phenomenon that was created and enabled by the Academy.
The kicker for me was when, in a community nutrition class, the majority of my classmates revealed that they believed in "health at every size" and that a registered dietitian's primary function is respectful communication (read: participating in the delusions of patients), not communicating important information and truths that could impact the patient's health.
That's not even getting into the absolute sham that is most nutrition research and how much of it is funded by organizations who have a predetermined outcome and just need to find a way to fit the facts to what they want. Nutrition research is absolutely rampant with activist organizations trying to coopt the field to further their related agendas.
Nutrition Science, ah yes. What would your reaction be if I told you that we claim the interaction of seven billion human beings with many thousands of previously living organisms will be the same for each of those human beings? That's obviously nonsense. In other words, you can't just say consuming eggs does X to everyone. You need to partition the foodstuff AND partition the humans, the partitioning likely based on genetics but probably on other factors as well, and then you can describe how each partition reacts. Those other factors likely include previous reactions to food intake... It's a gigantic mess. Until we know a lot more, I will treat all nutrition advice as if a dead cat wrote it. https://en.wikipedia.org/wiki/List_of_animals_with_fraudulen...
It's almost as if different human populations separated by insurmountable geographic barriers in wildly differing ecological circumstances for myriad millennia underwent selection pressure!
Coming from a biology background, this is the exact sort of catch-all statement I really despise.
Is there purity? No, not in the absolute sense. However, we're not some homogeneous mix either. Genetic, environmental and other such patterns do form a number of different subgroups which, while imperfect, remain useful.
Purity is a bit of a misnomer, because a "perfectly" homogeneous species with no distinctive heterogeneous groups would have to be called pure, but that's not usually how people use the word. The implication is usually that there are several distinct groups, within each of which the members are homogeneous.
Having homogeneity at some level, even if it ends up being in the form of several homogeneous groups, allows one to make useful scientific findings and generalizations. On the other hand, if everything was very heterogeneous, down to each individual, it would be very hard to find a general statement applicable to any appreciable number of individuals.
> homogeneous groups, allows one to make useful scientific findings and generalizations. On the other hand, if everything was very heterogeneous, down to each individual, it would be very hard to find a general statement applicable to any appreciable number of individuals.
And yet it is possible we will end up in a place where individualized advice is the only kind that works.
For some phenomena, that might turn out to be true. However, counterexamples are fairly prominent and obvious, e.g. there is a very large subset of humans (~all) for which antibiotics work well for almost all bacterial diseases.
About time. This is the type of crap that gave us "eggs are the equivalent of smoking cigarettes", "red meat causes cancer", "Coconut oil is pure poison" and many more...
“The World Cancer Research Fund report Food, Nutrition, Physical Activity and the Prevention of Cancer provided advice on red and processed meat in 2007.
The organisation said the evidence that red and processed meats are causes of bowel cancer is convincing. It advises that people eat no more than 500g of red meat a week (around 70g a day) and avoid processed meats.”
So how much did they weight the utility of eating red meat in order to conclude that severely restricting it would be worth the .074% decrease in death risk? Because that doesn't seem like a significant risk.
Genuinely curious, where did you see the "0.074% decrease in death risk?"
From the page you linked to:
"13% of bowel cancer cases in the UK are caused by eating processed meat.
Bowel cancer risk is 17-30% higher per 100-120g/day of red meat intake, meta-analyses have shown.
Bowel cancer risk is 9-50% higher per 25-50g/day of processed meat intake, meta-analyses have shown; however bowel cancer risk was not associated with processed meat intake in a pooled analysis of UK case-control studies. Colon cancer risk is 12% higher per 1mg/day of haem iron intake, a meta-analysis showed; though bowel cancer risk was not associated with dietary iron intake in a pooled analysis of UK cohort studies.
Serrated bowel polyp risk is 23% higher in people with the highest versus lowest levels of red meat intake, a meta-analysis showed."
Ok, let's run this scenario (please check my math here, I'm not positive I'm doing the probability right). Let's assume we're talking about a man, because men are more likely to be affected. Your lifetime risk of developing bowel cancer is 7% for males [1]. Bowel cancer risk is 17-30% higher if you eat a quarter pound of red meat per day [2]. But let's go ahead and assume someone does eat that much meat and use 30%. If you get bowel cancer, you have a 25% chance of dying [3]. That means...
I believe that your math is wrong because some of your numbers are percents and some are not, and because you divided instead of multiplying. If you are interested in the chance of death, you take the chance of cancer and multiply it by the chance of death given cancer.
So by this math, eating a quarter pound of red meat a day causes men in this worst-case scenario to die 0.525% more often, which is a sixteen times higher rate than your calculated value.
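For concreteness, here is that arithmetic spelled out (a minimal sketch using only the figures quoted above, not a proper risk model; the variable names are just illustrative):

    # Worst-case excess lifetime death risk from the numbers cited in [1]-[3].
    baseline_lifetime_risk = 0.07  # lifetime bowel cancer risk for men [1]
    relative_increase = 0.30       # top of the quoted 17-30% range [2]
    death_given_cancer = 0.25      # quoted chance of dying once diagnosed [3]

    excess_cancer_risk = baseline_lifetime_risk * relative_increase  # 0.021
    excess_death_risk = excess_cancer_risk * death_given_cancer      # 0.00525

    print(f"{excess_death_risk:.3%}")  # 0.525%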
Of course, that’s not how to actually do the relevant calculations, since (assuming no significant technology changes), everyone will eventually die. Thus you instead need to do calculations based on life expectancy, or better yet on QALY (quality adjusted life years), which adjusts for the quality of life that people live in addition to their age.
>Genuinely curious, where did you see the "0.074% decrease in death risk?"
I was generously assuming that if I follow their advice, I eliminate bowel cancer risk entirely. Since only .074% of the population has it (74 per 100k), that’s the max change in risk of death.
But you’re right, I really only decreased my risk by 13% of that .074% by eliminating red meat. Still don’t see the enormous upside.
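Spelling out that generous upper bound (illustrative arithmetic only, using the parent comment's own assumptions):

    ceiling = 74 / 100_000        # 0.074%: assume following the advice removes all bowel cancer risk
    attributable_fraction = 0.13  # "13% of bowel cancer cases ... caused by eating processed meat"

    print(f"{ceiling * attributable_fraction:.4%}")  # ~0.0096% change in risk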
But they only appeal to the bowel cancer risk and don't list any other death risks that are associated with red meat, so I'm assuming that's the only (or the far-dominant) one.
Every single one of those recommendations as of 2007 was the result of correlational epidemiological studies. It cannot be emphasized enough that correlation is not capable of proving causation. Period. All that correlation (or the lack thereof) is capable of doing is disconfirming hypotheses, not confirming them. When it comes to confirming a hypothesis it's randomized, controlled trials or GTFO.
>This is because there is probably a link between eating a lot of red and processed meat, and bowel (colorectal) cancer.
Probably? Link? "Link" as in "co-occurring but not necessarily causally related"? No references to the studies or methods? This is the same quality material that has been used to promote eating less fat for several decades, while obesity exploded to record levels.
> This is the same quality material that has been used to promote eating less fat for several decades, while obesity exploded to record levels.
It’s a bit ironic that you are questioning the study with regard to causality and then mention obesity exploding to record levels with an unsubstantiated, implied connection to the promotion of eating less fat. Your comment would have been perfect without the obesity remark, since it is decrying the linking of things without sufficient causal evidence.
The person you're responding to is not saying that avoiding fat led to obesity. They're simply saying that the obesity epidemic is evidence against the hypothesis that consuming fats leads to becoming fat. Disconfirming a hypothesis is not the same thing as proving a competing hypothesis.
Without further evidence/study one can’t conclude much. We have two facts:
1. There was a campaign against eating fats.
2. There was an obesity epidemic that started shortly after the campaign against eating fats.
Fact 2 is seductive because it’s natural to assume that 1) is the cause. But without further analysis and study one can’t conclude this. In a post decrying causal links without proper statistical evidence it’s ironic to fall victim to this seduction without proper statistical evidence.
There was an intervention (telling people not to eat fat) based on the "eating fats = bad" hypothesis and that intervention failed to have the effect that the hypothesis would predict. That is disconfirming evidence. You are correct that it's not conclusive, but it is disconfirming nonetheless.
It’s not disconfirming, since it’s possible that eating fats is bad but that replacing fats with sugar is also bad, or possibly even worse. There are other possibilities, too, that could explain why eating too much fat is bad even though health outcomes did not improve when fats were targeted for reduction.
>The organisation said the evidence that red and processed meats are causes of bowel cancer is convincing. It advises that people eat no more than 500g of red meat a week (around 70g a day) and avoid processed meats.”
The previous recommendations (that got retracted) had "convincing" evidence at their time too.
It never was. There were, afaik, no studies; basically the advice was given based on the chain “dietary cholesterol -> serum cholesterol -> blocked arteries -> cardiac problems”, of which the last was convincing, the middle mostly convincing at the time (though much less so now), and the first was just assumed but is now known to be essentially wrong.
Which would make it wrong, but at the time, a non-expert who was looking for advice from people like doctors or their government would hear that they should be cutting back on eating eggs. Which would count as legitimately "convincing", if you're not a researcher in the field. Just wrong.
I used to be convinced that I knew more about nutrition science than doctors do.
Then I found that my total cholesterol was very high (7.5 mmol/L). I got scared and followed my doctor's dietary advice (less saturated fat, basically), and it dropped significantly (to 5.7, still higher than recommended levels).
That's when I realised that the likelihood that I can come up with some health strategy for myself that is better than "following the advice of professionals" is extremely low.
I followed doctors’ advice for my asthma blindly for years (since age 0, practically, without question) and basically felt horrible most of the time - was basically unable to finish a 2km run until age 20.
Then I woke up one day with the feeling that this can’t be right; started experimenting, and within a couple of months got to 5k, and within less than a year to 15k, feeling much better all the time and with every objective measurement improved.
That’s when I realized I should look into things and not just trust “advice of professionals” most of whom follow others quite blindly, and whose incentives are very different than mine (in my case, their incentive that guided me for 20 years is to minimize their legal risk - it was not to improve my health).
I agree it is important to know the bounds of knowledge - both yours and the professionals - and also to realize the incentives.
Also, google “John Ioannidis”, an MD, on nutrition science and research in general. His seminal work about not trusting published research remains unrefuted 13 years later.
To a first approximation, nobody actually cares what scientists say. Scientists can publish press releases and hound journalists until they're blue in the face, but they're not getting on the news unless what they're saying fits into a pre-determined media narrative. They're little more than a useful prop, like the flashy "scientific" graphics in moisturizer commercials.
Fox News will wheel out anyone with a PhD who is willing to cast doubt on climate change, Vox will dredge up someone who'll say that NASCAR gives you cancer and we all end up less informed than grade-schoolers.
Scientists are largely powerless to "push back" against anything, because they don't control the institutions that disseminate information.
Some did. I worked in a lab for the woman whose papers were instrumental in getting the anti-egg recommendation reversed. There was a lot more money on the other side, unfortunately.
It's my understanding that disproving commonly accepted things is a way to become a superstar in the scientific field. Also, you don't need research grants to show that there is no research demonstrating a causal link between two things, which is what the poster above claimed was the problem with the anti-egg movement.
Disproving stuff is a good way to "fan the flames" of your career but you have to have some reputation in the first place. Otherwise you're a quack amateur which doesn't work well with others, just like you'd be in any other field if you picked fights with the wrong people too soon.
It also turns out that validating people's preconceptions and getting grants is a great way to build credibility via "social proof".
The truly skilled PIs can develop a popular idea, make it accepted in the field, then dramatically disprove it once the "bad" researchers start piling on! These masters can usually be identified by massive grants, sending students places (careers) to build relationships and incur debts of favors over years, writing bestseller books (ghostwritten), etc. And these are the people whose ideas everyone repeats.
Studies of communities known for longevity confirm this. For example, the "Ikaria" study, a University of Athens study of the island of Ikaria, where males reach the age of 90 2.5 times more frequently than Americans, found that their diet consisted of hardly any red meat at all (and also hardly any sugar), but rather beans, legumes, wild local greens with lemon juice and olive oil, fish, honey, and red wine.
Was the study of this island controlled for climate, compared to a nearby island with different diets, or compared to a distant island with similar diets?
One of the points made by Ioannidis is that we’re far too easily swayed to believe what sounds plausible when no one has yet asked “Could this be for some other reason altogether that we aren’t testing for?” and then tests for those reasons. That flaw applies here: studying just one community automatically implies that the results are invalid, simply because there’s no evaluation of other communities that share various attributes that could be hidden factors for the benefit.
EDIT: Per one article, that study “would try to cut through the stories and establish the facts about Ikaria’s longevity”. It has succeeded in establishing that, factually, Ikaria’s longevity is a thing relative to other populations. It has not established which of the properties of Ikaria are responsible. (For example, perhaps the island is slightly radioactive, and old people thrive in a slightly higher radioactivity.) Extrapolating that their diet is responsible is an instance of the A/X error described in the above-linked article.
I think you could do worse than that diet, but this is a multivariate comparison. From the study abstract [0]:
> Conclusion. Modifiable risk factors, such as physical activity, diet, smoking cessation and mid-day naps, might depict the “secrets” of the long-livers; these findings suggest that the interaction of environmental, behavioral together with clinical characteristics may determine longevity. This concept must be further explored in order to understand how these factors relate and which are the most important in shaping prolonged life.
It could be that the quantity of red meat is solely responsible for their longevity, or has nothing to do with it, or plays some role alongside the many other confounding variables.
The study you cite did not confirm the red meat hypothesis. It merely failed to disconfirm it. Those are two very different things, which nutrition researchers and the general public seem to be incapable of wrapping their heads around.
I do believe that many of the findings of nutrition science are half baked, provably contradictory, and ignore a plethora of factors that play a role in the interaction between food and health.
But there's another bazooka in the room: the contemporary erosion of trust in authority, in this case scientific authority. Ioannidis' remarks are for the best, I think, but will add to the confusion of not only what knowledge the public can trust but who is more apt to produce that knowledge.
Before things get better, this will inevitably widen the door not only for old bad science in new clothes, or in the form of professional, corporation funded research that resorts to p-hacking and such methods, but also to the further discredit of science in general in the general population.
Having said this, I do not believe Ioannidis' work (or Ben Goldacre's for that matter) is a tool of such agents.
Edit: After reading some comments in this thread, I have to say in all honesty that I suspect that the ACSH's interest in Ioannidis work is disingenuous, without putting into question the validity of his work.
> But there's another bazooka in the room: the contemporary erosion of trust in authority, in this case scientific authority. Ioannidis' remarks are for the best, I think, but will add to the confusion of not only what knowledge the public can trust but who is more apt to produce that knowledge.
The erosion of trust is _because_ of the poor quality of scientific findings produced by Science the institution (particularly in select fields).
This is the only sustainable path forward for Science the institution. The only alternative is a period of suppression of criticism, inevitably contemporaneous with increasing levels of corruption, followed by massive denunciations and outrage once the dam breaks.
Didn't really work well for the Catholic church. I can't see how it would be any better for Science.
> The erosion of trust is _because_ of the poor quality of scientific findings produced by Science the institution (particularly in select fields).
So it would seem, if we considered Nutrition Science alone. But that does not explain the process of increasing distrust in vaccination, global warming and climate change, or even evolution, where the scientific practice has a quite solid record and community consensus.
The erosion of authority reaches outside of Science, which to me suggests that there might be a broader explanation beyond the specificities of each field. In my opinion, and like you say, organized religion was one of the first victims of this process in the past and today.
> The erosion of authority reaches outside of Science, which to me suggests that there might be a broader explanation beyond the specificities of each field.
In my opinion, the difference is the low cost of broadcasting ideas. Before, if you didn't believe in a mainstream idea, you could share your theories only with people near you. Now, it's easy for people who don't believe in the mainstream to form communities and reinforce each other's beliefs.
That is certainly a part of the problem. It amplifies the apparent relevance of erroneous perspectives and furthers the idea that expertise is at hand's reach for laypeople.
In 1979, the information director of the FDA said, "Whelan just makes blanket endorsements of food additives. Her organization is a sham, an industry front."[31]

In 1980, ACSH co-founder Frederick J. Stare was chairman of ACSH's Board of Directors and sought funding from US tobacco company Philip Morris USA for ACSH's activities, stating that he believed financially supporting ACSH would be to Philip Morris' benefit.[26][27]

In the early 1990s, ACSH decided to stop reporting its funding.[32] Their 1991 report shows that many corporations contributed funds.[32]

In 1982, the Center for Science in the Public Interest (CSPI), a consumer advocacy group, published a report on ACSH's practices that stated, "ACSH seems to arrive at conclusions before conducting studies. Through voodoo or alchemy, bodies of scientific knowledge are transmogrified into industry-oriented position statements."[33] CSPI director Michael F. Jacobson said of ACSH, "This organization promotes confusion among consumers about what is safe and what isn't.... ACSH is using a slick scientific veneer to obscure and deny truths that virtually everyone else agrees with."[34]
“Science is a noble endeavor, but it’s also a low-yield endeavor,” he says. “I’m not sure that more than a very small percentage of medical research is ever likely to lead to major improvements in clinical outcomes and quality of life. We should be very comfortable with that fact.”
https://www.theatlantic.com/magazine/archive/2010/11/lies-da...