Detecting Pseudoscience (bad science) in Published Papers: Case Study #2

In a previous post titled Glyphosate Pseudoscience, I investigated a journal article in which the authors claimed that indirect consumption of the herbicide glyphosate leads to toxicity and disease in humans. I showed how a perfectly reasonable claim worth investigating can completely lose credibility when the authors are not experts in the field (computer scientists writing a biochemistry article), do none of their own experiments, are motivated by bias (they are anti-GMO), and claim causation from spurious correlations.

The article in question in this post is a published study on diet sent to me by a former student. The article is not deceitful pseudoscience like the previous one, but it still qualifies as pseudoscience simply because it is bad science. The study is titled, “The relationship between peripheral blood mononuclear cells, telomere length, and diet – unexpected effect of red meat,” and appeared in the Nutrition Journal from the publisher BioMed Central. Neither the journal nor the open access publisher appears in Jeffrey Beall’s list of Predatory Journals and Publishers. So far, so good.

My former student sent the article perhaps because it contradicts something he hopes to be true about diet: eating lots of red meat reduces the lengths of telomeres (the non-coding tips of chromosomes that protect the chromosome’s genes from damage) and may thus shorten a person’s life span (maybe the young man is a vegetarian). Or maybe he sent me the article because he knew there was something methodologically wrong with it and wanted to see if I could pull it apart. He never said.

What I see in this article is a great example of how easy it is to do pseudoscience (bad science in this case). The article is exemplary of 1) why we need to continue to improve our Nature of Science (NOS) instruction in the classroom, and 2) why the science publication machine must clean up its act. I certainly hope I am good enough at teaching NOS that my students graduate with the ability to think skeptically about results like the ones presented in this paper, reject those results, push for better science methodology, and help their peers and family members be better at detecting sketchy claims themselves.

Below are several reasons why I think this is a really bad paper.

First, the authors start with 28 subjects and, for each of several lifestyle activities (e.g., smoking, eating fruit, drinking coffee), sort them into six groups (F0 – F5) based on how frequently they engage in that activity.

[Screenshot from the paper: the definitions of the F0 – F5 frequency groupings]

This is a massive reduction in sample size and statistical power per subgroup: splitting 28 subjects across six groups leaves, on average, fewer than five subjects per group, and we aren’t told how many subjects actually ended up in each one. We are also never told the reasoning behind these particular groupings, only what defines each group.
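To see how badly splitting a small sample hurts statistical power, here is a rough Monte Carlo sketch. This is my own illustration, not anything from the paper: the per-group sizes, the effect size, and the use of a simple two-sample t-test are all assumptions made for demonstration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def estimated_power(n_per_group, effect_size=0.5, n_sims=2000, alpha=0.05):
    """Monte Carlo estimate of a two-sample t-test's power to detect a
    standardized mean difference (Cohen's d) of `effect_size`."""
    hits = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_group)  # control-like group
        b = rng.normal(effect_size, 1.0, n_per_group)  # shifted group
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / n_sims

# 28 subjects split two ways vs. spread across six frequency bins
print("power, 14 per group:", estimated_power(14))
print("power,  5 per group:", estimated_power(5))
```

Even for a medium-sized true effect, five subjects per group gives only roughly a one-in-ten chance of detecting it, versus roughly one-in-four with fourteen per group.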

Second, the researchers are p-hacking. P-hacking is “when researchers collect or select data or statistical analyses until nonsignificant results become significant,” which “leads to substantial bias in the scientific literature.” In this case, the researchers used Analysis of Variance (ANOVA) to look for treatment effects caused by the frequency of engaging in the reported activities, and they found… one: the weekly frequency with which a person eats red meat.

Table 2 from Telomere Length and Red Meat
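To see how easily one “significant” activity can emerge from noise when you test many activities, here is a small simulation. Again, this is my own illustration with assumed numbers (twenty hypothetical lifestyle questions, four random frequency bins): run an ANOVA on a pure-noise outcome for each question and count how many come out significant at p < 0.05.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 28
n_activities = 20   # hypothetical number of lifestyle questions surveyed

significant = 0
for _ in range(n_activities):
    # A telomere-like outcome that is pure noise, binned into random groups
    outcome = rng.normal(size=n_subjects)
    groups = rng.integers(0, 4, size=n_subjects)
    samples = [outcome[groups == g] for g in range(4) if (groups == g).sum() > 1]
    if stats.f_oneway(*samples).pvalue < 0.05:
        significant += 1

print(significant, "of", n_activities, "pure-noise 'activities' tested significant")
```

With a 5% threshold and twenty tests, finding one “effect” is close to the expected outcome even when nothing real is going on.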

The authors then performed what are called post hoc (after the fact) statistical tests to see which red-meat-consumption frequency groups were significantly different (i.e., statistically distinguishable) from one another. They used something called a Tukey test, which corrects for the fact that making a whole bunch of pairwise comparisons will, by chance alone, eventually turn up at least one “significant” difference. Again, the authors found exactly one statistically different pairwise comparison, which they show in Figure 1 of the paper.
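The arithmetic behind that danger is simple. Six groups yield fifteen pairwise comparisons, and if each were naively tested at α = 0.05 (treating the tests as independent, purely for illustration), the chance of at least one false positive exceeds 50%:

```python
from math import comb

k = 6                 # frequency groups F0-F5
alpha = 0.05
n_pairs = comb(k, 2)  # 15 pairwise comparisons among 6 groups

# Family-wise error rate if every pair is tested naively at alpha,
# assuming independent tests for the sake of illustration:
fwer = 1 - (1 - alpha) ** n_pairs
print(n_pairs, "comparisons; P(at least one false positive) =", round(fwer, 2))
```

That works out to about a 54% chance of a spurious “hit,” which is exactly why corrections like the Tukey test exist.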

Figure 1 from Telomere Length and Red Meat: PBMC telomere length differences between red meat consumption groups. Data as mean with 95 % CI of T/S ratio, p-value of statistically significant post-hoc Tukey test, F0 – never, F1 – once weekly or less, F2 – once daily in 2–3 days of week, F3 – once daily in 4–6 days of week, F4 – 1–2x daily, F5 – 3–5x daily (no one in the study reported eating meat at the F5 level).

What the authors show here in Figure 1 and also argue in the paper is that there is a statistically significant difference in relative telomere length between the subgroup that reported no meat in their diets (F0) and the subgroup that reported consuming red meat during one meal a day on four to six days of the week (F3).

Yay! Meat eating is good for telomere length!

But not so fast. Using the same (flawed) logic, we can also conclude that eating meat at least once every day (F4, 1–2 times daily; no one in the study reported eating meat at every meal, the F5 level) has no statistically detectable effect on telomere length compared to any other level of meat consumption, including being a vegetarian.

Yay! Being a vegetarian is just as good for telomere length as eating lots of meat (except for the few people who ended up in the F3 subgroup)!

There are more pseudoscientific problems with this paper. There is no mention of effect size (effect size = the absolute difference between two group means divided by the pooled standard deviation of the two groups), and no way to calculate it, because we have no idea what the degrees of freedom (and thus the sample sizes) were for any of the subgroups. Effect size can help us determine how important a difference is, and in this case, whether the lucky difference between F0 and F3 for meat consumption is clinically relevant. Indeed, how much of a difference in telomere length really matters to a person’s lifetime health? The authors never say, yet this is critical clinical and health information.
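For reference, here is the standard calculation (Cohen’s d with a pooled standard deviation). The T/S-ratio numbers plugged in below are entirely hypothetical, since the paper reports no subgroup means, standard deviations, or sample sizes:

```python
from math import sqrt

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized effect size: absolute mean difference over the pooled SD."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return abs(mean1 - mean2) / pooled_sd

# Entirely hypothetical T/S-ratio values for F0 vs. F3 (none are reported):
print(round(cohens_d(1.00, 0.30, 5, 1.25, 0.30, 6), 2))
```

Without the subgroup sizes and standard deviations, nobody reading the paper can run even this one-line calculation, let alone judge clinical relevance.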

The authors then reveal, with surprising clarity, their own motivated reasoning and confirmation bias. In the Discussion, they say: “Our study on a small group of people managed to demonstrate the relationship between the frequency of consumption of red meat and telomere length.” Even though the authors suggest more research is needed, they never acknowledge that their own “unexpected” result is suspect because the statistical power is so low. Yet later they admit: “The study did not confirm negative effect of smoking on telomere length. This finding is probably associated with insufficient sample size.”

What’s really sad about pseudoscientific studies like this is that even if frequent red meat consumption really can meaningfully affect telomere length, and in turn a person’s overall health and longevity, the media frequently take what they want from these should-never-have-been-published studies and make claims like:

“New research on diet shows that eating meat four to six times a week can increase your lifespan. How do we know? Tune in at ten and find out!”


“New research on diet shows that vegetarians may not live any longer than people who eat meat every day. How do we know? Tune in at ten and find out!”

We must do better.

