The headline blares, "Omega-3 Supplements Don't Lower Heart Disease Risk After All." That headline is flat-out lying.
What if you studied patients who took omega-3 fish oil pills and the actual data showed that the rates of all-cause mortality, cardiac death, sudden death, myocardial infarction, and stroke were identical between the patients who took the supplements and those who didn't? Well, then you could say that the fish oil pills did not help. That's what the headline above implies. But the reality is surprisingly different.
In an article published in the Journal of the American Medical Association on 12 September 2012, a meta-analysis of 68,680 patients showed that those taking omega-3 fish oil pills had a 4% lower all-cause mortality rate, a 9% lower cardiac death rate, a 13% lower sudden death rate, an 11% lower myocardial infarction rate, and a 5% higher stroke rate.
The headline says fish oil pills don't lower heart disease risk after all. But these data show that they do. If you require, as this study did, that the results be statistically significant at the 0.63 percent level--in other words, that there be less than a 0.63 percent chance of seeing results this strong through luck alone--then the results do fall short. But then the headline should read, "Omega-3 Supplements Lower Heart Disease Risk, But Results Not Strong Enough To Be Convincing."
Statisticians have misled people with statistical significance. There are really two results: the numerical result (fish oil pills lowered risk) and an assessment of whether that result might be a fluke (the result isn't convincing). To say that fish oil pills don't work because statisticians aren't convinced is to butcher both the English language and logic. It's like saying the Ferrari 458 Italia isn't any faster than the Toyota Prius because you don't believe company marketing materials. Perhaps you're right and the marketing materials are misleading, but don't say what isn't true. Don't say the two cars are equally fast; say that the marketing materials are unreliable. Don't confuse an assessment of the quality of the data with the data itself.
Statisticians want us to act as if we were in a court of law, where results are either objectively proved (statistically significant) or disproved (not significant). It's not that straightforward. Statistical significance has two main weaknesses: the threshold used to declare significance and the sample size. Both are arbitrary, and both directly affect the verdict. The threshold was chosen by statisticians; the sample size just happened to be the number of people who participated in the studies. If a less stringent threshold had been selected (just as arbitrary as the one used) or if more patients had been studied, then the results might have passed the test and the headline would have roared, "Omega-3 Supplements Lower Heart Disease Risk."
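To make that arbitrariness concrete, here is a minimal sketch using made-up event rates (not the study's actual data): a fixed 9% relative risk reduction is fed through a standard two-proportion z-test at three different sample sizes, then judged against both the conventional 5% threshold and the stricter 0.63% threshold. The effect never changes; only the verdict does.

```python
# Illustration only: the event rates below are invented, not taken from the
# JAMA meta-analysis. The point is that an identical effect size can "pass"
# or "fail" a significance test depending solely on sample size and threshold.
from math import sqrt, erf

def two_proportion_p(p1, p2, n):
    """Two-sided p-value for a two-proportion z-test, n patients per arm."""
    pooled = (p1 + p2) / 2
    se = sqrt(2 * pooled * (1 - pooled) / n)   # standard error of the difference
    z = abs(p1 - p2) / se
    # Two-sided tail probability of the standard normal, via the error function
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

control, treated = 0.050, 0.0455   # hypothetical rates: a 9% relative reduction
for n in (5_000, 20_000, 80_000):
    p = two_proportion_p(control, treated, n)
    print(f"n = {n:6d} per arm: p = {p:.4f}  "
          f"(5% threshold: {'pass' if p < 0.05 else 'fail'}, "
          f"0.63% threshold: {'pass' if p < 0.0063 else 'fail'})")
```

With these numbers, 5,000 patients per arm fails both thresholds, 20,000 passes at 5% but fails at 0.63%, and 80,000 passes both. Same effect, three different headlines.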
Why do so many people uphold statistical significance as a measure of objective, rational truth when it is based on two arbitrary quantities? Why do so many people confuse their assessment of the quality of the data with the data itself?