Earlier this year a study from the Institute Of Diet And Health claimed that eating chocolate helped you lose weight. The problem: the Institute Of Diet And Health was fake, the study had only 15 participants, and 18 different measurements were looked at. That didn’t stop the media from lapping it up, which was the whole point of the exercise: to illustrate junk science.
Most journalists don’t know how to read a medical study. So to help them, here’s a bare-bones rundown of how to do it.
One of the first things to look at is the authors of the study: how many are there, what are their credentials, and are there any potential conflicts of interest? Closely related is how the study is funded. For example, if a drug company funded a study of a new drug it is making, the study will likely be biased toward a favorable outcome.
Also please keep in mind that for FDA approval, which Canada tends to follow, a new drug must only be shown to be non-inferior to an existing drug. So if drug A is already on the market, new drug B only needs to show that it is not worse than drug A. It doesn’t have to be better or have fewer side effects.
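As a rough sketch of how a non-inferiority comparison works: the new drug "passes" if the upper confidence bound on its excess bad-event rate stays below a pre-specified margin. The function, the event counts, and the 3-percentage-point margin below are all made up for illustration; they are not from any actual trial.

```python
import math

def noninferior(events_new, n_new, events_old, n_old, margin=0.03):
    """Hypothetical non-inferiority check (illustrative only).

    The new drug is declared non-inferior if the upper 95% Wald
    confidence bound on its excess event rate is below the margin.
    """
    p_new = events_new / n_new
    p_old = events_old / n_old
    diff = p_new - p_old  # excess bad-event rate of the new drug
    se = math.sqrt(p_new * (1 - p_new) / n_new +
                   p_old * (1 - p_old) / n_old)
    upper = diff + 1.96 * se  # upper 95% confidence bound
    return upper < margin

# Drug B has slightly more events than drug A, yet still "passes"
# because the excess is within the made-up margin:
print(noninferior(55, 1000, 50, 1000))   # True
# A clearly worse drug fails:
print(noninferior(80, 1000, 50, 1000))   # False
```

Note that a drug can be non-inferior while being measurably worse than the comparator; the margin just caps how much worse.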
How many participants are in the study, and how many measurements are being looked at? A dead giveaway for the chocolate study was that there were only 15 participants and 18 different measurements: too few participants and too many measurements. A study needs a reasonable number of participants in order to be meaningful. There is no precise number below which a study is ineffective, but a study should probably have at least 100 people. I’m a participant in a multi-year study that probably has tens of thousands of participants.
How many measurements a study can look at depends on how many participants it has: the fewer the participants, the fewer the measurements. So a study with 100 participants should probably look at no more than about three measurements. With a much larger study, like the Ontario Health Study, which will be looking at about 300,000 subjects over many years, you can look at a much larger number of measurements.
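A little arithmetic shows why 18 measurements on 15 people is a red flag. If each measurement is tested at the usual p < 0.05 threshold, the chance of at least one spurious "significant" finding across 18 independent tests is already around 60%, even when nothing real is going on:

```python
# Chance of at least one false positive when testing many
# measurements, each at the conventional 5% significance level.
alpha = 0.05        # per-test false-positive rate
measurements = 18   # number of measurements in the chocolate study

p_at_least_one = 1 - (1 - alpha) ** measurements
print(f"{p_at_least_one:.0%}")  # prints 60%
```

This is the multiple-comparisons problem: look at enough outcomes and something will come up "significant" by chance alone.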
A study should define the background to what is being studied and describe the rationale for the study’s design. It should also provide details on who was included in and excluded from the study and why, as well as the study’s endpoints and how long it ran.
Be very careful about the claims. One study I looked at said there was a 71% reduction in stent thrombosis when dual anti-platelet therapy was extended from the current 12 months to 30 months. That sounds very impressive until you look at the incidence: the incidence of stent thrombosis over 18 months is 1.4%, and extending dual anti-platelet therapy lowers this to 0.4%.
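The gap between the two ways of stating that result is just arithmetic, using the 1.4% and 0.4% figures above:

```python
# Relative vs absolute risk reduction, using the stent-thrombosis
# incidence figures quoted above.
baseline = 0.014  # incidence with 12 months of therapy
treated = 0.004   # incidence with 30 months of therapy

relative_reduction = (baseline - treated) / baseline  # the headline number
absolute_reduction = baseline - treated               # percentage points
nnt = 1 / absolute_reduction  # patients treated to prevent one event

print(f"relative: {relative_reduction:.0%}")   # relative: 71%
print(f"absolute: {absolute_reduction:.1%}")   # absolute: 1.0%
print(f"number needed to treat: {nnt:.0f}")    # number needed to treat: 100
```

Same data, but "71% reduction" and "1 fewer event per 100 patients" leave very different impressions. Always ask for the absolute numbers.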
Remember that one study does not necessarily establish a new treatment, diagnosis, prevention or the like. You need several studies showing essentially the same thing before a finding can be considered solid. When the link between smoking and cancer was first reported, it certainly raised alarm bells, but more studies were needed to confirm it.
Abstracts do not always accurately summarize the study. Sometimes they make mistakes.
For further information on this subject, see the YouTube video Skeptical Journal Club: How To Read A Medical Study.