Well, it’s not real science, but it teaches us some important lessons about how research is conducted.
In the annual lighthearted Christmas issue of the British Medical Journal (BMJ), a team of researchers led by cardiologist Robert Yeh of Harvard Medical School published a piece on the effectiveness of parachutes in preventing death or major injury when jumping from an aircraft.
There’s obviously no way to test the effectiveness of parachutes using the gold standard of scientific and medical research – the randomized controlled trial. This type of study aims to reduce bias by randomly assigning participants to receive either a treatment or a placebo and then comparing the outcomes of the two groups.
A randomized controlled trial of parachutes would require some people to jump out of a plane with a fake parachute.
Yet Yeh and his team attempted to put together a study, recruiting volunteers from among their seatmates on airplanes:
“We’d strike up a conversation and say, ‘Would you be willing to be randomized in a study where you had a 50 percent chance of jumping out of this airplane with — versus without — a parachute?'”
Apparently, a few people agreed but were immediately disqualified because of potential mental health issues.
In the end, the researchers did perform an experiment, choosing 23 people from their own universities to be randomly assigned either a backpack or a parachute and jump from a plane on Martha’s Vineyard or from a helicopter in Michigan.
Here’s a rather silly figure they produced showing how it all went down:
Don’t worry – your tax dollars didn’t pay for this.
So, we have scientists performing a bizarre experiment whose outcome they already knew, a set of participants handpicked by the researchers, and two study sites with two different kinds of aircraft that could easily confound the results. But wait, there’s more.
The participants did, in fact, jump out of a biplane or helicopter with their gear, but it was parked on the ground. The drop was about 2 feet.
Because no parachutes needed to be deployed and no one was hurt, the study, as it was set up, technically showed that parachutes are no better than backpacks when jumping from an aircraft.
Ridiculous, right? Yeh agrees:
“But, of course, that is a ludicrous result. The real answer is that that trial did not show a benefit because of the types of patients who were enrolled.”
Well, and the fact that jumping out of a small aircraft sitting on the ground entails zero risk of catastrophic parachute failure.
So why even waste the time to conduct and write up the study? To make a point about equally skewed research with much higher stakes.
Yeh and his colleagues claim that scientists, especially in the medical field, often cherry-pick patients and circumstances to achieve the results they want to see. This leads to equally unusable research – except that unlike Yeh’s study, we do rely on these other shoddy studies.
Scientists and doctors too often read only the conclusion of a study without taking a close look at the methodology. This is a huge problem: a drug or device may appear effective based on a study’s results, but closer inspection can reveal that the experiment was tainted in some way – for example, the sample size was too small, or participants were chosen because they were unlikely to show side effects.
“It’s a little bit of a parable, to say we have to look at the fine print, we have to understand the context in which research is designed and conducted to really properly interpret the results,” Yeh says.
The article “Parachute use to prevent death and major trauma when jumping from aircraft: randomized controlled trial” contains some amusing footnotes as well, including the note that “All authors suffered substantial abdominal discomfort from laughter.”
And while you might be rolling your eyes at the silliness of it all, many scientists are already planning to use it in their research courses to give students a lighthearted way of learning about the importance of looking at all of the information involved before accepting the results of a study.
If you want a good laugh, you can check out other studies from the special Christmas issue of the journal, including “Golf habits among physicians and surgeons” and “Is it time to start using the emoji in biomedical literature?”
In the meantime, don’t believe everything you read, even if it’s in a scientific journal.