
Laying a Sound Foundation for Replicable Research

Theoretically, if a study’s outcome is true, the same set-up should lead to the same results. A recent article in The Economist revealed the reality: attempts to reproduce landmark scientific findings have failed in most cases.

“If a systematic campaign of replication does not lead to the same results, then either the original research is flawed (as the replicators claim) or the replications are (as many of the original researchers ... contend). Either way, something is awry,” wrote the magazine.

It can be difficult to reproduce results for many reasons. Most papers don’t include a list of the laboratory tools used, and researchers are less likely today than even five years ago to share their raw data. And if subsequent attempts don’t find the same results, that does not mean the original results were intentionally flawed; the pressure to publish and the unstable foundation of some projects lead to false positives.

Whatever the reason, research published with incorrect findings creates waste, diverting time and money away from valuable projects that could lead to improved therapeutics for Parkinson’s and other diseases. For example, pharmaceutical companies may direct attention to developing drugs for a new target that turns out not to be the key to a cure.

The Michael J. Fox Foundation (MJFF) has created infrastructure and best practices to reduce such waste and encourage sound, reproducible findings. Still, more needs to be done.

Setting Studies Up for Replication

The Parkinson’s Progression Markers Initiative (PPMI), an MJFF-sponsored study to identify and validate biomarkers of Parkinson’s disease, makes all data available in real time. Investigators, including those not involved in PPMI, can mine clinical, biological and imaging data from more than 600 study participants. This open-access model promotes both replication and original research on this population, helping researchers identify and validate new findings, and ultimately better therapies, faster.

The rich volunteer profiles and large, geographically diverse cohorts in PPMI and other MJFF-sponsored studies, such as the LRRK2 Cohort Consortium, also boost outcome replication. The more researchers know about each subject, and the more subjects there are, the more confidence they can have in their results.

Furthermore, investigators can stratify patients and assign them to different therapies based on certain characteristics, such as clinical presentation or protein levels. Different subtypes may respond to different therapies, and enrolling those more likely to have a favorable outcome can increase the significance, and reproducibility, of studies. This is one of the reasons MJFF is working so hard to find biomarkers of Parkinson’s disease: to be able to stratify patients for trials.

Providing the Right Tools

Some scientific journals now require more information from scientists submitting papers, including details on the laboratory tools used (though these details often appear only in an online supplement). Even with that information, replication may be difficult because pre-clinical tools are hard to recreate precisely and hard to obtain from those who already have them. As in the kitchen, if these ingredients of research are off, the end result will be, too.

The MJFF Tools Program exists to create, validate and distribute pre-clinical tools (antibodies, cell lines, etc.) so that different studies are working with the same ingredients. By making tools needed across the field available at little to no cost, the Foundation levels the playing field and supports replication.

Providing Feedback Along the Way

Too often, researchers work in silos: holing up, crunching numbers and submitting their work for publication without discussing their analysis along the way. MJFF holds regular assessment calls and meetings with grantees to discuss progress, serve as advisors and sounding boards, and offer perspective and resources that can help them meet incremental milestones. This real-time feedback helps resolve complications and surface concerns more quickly, and allows investigators to redirect if an unexpected factor arises, reducing time wasted going down the wrong road.

“There are errors in a lot more of the scientific papers being published, written about and acted on than anyone would normally suppose, or like to think,” wrote The Economist. Around MJFF, we often say, “This is why our Foundation exists…” We’re trying to prevent such discrepancies in Parkinson’s publications because, in the end, no one benefits from false findings. Least of all patients.
