When we look for reliable sources of information, we turn to studies published in peer-reviewed scientific journals. In many cases, however, other researchers find it difficult to reproduce the results of a published study: even when they use the same methods and procedures, their findings differ from the original ones, calling the study’s reliability into question. This phenomenon, in which scientific findings cannot be replicated by independent researchers, is known as the “reproducibility crisis.” The problem has become more prevalent over the past few decades; according to existing evidence, it affects up to a quarter of studies in cancer research and over a third of studies in psychology. Naturally, it has attracted the attention of scientists worldwide, who have proposed various explanations, including the selective publication of positive results, poor statistical practices, and the formation of hypotheses only after results are known. Scientists, however, tend to avoid suggesting research misconduct as a cause, possibly to steer clear of controversy. In an editorial published in Molecular Brain, Prof Tsuyoshi Miyakawa, one of the journal’s Editors-in-Chief, shows how this reluctance might further aggravate the issue. He explains that many authors fail to provide raw data upon request and speculates that, in some cases, the requested data may never have existed.
Prof Miyakawa based his analysis on manuscripts submitted to the peer-reviewed journal Molecular Brain, for which he has served as an Editor-in-Chief since 2017. “As an Editor-in-Chief of the journal, it is sometimes difficult to believe the results of manuscripts that are ‘too beautiful to be true’,” he notes. In 41 such cases, Prof Miyakawa asked the manuscripts’ authors to provide the raw data supporting their conclusions. Surprisingly, in more than 97% of these cases, the authors either withdrew their manuscripts without providing any raw data or provided incomplete raw data, much of which did not match the reported results; these manuscripts were rejected. In only one case did the authors provide the complete raw data, and that paper was subsequently reviewed and accepted for publication. In other words, most of the authors were either unable or unwilling to provide raw data supporting their conclusions.
Prof Miyakawa also noted that, of the 40 manuscripts that were withdrawn or rejected, 14 subsequently appeared in other journals. In 12 cases, the publishing journals had policies requiring or encouraging the authors to make their raw data available upon request from a reader. He sent requests for raw data to the authors of those 12 papers but did not receive a response in 10 cases. In another case, the authors refused to provide their data, and in the remaining case, the authors provided him with an incomplete set of raw data.
In reflecting on these experiences, Prof Miyakawa surmises that at least some of the failures to provide raw data occurred because the data never existed in the first place. He acknowledges that some cases may have other explanations, such as “honest” mistakes or an unwillingness to share raw data before completing planned follow-up analyses, but he does not consider such explanations adequate. He also notes that his suspicions of research misconduct may cause a stir within the scientific community. He muses, “Under the current publication system, the field of life sciences is like a ‘house built on sand’, and thus it is important to dig deeper to get to the root of the issue.”
Lastly, to address the widespread problem of fabricated data, Prof Miyakawa argues that journals should require, as a condition of publication, that raw data be deposited in publicly available databases or on journal websites. He says, “Such policies may be difficult and costly to adhere to, but once implemented, they will greatly improve the credibility of scientific studies in general.” Praising the editorial, Dr Min Cho, Editor-in-Chief of Neuroscience Next and former Senior Editor of Nature Neuroscience, says, “I’ve read with great interest Dr Miyakawa’s editorial in Molecular Brain. Because the piece provides an analysis of real-world submissions, we get a rare glimpse into the inner workings of a scientific journal. Promoting data transparency by being editorially transparent about its submissions, this journal’s editorial here is a reality check for the scientific honor system.”
Prof Miyakawa concludes by calling on research institutions, funding agencies, and science publishers to develop policies and practices that implement a publishing system based on a “no raw data, no science” outlook.
###
Reference
Title of original paper: No raw data, no science: Another possible source of the reproducibility crisis
Journal: Molecular Brain
DOI: 10.1186/s13041-020-0552-2
About Professor Tsuyoshi Miyakawa
Prof Tsuyoshi Miyakawa, PhD, is one of the Editors-in-Chief of Molecular Brain. He also serves as the Editor-in-Chief of Neuropsychopharmacology Reports and as a Deputy Director of General Affairs of the Japan Neuroscience Society. In addition, he is the Director of the Division of Systems Medical Science at Fujita Health University in Toyoake, Japan. With more than 130 research publications to his credit, he has been studying the relationships between genes, the brain, and behavior since 1993.
Media contact: Prof Tsuyoshi Miyakawa
Email: [email protected]
This information is sourced from https://www.eurekalert.org/pub_releases/2020-02/fhu-ha021020.php