Large numbers of biomedical scientists have tried and failed to replicate their own studies, with many not publishing their findings, a survey suggests.
Authors of the study warn that a system that does not support researchers in approaching their own work rigorously creates "major issues in bias" and hampers innovation in science.
Their survey, of about 1,600 authors of biomedical science papers, found that 72 percent agreed there was a reproducibility crisis in their field.
Participants suggested a variety of contributing factors, but the one most often identified as always contributing to irreproducible research was the pressure to publish.
The study found that just over half (54 percent) of participants had previously tried to replicate their own work; of those, 43 percent had failed.
Of those who had tried to replicate one of their own studies, just over a third (36 percent) said they had published the results, according to findings published in PLOS Biology on Nov. 5.
Lead author Kelly Cobey, an associate professor in the School of Epidemiology and Public Health at the University of Ottawa, said respondents felt that their institutions did not value replication research as highly as novel research.
"Until we give researchers the time, funding and space to approach their research rigorously, which includes acknowledgment for replication studies and null results as valuable components of the scientific system, we are likely to only see select reports of the scientific system being published," she told Times Higher Education.
"This creates major issues in bias and hampers our ability to innovate and discover new things."
Cobey said publications remained an "important though problematic currency of a researcher's success," because there is a perception that null findings are not as interesting as positive ones.
"Researchers may feel that there is limited value in writing up their results ... if they are not likely to be accepted in a peer-reviewed journal, particularly a prestigious one."
Many researchers reported that they had never tried to replicate someone else's study. Among the participants who had attempted to reproduce another team's findings, more than 80 percent had failed to get the same results.
Cobey called for a far more rigorous, nationally coordinated system for monitoring research reproducibility and researchers' perceptions of the academic ecosystem.
"I think it is clear that issues with academic incentives continue to pervade the scientific system and that we need significant advocacy and reform if we are going to align our research conduct with best practices for transparency and reproducibility," she said.