Writing / Irene Vega
A team of more than 300 researchers has published a study in the journal BMC Biology highlighting the need to address analytical variability in research results. The findings align with the growing recognition that many decisions researchers must make – such as which statistical methods to apply – can lead to divergent conclusions even when all of the options are reasonable. This substantial variability has important implications for ecologists and other scientists who analyse data.
To conduct the research, 174 teams of analysts worked on research questions predefined by the study coordinators, analysing the same data sets and arriving at remarkably variable answers to the questions posed. These results highlight the diversity in analytical decision-making and shed light on possible sources of unreliability and bias in scientific processes.
“This ambitious study demonstrates in detail the potential that decisions made during data analysis have to influence the statistical results obtained in ecology and evolutionary biology,” underlines Ana Isabel García-Cervigón, a researcher at the URJC and co-author of the study. “Furthermore, it shows that this is a general concern in science that extends far beyond the social sciences, where most of the work on this topic has so far been carried out.”
The paper describes several data analysis practices that researchers could adopt in response to this variability. For example, researchers could run several different statistical analyses of the same data and assess how similar the results are. They could also conduct more ambitious ‘multiverse’ analyses, in which they generate many hundreds or thousands of analyses to explore how different choices influence results. These options join an ecosystem of other proposals to promote the trustworthiness of scientific research, many of which focus on improving transparency.
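The multiverse idea can be illustrated with a minimal sketch: fit the same simple model under every combination of defensible preprocessing choices and inspect how much the estimates spread. Everything below is hypothetical and for illustration only – the choices (a minimum-x exclusion criterion and an outlier-trimming fraction) and the simulated data are not from the study.

```python
import itertools
import random

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def analyse(data, min_x, trim_frac):
    """One 'universe': apply the chosen preprocessing, then fit."""
    pairs = [(x, y) for x, y in data if min_x is None or x >= min_x]
    if trim_frac:
        # trim the most extreme outcomes at both ends
        pairs.sort(key=lambda p: p[1])
        k = int(len(pairs) * trim_frac)
        pairs = pairs[k:len(pairs) - k]
    xs = [p[0] for p in pairs]
    ys = [p[1] for p in pairs]
    return slope(xs, ys)

# Simulated data with a true slope of 0.5 plus noise.
random.seed(42)
data = [(x, 0.5 * x + random.gauss(0, 2)) for x in range(1, 101)]

# Enumerate every combination of analytic choices.
grid = itertools.product([None, 10], [0.0, 0.05, 0.10])
estimates = {choice: analyse(data, *choice) for choice in grid}

for choice, est in sorted(estimates.items(), key=str):
    print(choice, round(est, 3))
```

In a real multiverse analysis the grid of choices is far larger, but the principle is the same: the spread of estimates across the grid, not any single estimate, is the object of interest.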
The authors hope that their findings will encourage researchers, institutions, funding agencies and journals to support initiatives aimed at improving the rigor of research, ultimately strengthening the reliability of scientific knowledge.
The work was co-led by Tim Parker, professor of biology and environmental studies at Whitman College in Walla Walla, Washington, with co-lead authors Elliot Gould, a PhD student in the School of Biosciences at the University of Melbourne, Hannah Fraser, a postdoctoral researcher at the University of Melbourne, and Shinichi Nakagawa, professor and Canada Excellence Research Chair at the University of Alberta.