- Commentary
- Open access
A network of change: united action on research integrity
BMC Research Notes volume 15, Article number: 141 (2022)
Abstract
The last decade has seen renewed concern within the scientific community over the reproducibility and transparency of research findings. This paper outlines some of the responsibilities of stakeholders in addressing the systemic issues that contribute to this concern. In particular, this paper asserts that a united, joined-up approach is needed, in which all stakeholders, including researchers, universities, funders, publishers, and governments, work together to set standards of research integrity and engender scientific progress and innovation. Using two developments as examples, the adoption of Registered Reports as a discrete initiative and the use of open data as an ongoing norm change, we discuss the importance of collaboration across stakeholders.
Introduction
Evidence of a number of problematic practices and norms across the research cycle gives us good reason to doubt the credibility of much research [12, 15]. This, coupled with mostly unsuccessful attempts to replicate core research findings in psychology [18] and elsewhere [5], exemplifies the far-reaching issues of research integrity that the scientific community currently faces. Researchers prioritising research transparency, quality, and culture have driven changes in research norms across the world, with open science/scholarship initiatives playing a central role in developing and championing new approaches and standards.
Whilst the scale of change achieved in the last decade is notable, a central barrier to sustainable change in integrity norms is the extent to which all research stakeholders collaborate to embed and progress such developments [19]. Here, we summarise two developments, open data and Registered Reports, which can tackle this wider crisis of science through increased transparency, research quality, and changes to research culture. We discuss how the research community needs to collectively tackle such issues, acknowledging how action from one stakeholder can alter demands and value for other stakeholders, thus requiring coordinated action.
Main text
Open Data
One driver of the current crisis is a lack of transparency—a lack of open sharing of data and materials. As observed during the COVID-19 pandemic, making data openly accessible is transformative for scientific and public understanding, providing accountability within psychological research [1]. Unfortunately, sharing data has been uncommon historically, and when materials and data are not shared, researchers, funders, and journals cannot adequately assess the robustness of published work, slowing scientific progress. Openness is also an important facilitator of reproducibility, as researchers often struggle to reproduce analyses or conclusions without access to associated datasets (e.g., Wicherts et al., 2016).
Inaccessibility of data, and thus low transparency, makes attempts to build progressively upon previous research an inefficient use of funding and researcher hours. It is harder to replicate and establish the boundaries of effects, and to evaluate the quality of work. It can also hinder error detection and correction, and the identification of fraud (e.g., [22]). Therefore, research transparency can have multifaceted direct and indirect consequences for the quality and speed of research developments, and should be a priority for stakeholders.
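Open data and full reporting make such checks possible in practice. As a minimal illustration of statistics-based error detection, and not the specific analyses of [22], the sketch below implements a GRIM-style consistency check (after Brown and Heathers): given only a reported mean, a sample size, and the knowledge that responses were integer-valued, it tests whether the reported mean is arithmetically possible.

```python
def grim_consistent(reported_mean: float, n: int, decimals: int = 2) -> bool:
    """GRIM-style check: can a mean of n integer-valued responses,
    rounded to `decimals` places, equal `reported_mean`?"""
    target = round(reported_mean, decimals)
    # Any mean of n integers is k/n for an integer k; only sums near
    # reported_mean * n can round to the target, so test those.
    k = round(reported_mean * n)
    return any(round(c / n, decimals) == target for c in (k - 1, k, k + 1))

# With n = 28 integer responses, a reported mean of 5.19 is impossible,
# whereas 5.18 is attainable (145 / 28 = 5.1786, which rounds to 5.18).
print(grim_consistent(5.19, 28))  # False -> flag for follow-up
print(grim_consistent(5.18, 28))  # True
```

Checks of this kind only flag inconsistencies for follow-up; access to the underlying data is what allows the source of a discrepancy to be diagnosed and corrected.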
Advocating for transparency in research requires a cultural shift and a fundamental realignment of expectations. Currently, scientific norms encourage researchers to state that data are available “upon reasonable request”, but subsequent rates of data sharing by request are unacceptably low ([13]; Wicherts et al., 2016; [6]). A priority for the scientific community should be ensuring that data are safely preserved, conform to the FAIR principles (Findable, Accessible, Interoperable, Reusable) [23], and are openly available for re-use and re-analysis where possible. Table 1 explores the interconnected demands placed upon all stakeholders of research regarding open data.
Researchers who are willing to share their data face challenges in resourcing and in knowing how to do so ethically whilst conforming to FAIR principles [23]. To facilitate data sharing, co-ordinated change is needed across stakeholders. For example, changes to journal data availability statement policies can facilitate sharing practices (e.g., [10]), but this increases demands upon training, support, and infrastructure, with consequences for researchers, research support (e.g., libraries, technicians), universities, and funders [11]. Table 2 considers the various responsibilities each research stakeholder has towards co-ordinated reform of standards.
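As a concrete illustration of what FAIR-aligned sharing can involve at the researcher's end, the sketch below writes a minimal machine-readable metadata record to accompany a deposited dataset. The field names are loosely modelled on common repository metadata (DataCite-style) but are our illustrative assumption, not a mandated schema.

```python
import json

# Hypothetical metadata record accompanying a deposited dataset. Field
# names are illustrative (loosely DataCite-like), not a mandated schema.
metadata = {
    "identifier": "10.5555/example-doi",   # Findable: persistent identifier
    "title": "Example reaction-time dataset, Study 1",
    "creators": ["Example, A.", "Example, B."],
    "publicationYear": 2022,
    "rights": "CC-BY-4.0",                 # Reusable: explicit licence
    "formats": ["text/csv"],               # Interoperable: open file format
    "relatedIdentifiers": [                # links the data to the paper
        {"relationType": "IsSupplementTo",
         "relatedIdentifier": "10.1186/s13104-022-06026-y"}
    ],
}

# Depositing this alongside the data file makes the record indexable
# by repositories and discoverable by both humans and machines.
with open("dataset_metadata.json", "w", encoding="utf-8") as f:
    json.dump(metadata, f, indent=2)
```

Even a lightweight record like this addresses the "Findable" and "Reusable" principles directly: a persistent identifier, a clear licence, and an explicit link back to the associated publication.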
Registered Reports
Research quality is a vital component of research integrity: we cannot promote the integrity of research without first considering how its quality (i.e., robustness, reliability, and validity) can be improved. One barrier to research quality, actively propagated by many publishers and journals, is ‘publication bias’, whereby null/non-significant results are much less likely to be published than statistically significant findings. This incentivises questionable practices such as p-hacking data to ‘find’ a significant result, or selectively reporting significant results [2, 8]. Publication bias directly contributes to the crisis because it makes publication contingent upon the results of the work, rather than upon the theoretical significance and methodological rigour of the research.
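The distorting effect of publishing only significant results can be demonstrated with a small simulation (a toy model for illustration, not an analysis from [2, 8]): every simulated study tests a true null effect, so roughly 5% reach significance by chance alone, and a literature that publishes only those studies consists entirely of false positives.

```python
import random

random.seed(2022)

def null_study_significant(n: int = 30, z_crit: float = 1.96) -> bool:
    """Simulate a two-group study with NO true effect and report whether a
    simple z-test on the mean difference comes out 'significant' (p < .05)."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_diff = sum(a) / n - sum(b) / n
    se = (2 / n) ** 0.5        # SE of the difference of two means (known SD = 1)
    return abs(mean_diff) / se > z_crit

results = [null_study_significant() for _ in range(10_000)]
print(f"'Significant' purely by chance: {sum(results) / len(results):.1%}")  # ~5%

# If journals publish only the ~5% of studies that cross the threshold,
# every published 'effect' in this simulated literature is a false positive.
```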
Concerned by publication bias, researchers have developed several initiatives to improve research practices and standards in methodology and publishing. Registered Reports (RRs) are one such innovation in publication, deviating from the traditional route in which papers are peer-reviewed only after study completion. At Stage 1, the introduction, hypotheses/research questions, methods, and analyses undergo peer review before data collection. This feedback can identify flaws in the protocol and allows substantive changes to be made before resources (e.g., funding, participant time) are used. Work receives in-principle acceptance from the journal, whereby the subsequent completed (Stage 2) report will be published regardless of the findings, provided the authors have collected and reported the data according to the Stage 1 protocol [3]. RRs reduce publication bias because acceptance is based on the importance of the research question and methodological rigour, rather than the results. This reduces pressure to produce significant results and counters the incentives that drive selective reporting and other questionable research practices [4]. RRs are valuable amid ongoing concerns of widespread ‘false-positive findings’ in the published literature: hypotheses are supported much less frequently among RRs than among conventional research articles [21], providing initial evidence for the value of the approach (Fig. 1).
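To make the two-stage decision logic explicit, here is a minimal sketch (our illustration; the class and field names are hypothetical, not part of any actual editorial system): Stage 2 acceptance depends only on in-principle acceptance and adherence to the registered protocol, never on the results.

```python
from dataclasses import dataclass

@dataclass
class RegisteredReport:
    stage1_approved: bool      # Stage 1 peer review of rationale/methods/analyses
    followed_protocol: bool    # Stage 2: data collected and reported as registered
    results_significant: bool  # deliberately ignored by the decision below

    def stage2_decision(self) -> str:
        """In-principle acceptance: publication cannot hinge on the results."""
        if not self.stage1_approved:
            return "reject: no in-principle acceptance"
        if not self.followed_protocol:
            return "revise: deviations from the registered protocol need justification"
        return "accept: published regardless of outcome"

# A null result that followed its protocol is published...
print(RegisteredReport(True, True, results_significant=False).stage2_decision())
# ...while a significant result cannot rescue a study that deviated from its plan.
print(RegisteredReport(True, False, results_significant=True).stage2_decision())
```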
Further structural support is needed to implement RRs more widely, including training, funding, and wider journal adoption; Tables 1 and 2 outline the interconnected roles and responsibilities of research stakeholders for RRs. Registered Report Funding Partnerships have been proposed as a method of extending the RR model by integrating it with the grant funding process, such that researchers receive both funding and in-principle acceptance for publication based on the integrity of the theory and methods. Combining funding and publication decisions may streamline processes and reduce the burden on reviewers, while also providing the aforementioned benefits of RRs in reducing questionable research practices and publication bias [14]. Such RR-funding partnerships, and similar innovations for drug marketing authorisation [16], offer important examples of how stakeholders and processes can be unified to improve standards for research quality.
Outlook
Overcoming the issues underlying the current crisis requires united action across research stakeholders. For example, individuals may wish to conduct RRs, but journals must offer this option and funders must value and incentivise such work. Similarly, journals can mandate open data sharing, but researchers require training, support and infrastructure to facilitate this. Initiatives designed to improve research integrity should be mapped out with consideration to the different demands and value provided to each of the different stakeholder groups. This allows obstacles to be anticipated and encourages co-ordinated action, increasing the likelihood of such initiatives becoming sustainable.
Acknowledging our priorities of transparency, rigour and culture, open data and RRs represent only two initiatives which require more collective action. While we focused here on open data, transparency could also be prioritised by promoting open sharing of research materials, which rely on the same mechanisms. Similarly, we focused on RRs as one method to alleviate publication bias, but other initiatives, such as open peer review and crowd-sourced open review, also represent promising avenues to improve research integrity. Thus, the priorities and ideas here should be viewed as a starting point for a wider, more comprehensive consideration of how the transparency, quality, and culture of research, and thus integrity, can be improved together.
Availability of data and materials
Not applicable.
Abbreviations
- DORA: Declaration on Research Assessment
- FAIR: Findable, Accessible, Interoperable, Reusable
- RRs: Registered Reports
References
Besançon L, Peiffer-Smadja N, Segalas C, Jiang H, Masuzzo P, Smout C, Leyrat C. Open science saves lives: lessons from the COVID-19 pandemic. BMC Med Res Methodol. 2021;21(1):1–18.
Bruton SV, Brown M, Sacco DF. Ethical consistency and experience: an attempt to influence researcher attitudes toward questionable research practices through reading prompts. J Empir Res Hum Res Ethics. 2020;15(3):216–26.
Chambers CD. Registered reports: a new publishing initiative at Cortex. Cortex. 2013;49(3):609–10.
Chambers CD, Tzavella L. The past, present and future of registered reports. Nat Hum Behav. 2022;6:29–42.
Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, Nosek BA. Investigating the replicability of preclinical cancer biology. eLife. 2021;10:e71601.
Evans T. Developments in open data norms. J Open Psychol Data. 2022;10(1):3.
Evans TR, Pownall M, Collins E, Henderson EL, Pickering JS, O'Mahony A, Dumbalska T. A network of change: three priorities requiring united action on research integrity. UK Parliament; 2021. https://psyarxiv.com/r6gpj.
Fanelli D. Negative results are disappearing from most disciplines and countries. Scientometrics. 2012;90(3):891–904.
Fang F, Casadevall A. Retracted science and the retraction index. Infect Immun. 2011;79(10):3855–9.
Hardwicke TE, Mathur MB, MacDonald K, Nilsonne G, Banks GC, Kidwell MC, Frank MC. Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition. R Soc Open Sci. 2018;5(8):180448.
Houtkoop BL, Chambers C, Macleod M, Bishop DV, Nichols TE, Wagenmakers EJ. Data sharing in psychology: a survey on barriers and preconditions. Adv Methods Pract Psychol Sci. 2018;1(1):70–85.
Ioannidis JP. Why most published research findings are false. PLoS Med. 2005;2(8):e124.
Magee AF, May MR, Moore BR. The dawn of open access to phylogenetic data. PLoS ONE. 2014;9(10):e110268.
Munafò MR. Improving the efficiency of grant and journal peer review: registered reports funding. Nicotine Tob Res. 2017;19(7):773.
Munafò MR, Nosek BA, Bishop DV, Button KS, Chambers CD, Percie du Sert N, Ioannidis J. A manifesto for reproducible science. Nat Hum Behav. 2017;1(1):1–9.
Naudet F, Siebert M, Boussageon R, Cristea IA, Turner EH. An open science pathway for drug marketing authorization—registered drug approval. PLoS Med. 2021;18(8):e1003726.
Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, Yarkoni T. Promoting an open research culture. Science. 2015;348(6242):1422–5.
Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015. https://doi.org/10.1126/science.aac4716.
Robson SG, Baum MA, Beaudry JL, Beitner J, Brohmer H, Chin JM, Thomas A. Promoting open science: a holistic approach to changing behaviour. Collabra Psychology. 2021;7(1):30137.
Rouder JN. The what, why, and how of born-open data. Behav Res Methods. 2016;48(3):1062–9.
Scheel AM, Schijen MR, Lakens D. An excess of positive results: comparing the standard psychology literature with registered reports. Adv Methods Pract Psychol Sci. 2021;4(2):25152459211007468.
Simonsohn U. Just post it: the lesson from two cases of fabricated data detected by statistics alone. Psychol Sci. 2013;24(10):1875–88.
Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, Mons B. The FAIR guiding principles for scientific data management and stewardship. Sci Data. 2016;3(1):1–9.
Acknowledgements
Not applicable.
Funding
The authors have no funding specific to this work to declare.
Author information
Authors and Affiliations
Contributions
TRE was responsible for conceptualization, project administration, funding, writing (original draft) and writing (review & editing). MVP, EC, ELH, JSP, AO and MZ were responsible for conceptualization, writing (original draft) and writing (review & editing). MJ and TD were responsible for conceptualization and writing (review & editing). All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Competing interests
All authors have contributed to attempts to reform scientific practice. This includes through leadership, membership, roles, or collaboration within a number of groups including the Framework for Open and Reproducible Research Training (FORRT), Journal of Open Psychology Data, UK Reproducibility Network (UKRN), Registered Reports Steering Committee, and Society for the Improvement of Psychological Science (SIPS). A previous version of this work was submitted and published (RRE0007) as written evidence towards the UK Parliament’s Science and Technology Committee on Reproducibility and Research Integrity and was subsequently preprinted ([7]; https://psyarxiv.com/r6gpj).
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
About this article
Cite this article
Evans, T.R., Pownall, M., Collins, E. et al. A network of change: united action on research integrity. BMC Res Notes 15, 141 (2022). https://doi.org/10.1186/s13104-022-06026-y