
Promoting trust in research and researchers: How open science and research integrity are intertwined

Abstract

Proponents of open science often refer to issues pertaining to research integrity and vice versa. In this commentary, we argue that concepts such as responsible research practices, transparency, and open science are connected to one another, but that they each have a different focus. We argue that responsible research practices focus more on the rigorous conduct of research, transparency focuses predominantly on the complete reporting of research, and open science’s core focus is mostly about dissemination of research. Doing justice to these concepts requires action from researchers and research institutions to make research with integrity possible, easy, normative, and rewarding. For each of these levels from the Center for Open Science pyramid of behaviour change, we provide suggestions on what researchers and research institutions can do to promote a culture of research integrity. We close with a brief reflection on initiatives by other research communities and stakeholders and make a call to those working in the fields of research integrity and open science to pay closer attention to one another’s work.

Introduction

Highly publicised cases of research misconduct [1] have drawn negative attention to research integrity, with falsification, fabrication, and plagiarism considered the three ‘cardinal sins’ of a researcher. Similarly, concerns about reproducibility [2,3,4] (see Note 1) have triggered debates about the extent to which research is in crisis. At the same time, there is an increasing push to make research more open, which tends to carry more positive connotations. Open science has increasingly become a topic that is discussed and appreciated across different disciplines, and some funding agencies and scholarly journals have recently started to mandate open science practices such as making research data public [5].

Despite these different connotations, proponents of open science refer to issues pertaining to research integrity and vice versa [6]. In this commentary, we show how some of the frequently used concepts (research integrity, responsible research practices, transparency, and open science) interrelate. The upshot of our commentary is that these concepts are all crucial to strengthening trust in research and researchers by making research more traceable and verifiable. We believe that distinguishing their focus on particular phases of the research process furthers the understanding of these concepts, because it is precisely by virtue of their different foci that they become complementary and mutually reinforcing. We illustrate this using an example of an imaginary research project, and by providing examples of situations where one concept is missing. We then elaborate on the different factors that influence research integrity and connect this to what research institutions can do to foster research integrity.

Main text

Intertwining concepts

In a nutshell, we believe that responsible research practices focus more on the rigorous conduct of research, transparency focuses predominantly on the complete reporting of research at every stage of the research lifecycle, and open science’s core focus is mostly on the dissemination of research (see Fig. 1). That said, we emphasise that we are not suggesting these concepts are mutually exclusive; in a number of instances, the concepts can be, and in fact are, used interchangeably.

Fig. 1 Intertwined concepts of responsible research practices, transparency, and open science, and their foci. This Venn diagram illustrates how the different concepts interrelate to make research more traceable and verifiable, with the aim of increasing trust in research and researchers

Research integrity goes well beyond research misconduct, i.e., fabrication (making up data that do not exist), falsification (manipulating data or results without justification), and plagiarism (together, FFP) [7]. It refers to the “principles and standards that have the purpose to ensure validity and trustworthiness of research” [8]. Research integrity focuses on the behaviour of individual researchers, which is often grouped into three clusters: FFP, questionable research practices (QRPs), and responsible research practices (RRPs) [9]. FFP is obviously detrimental to trust in research, but we have some reason to believe it is relatively rare [10]. QRPs, which entail behaviours such as selective reporting, p-hacking, or HARKing (hypothesising after the results are known), are thought to be more common and to do more damage collectively than FFP [10, 11]. Although QRPs may result from sloppiness, engaging in these behaviours may also be intentional, aimed at producing clean, clear-cut research findings in the hope of getting them published in a prestigious journal. RRPs are behaviours a researcher can engage in that help ensure the quality and trustworthiness of their research. What these behaviours have in common is that they concern the way research is conducted. Examples include applying validated measurement instruments, consulting a statistician on the appropriateness of the proposed data analysis models, keeping a comprehensive record of the decisions made during the research process, and meticulously checking a manuscript to avoid errors [12].

Transparency comes into play when reporting how a study was or will be conducted, for example by writing a detailed research protocol before the start of the study and reporting all its results afterwards. Additional examples include open notebooks or open lab books [13], where researchers make the entire research process available, not just their protocols or final results. Here, a high level of detail is important: it enables readers or reviewers of the manuscript to draw their own conclusions about the credibility of the findings because they gain complete insight into how the study was set up and conducted.

Open science is an umbrella term. When it comes to what researchers can do to make their work more traceable and verifiable, a major part of open science concerns how research output is disseminated [14,15,16]. Open science has also broadened the traditional interpretation of what counts as research output: its proponents plead for publicly sharing study methods via preregistration (e.g., via osf.io), depositing or publishing study protocols and data analysis plans in a relevant repository (e.g., protocols.io or plos.org/protocols), publicly sharing the code used to analyse the data as well as the complete dataset and its metadata, and making study findings rapidly and freely available as a preprint, ideally followed by an open access publication with open peer review.

Let’s review an example research project to see how these concepts strengthen one another. A research team is interested in the effect of Covid-19 restrictions on adolescents. One team member identifies whether validated questionnaires geared to the target population are available that are relevant for answering the research question of interest (conduct of research). Another team member calculates the sample size needed to detect the effect size of interest (conduct of research). The team then incorporates this information and starts drafting a study protocol. They preregister this protocol in a publicly accessible repository, which allows reviewers or colleagues to assess whether they have done what they promised to do (reporting of research). The team proceeds with data collection and analysis. They use a reporting guideline to structure the write-up of their findings and to ensure that relevant details are included in the manuscript (reporting of research). Because the team wants to practise open science, they publish their data in a way that ensures it is Findable, Accessible, Interoperable and Reusable (FAIR [17]; dissemination of research). To make the data useful to others, the team takes great care to write a comprehensive codebook linked to their dataset, explaining the metadata and variable names (reporting of research).
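The sample-size step in this example can be sketched in code. The following is a minimal illustration under assumed conditions (a two-group comparison of means with a standardised effect size, a two-sided alpha of 0.05, and 80% power, using the common normal-approximation formula); it is not the imaginary team's actual calculation, and the function name is ours:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(effect_size: float,
                          alpha: float = 0.05,
                          power: float = 0.80) -> int:
    """Approximate n per group for a two-sample comparison of means,
    using the normal approximation:
        n = 2 * (z_{1 - alpha/2} + z_{power})**2 / d**2
    where d is the standardised effect size (Cohen's d)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for a two-sided test
    z_beta = z.inv_cdf(power)           # quantile corresponding to desired power
    n = 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2
    return ceil(n)                      # round up: sample sizes are whole people

# For a medium effect (Cohen's d = 0.5) this gives 63 participants per group.
n = sample_size_per_group(0.5)
```

Exact calculations based on the noncentral t distribution give a slightly larger n, which is one reason why consulting a statistician, as the example suggests, remains the safer route.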

Now let’s review an example of research that is open and transparent, but not rigorous. A study can be preregistered with its full study protocol and its data shared in a FAIR format, but if those data were collected using sloppy methods (e.g., the study was not randomised, had a small sample size, was not blinded, reported irrelevant outcomes, or assessed relevant outcomes poorly), it is still a poor-quality study. Here, open science enables greater transparency, allowing others to assess the protocol, methods, data, analysis, and conclusions to determine whether the study was rigorously done and bias was avoided. In this way, research quality can be assessed because all parts of the research are open to scrutiny by others.

A study can also have applied rigorous methods and be transparently reported, but if it is not openly accessible, it will only be read and used by a subset of the target community. Most researchers know some way around paywalls, but the study could be missed by relevant policy makers, potentially leading to distorted policies or guidelines.

A study can also be rigorously conducted and openly accessible, but if it is not transparently reported, there is a risk that readers do not fully understand the data collection and analysis methods applied. This could lead to flawed interpretations, or perhaps to disregarding a useful study altogether because its credibility is believed to be low.

These examples highlight some of the ways in which RRPs, transparency, and open science reinforce one another. RRPs help lower the risk of bias and strengthen study quality. Open science facilitates transparency by providing the infrastructure for sharing study details (without space, word, or paywall limitations). Some open science formats, such as preregistration, may help detect QRPs such as selective reporting and data-driven modifications of the research protocol and the data-analysis plan, since they offer readers full and open access to all study details determined a priori. Transparency about the approach taken and the data generated is crucial for open science practices to be meaningful. Transparent reporting also makes it possible, if necessary, to carry out a replication study [18], or to reuse the data for a pooled data synthesis or to answer other research questions. In these ways, trust in research and researchers may be (justifiably) strengthened.

What can researchers and research institutions do?

The actions described above are ultimately in the hands of individual researchers. But what drives researchers to conduct their work with integrity? Building on the Center for Open Science pyramid of behaviour change [19], we review what researchers and research institutions can do to promote a culture of research integrity.

To make conducting research with integrity possible, it is essential that research institutions have the necessary infrastructure to curate and store data in accordance with FAIR principles.

To make it easy, research institutions should offer the right support for researchers, ranging from research data management experts to statisticians. It also means that the systems researchers must use to enable long-term storage of data, or to request statistical support, are user-friendly. In addition, institutions can support their researchers by providing state-of-the-art research integrity training [20]. Such training can also take place more informally through community efforts (e.g., ReproducibiliTea (reproducibilitea.org) or Open Science Communities (openscience.nl)). A combination of education formally integrated into curricula or professional development and more informal initiatives is probably the quickest and most efficient way to achieve a change in culture. Institutions can also support faculty tasked with mentoring more junior researchers by training them in performing these responsibilities [21].

To make it normative, individual researchers can further promote cultural change by role-modelling rigorous conduct of research or by teaming up in grassroots initiatives that lobby for change at the institutional level (e.g., the United Kingdom Reproducibility Network, ukrn.org). Factors like mentoring for survival (i.e., socialising early career researchers into the ‘art’ of cutting corners with a view to maximising the number of publications, citations, and grants [22]) may undermine research integrity. On the other hand, adherence to scientific norms, such as assessing validity based on the research rather than the researcher and critically appraising research findings before accepting them (also known as Mertonian norms [23]), has been shown to reduce the likelihood of QRPs and research misconduct while promoting RRPs [12]. Other factors like responsible mentoring could also promote research integrity [24].

To make it rewarding, research institutions should pay due attention to rewarding their researchers for conducting their work with integrity. It has become apparent that researchers who feel treated unfairly by their department or institution may be more likely to engage in questionable research practices, or worse, to compensate for this perceived unfairness [12]. Research institutions should have policies and procedures in place to assess researchers fairly [25] and ensure that research integrity is embedded in those policies [26].

Other perverse incentives also play an important role, such as the quantification and commodification of research [27]. This results in assessment systems that make funding, quantitative scientific output, and student numbers important parameters in the financial resources available to universities [28]. There are various international efforts to change researcher assessment systems; one of the most visible is the San Francisco Declaration on Research Assessment (DORA, see sfdora.org). A substantial number of research institutions have signed DORA and are committed to implementing its recommendations in their internal criteria for promotion, meaning that they also reward researchers who make their work openly and transparently available. More specifically, the Hong Kong Principles [29] outline how the assessment of researchers can be reformed to foster research integrity and open science practices.

Outlook

We have discussed concepts mainly with empirical quantitative research in mind. It is worth noting, however, that there are interesting initiatives going on in parts of the humanities [30]. For example, researchers are trying to perform a replication study in the field of history [31]. In addition, there is discussion about preregistering some forms of qualitative research [32].

Whereas we focused on what individual researchers and research institutions can do to promote the quality and trustworthiness of research, it is fair to say that other stakeholders like journals and funders play an equally important role, particularly in shaping factors at the more systemic level. Ultimately improving trust in research and researchers will require concerted efforts from all stakeholders.

Our take-home message is that researchers interested in open science should pay attention to the work of their peers in the adjacent community of research integrity, and vice versa. Both communities are growing, and to prevent duplicated effort it is crucial to keep track of one another’s work and to collaborate more closely in promoting trust in research and researchers.

Availability of data and materials

Not applicable.

Notes

  1. Reproducibility and replicability are often used interchangeably; here we use the formulation based on the report by the National Academies [3], but other conceptualisations exist, too [4].

Abbreviations

FFP: Falsification, fabrication, and plagiarism

QRPs: Questionable research practices

RRPs: Responsible research practices

FAIR: Findable, Accessible, Interoperable, Reusable

References

  1. Levelt Committee, Noort Committee, Drenth Committee. Flawed science: The fraudulent research practices of social psychologist Diederik Stapel. 2012.

  2. Baker M. 1500 scientists lift the lid on reproducibility. Nature. 2016;533(7604):452–4.


  3. National Academies of Sciences, Engineering, and Medicine. Reproducibility and Replicability in Science. Washington (DC): National Academies Press (US); 2019. https://doi.org/10.17226/25303.

  4. Goodman SN, Fanelli D, Ioannidis JP. What does research reproducibility mean? Sci Transl Med. 2016;8(341):341. https://doi.org/10.1126/scitranslmed.aaf5027.


  5. Kozlov M. NIH issues a seismic mandate: share data publicly. Nature. 2022;602(7898):558–9.


  6. Munafo MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, du Sert NP, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1:0021.


  7. All European Academies. The European code of conduct for research integrity. 2017.

  8. Bouter L, Horn L, Kleinert S. Research integrity and societal trust in research. S Afr Heart J. 18(2). https://doi.org/10.24170/18-2-4879.

  9. Steneck NH. Institutional and individual responsibilities for integrity in research. Am J Bioeth. 2002;2(4):51–3.


  10. Xie Y, Wang K, Kong Y. Prevalence of research misconduct and questionable research practices: a systematic review and meta-analysis. Sci Eng Ethics. 2021;27(4):41.


  11. Haven TL, Tijdink JK, Pasman HR, Widdershoven G, Ter Riet G, Bouter LM. Researchers’ perceptions of research misbehaviours: a mixed methods study among academic researchers in Amsterdam. Res Integr Peer Rev. 2019;4:25. https://doi.org/10.1186/s41073-019-0081-7.


  12. Gopalakrishna G, Ter Riet G, Vink G, Stoop I, Wicherts JM, Bouter LM. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: a survey among academic researchers in The Netherlands. PLoS ONE. 2022;17(2): e0263023.


  13. Schapira M, The Open Lab Notebook Consortium and Harding RJ. Open laboratory notebooks: good for science, good for society, good for scientists [version 2; peer review: 2 approved, 1 approved with reservations]. F1000Res 2019, 8:87. https://doi.org/10.12688/f1000research.17710.2.


  14. ERAC Standing Working Group on Open Science and Innovation (SWG OSI). Guideline Report on Research Integrity and Open Science. EUROPEAN UNION. 2021. https://data.consilium.europa.eu/doc/document/ST-1207-2021-INIT/en/pdf.

  15. National Academies of Sciences, Engineering, and Medicine. Open Science by Design: Realizing a Vision for 21st Century Research. Washington, DC: The National Academies Press; 2018. p. 232.


  16. UNESCO. UNESCO Recommendation on Open Science. 2021. https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en. Accessed 2 June 2022.

  17. Wilkinson MD, Dumontier M, Aalbersberg IJ, Appleton G, Axton M, Baak A, et al. The FAIR guiding principles for scientific data management and stewardship. Sci Data. 2016;3:160018. https://doi.org/10.1038/sdata.2016.18.


  18. Errington TM, Denis A, Perfito N, Iorns E, Nosek BA. Challenges for assessing replicability in preclinical cancer biology. Elife. 2021. https://doi.org/10.7554/eLife.67995.


  19. Nosek BA. Strategy for culture change. Center for Open Science; 2019. https://www.cos.io/blog/strategy-for-culture-change. Accessed 31 May 2022.

  20. Labib K, Evans N, Pizzolato D, Aubert-Bonn N, Widdershoven G, Bouter L, et al. Co-creating research integrity education guidelines for research institutions. MetaArXiv. 2022. https://doi.org/10.31222/osf.io/gh4cn.

  21. Haven T, Bouter L, Mennen L, Tijdink J. Superb supervision: A pilot study on training supervisors to convey responsible research practices onto their PhD candidates. Account Res. 2022;1–18. https://doi.org/10.1080/08989621.2022.2071153.

  22. Roumbanis L. Symbolic violence in academic life: a study on how junior scholars are educated in the art of getting funded. Minerva. 2019;57:22.


  23. Merton RK. Science and technology in a democratic order. J Legal Political Sociol. 1942;1:11.


  24. Abdi S, Pizzolato D, Nemery B, Dierickx K. Educating PhD students in research integrity in Europe. Sci Eng Ethics. 2021;27(1):5.


  25. Aubert Bonn N, Bouter L. Research assessments should recognize responsible research practices: narrative review of a lively debate and promising developments. Metaarxiv. 2021. https://doi.org/10.31222/osf.io/82rmj.


  26. Mejlgaard N, Bouter LM, Gaskell G, Kavouras P, Allum N, Bendtsen AK, et al. Research integrity: nine ways to move from talk to walk. Nature. 2020;586(7829):358–60.


  27. Edwards M, Roy S. Academic research in the 21st century: maintaining scientific integrity in a climate of perverse Incentives and hypercompetition. Environ Eng Sci. 2017;34(1):10.


  28. Halffman W, Radder H. The academic manifesto: from an occupied to a public university. Minerva. 2015;53(2):165–87.


  29. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong principles for assessing researchers: fostering research integrity. PLoS Biol. 2020. https://doi.org/10.1371/journal.pbio.3000737.


  30. Peels R, Bouter L. Replication and trustworthiness. Account Res. 2021. https://doi.org/10.1080/08989621.2021.1963708.


  31. Van Eyghen H, Pear R, Peels R, Bouter L, van den Brink G, van Woudenberg R. Testing the relation between religious and scientific reform: a direct replication of John Hedley Brooke’s 1991 study. Registration, 2022. https://doi.org/10.17605/OSF.IO/XNDWT.

  32. Haven TL, Errington TM, Gleditsch KS, van Grootel L, Jacobs AM, Kern FG, et al. Preregistering qualitative research: a Delphi Study. Int J Qual Meth. 2020;19:13.



Acknowledgements

Not applicable.

Funding

The authors received no specific funding for this manuscript.

Author information

Contributions

The authors met and brainstormed about a previous presentation by LB. TH drafted the initial version. LB, GG, JT and DvdS contributed substantially to the design of the arguments presented and the work’s conceptual clarity. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Tamarinde Haven.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Haven, T., Gopalakrishna, G., Tijdink, J. et al. Promoting trust in research and researchers: How open science and research integrity are intertwined. BMC Res Notes 15, 302 (2022). https://doi.org/10.1186/s13104-022-06169-y

