What senior academics can do to support reproducible and open research: a short, three-step guide

Abstract

Increasingly, policies are being introduced to reward and recognise open research practices, while the adoption of such practices into research routines is being facilitated by many grassroots initiatives. However, despite this widespread endorsement and support, as well as various efforts led by early career researchers, open research is yet to be widely adopted. For open research to become the norm, initiatives should engage academics from all career stages, particularly senior academics (namely senior lecturers, readers, and professors), given their routine involvement in determining the quality of research. Senior academics, however, face unique challenges in implementing policy changes and supporting grassroots initiatives. Given that, like all researchers, senior academics are motivated by self-interest, this paper lays out three feasible steps that senior academics can take to improve the quality and productivity of their research while also engendering open research. These steps involve changing (a) hiring criteria, (b) how scholarly outputs are credited, and (c) how we fund and publish in line with open research principles. The guidance we provide is accompanied by material for further reading.

Introduction

Increasing evidence shows that research in the biomedical and social sciences, and beyond, is often difficult to replicate and/or reproduce [1,2,3,4,5]. One cause of this ‘replication crisis’ is thought to be misplaced incentives that undermine research quality. For instance, publishers and funders generally give a selective advantage to novel or statistically significant results, thereby devaluing efforts to confirm published research [6, 7]. Further, employment evaluation criteria focus unduly on individual achievement, publication track records, and grant funding acquisition, which can hamper data sharing and collegiality while incentivising publishing in quantity at the cost of quality [8,9,10,11]. Many and varied changes in policies and procedures seek to realign incentives to reward transparent, accessible, and reproducible research [12,13,14], while grassroots initiatives are removing barriers to learning and adopting best research practice [1, 15,16,17,18,19,20,21,22,23,24]. However, despite significant support, widespread adoption of open and reproducible research remains elusive [25,26,27,28]. Further, little attention is paid to how the current research culture contributes to bullying, harassment, and poor mental health, and to the resulting rising tide of researchers leaving academia [29].

For open research to become the norm, further engagement and support must come from senior academics, given their routine involvement in supervision, peer review, journal editing, hiring, and informing institutional policies. Senior academics, however, face unique social and practical barriers. For example, setting higher quality standards for junior researchers can be negatively perceived as ‘ladder pulling’ [30], while the widely held perception that open research can stifle innovation or long-held academic freedoms can make researchers at all career stages hesitant to change current practices [27, 31,32,33]. Further, applying for grants [34,35,36] and teaching [37] occupy an increasing amount of work time, which means that attending training, developing open research practices, or changing long-standing research routines can be costly and therefore deprioritised. Finally, the growing literature on how to adopt open research is fast becoming overwhelming and contradictory, and is mainly tailored to early career researchers [18, 23, 25, 26].

Therefore, we present a short guide highlighting three easy steps to introduce open research ideas and practices into existing research routines while avoiding the barriers mentioned above. These steps include (1) modifying hiring criteria, (2) crediting scholarly outputs with the contributorship model, and (3) securing grant funding and publishing in line with open research. Following the lead of similar initiatives, these steps are designed to appeal to the self-interests of researchers to motivate their engagement with open research practices [23, 38, 39], with a unique focus on the viewpoint of senior academics. This is supplemented by materials for further reading.

Main text

Step 1: Change how you hire

Evidence shows that open research practices confer a competitive advantage in publishing scholarly outputs and acquiring grant funding (see Table 1), meaning that individuals with open research expertise are a desirable asset to lab groups or departments. However, such individuals are likely to be overlooked in hiring and promotion decisions because of the undue weight given to evaluation metrics such as h-indices and journal impact factors [9, 40]. Further, as open research is rarely mentioned in job descriptions, sought-after candidates cannot easily identify potential employers that value open research. We therefore encourage senior academics (where possible) to modify their hiring criteria to incorporate open research practices that support research quality and productivity.

Table 1 Open research practices and the career benefits they confer. Definitions are lifted from [43]

Modelled on a crowd-sourced initiative [41], one feasible approach is to modify desirable/essential person specification criteria to include a track record of one or more open research practices (e.g., open data, open materials/code, pre-registration, open access publication, publishing preprints, and/or open peer review; see Table 1 for definitions). Criteria should be stated clearly and publicly in advertised job descriptions and/or hiring policies, while decisions about which open research practices to include should be made in consultation with faculties/departments to avoid unnecessarily disadvantaging staff/students. For example, where a track record of open access publications is not expected (e.g., for a PhD student/postdoctoral researcher), proxies for productivity or keen engagement in open research can include preprints, open materials, or open peer review. Instructive examples of how this can be achieved can be found in [42] and in our Additional file 1: Table S1.

Step 2: Change authorship to contributorship

The main currency for career progression is authorship on scholarly outputs [11, 61, 62]. As a result, authorship disputes are widespread, leading to delays in submissions, conflicts among collaborators and journal editors [63,64,65], and/or retractions [66,67,68]. Such intense competition over credit for scholarly outputs has significantly disadvantaged those in more precarious positions (such as black and minority ethnic groups, individuals on fixed-term contracts, and women), with 40% of early-career researchers reporting that credit for their work was given to other academics or research staff [29, 69, 70]. As large collaborative projects become the norm, contributions will be more difficult to dissect and therefore authorship-related issues will become more common [71,72,73,74].

Issues with assigning credit for scholarly outputs stem in part from the lack of consensus-based and comprehensive standards. The most commonly used standard, from the International Committee of Medical Journal Editors (ICMJE; also known as the Vancouver guidelines), stipulates that authorship is contingent on substantive contributions (e.g., conceptual design; data collection, analysis, or interpretation; drafting and/or revising the manuscript) [75]. However, the ICMJE guidelines offer little guidance on contentious issues, such as designating first, last, or corresponding authorship; assigning responsibility for the research; or dealing with large collaborations or other contributions (such as from librarians and statisticians) [76, 77]. These issues can be avoided with contributorship models of authorship, such as the Contributor Roles Taxonomy (CRediT), a consensus-based classification system distinguishing 14 contributor roles (see Additional file 1: Table S2) that has been adopted in the submission process of leading publishers (e.g., Elsevier, PLoS, Wiley, and Springer) and hundreds of journals [78, 79].

CRediT documents individual contributions to a scholarly output in a standardised, accessible, and discoverable manner. This can be done at any stage of a research project, although the earlier the better, to manage the expectations of team members and to minimise future authorship disputes. The web-based application Tenzing automates this process and produces a CRediT-compatible manuscript for publication [80]. Although the contributor roles are fixed, their definitions can be customised to a particular research discipline for clarity. Further, CRediT can provide a useful framework for deciding on authorship designation. For instance, the degree of contribution can be specified as ‘lead’, ‘equal’, or ‘supporting’, which can inform authorship order [71, 73]. Moreover, contributions to ‘data curation’, ‘project administration’, and ‘validation’ can inform the choice of corresponding author. CRediT also offers unique opportunities to improve productivity, particularly in terms of fostering collaborations, by signalling the expertise of members of your research group, recognising individual contributions to large teams, and acknowledging roles that tend to be overlooked despite providing valuable insight or support (e.g., project administration). See Additional file 1: Table S3.
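To illustrate how contributorship information can be captured in a standardised, machine-readable form, the short sketch below (our illustration, not part of CRediT or Tenzing) records hypothetical contributors against the 14 CRediT roles with a ‘lead’/‘equal’/‘supporting’ degree and prints a contributorship statement. The role names follow the CRediT taxonomy; the contributor names and helper function are invented for the example.

```python
# Minimal sketch: recording CRediT roles and degrees in a machine-readable
# structure and rendering a contributorship statement.
# Contributor names are hypothetical; role names follow the CRediT taxonomy.

CREDIT_ROLES = {
    "Conceptualization", "Data curation", "Formal analysis",
    "Funding acquisition", "Investigation", "Methodology",
    "Project administration", "Resources", "Software", "Supervision",
    "Validation", "Visualization", "Writing - original draft",
    "Writing - review & editing",
}
DEGREES = {"lead", "equal", "supporting"}

contributions = {
    "A. Author": {"Conceptualization": "lead", "Writing - original draft": "lead"},
    "B. Author": {"Formal analysis": "lead", "Data curation": "equal",
                  "Writing - review & editing": "supporting"},
    "C. Author": {"Supervision": "lead", "Funding acquisition": "lead",
                  "Writing - review & editing": "equal"},
}

def contributorship_statement(contribs):
    """Group contributions by CRediT role and list each contributor with their degree."""
    by_role = {}
    for person, roles in contribs.items():
        for role, degree in roles.items():
            if role not in CREDIT_ROLES or degree not in DEGREES:
                raise ValueError(f"Unrecognised role/degree: {role!r}, {degree!r}")
            by_role.setdefault(role, []).append(f"{person} ({degree})")
    return "; ".join(f"{role}: {', '.join(people)}"
                     for role, people in sorted(by_role.items()))

print(contributorship_statement(contributions))
```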

Step 3: Change how you fund and publish with open research

Funders and journals are beginning to favour open research practices through novel initiatives and policy changes. Thus, to be in a position of strength, senior academics should engage with open research when seeking funding and when publishing their research outputs.

Policy changes

Funders and journals widely endorse the principle that research data should be ‘as open as possible, as closed as necessary’, and new policies are being introduced to further compliance with this principle [81]. Most funders now also require a data management plan (i.e., a detailed specification of how data or materials will be curated, shared, or used) as standard [82]. Data availability statements, indicating where data and materials are available or specifying reasons for exemptions from data sharing, are also compulsory for submissions to a growing number of journals, including Science, Nature, and the BMJ [83,84,85]. Data can also be archived and shared through data journals (such as [84,85,86,87,88,89,90]) or in third-party repositories (e.g., GitHub, Open Science Framework, and Zenodo), which allow control over how data and code are used and shared by assigning licences and DOIs [1, 49, 93] (see Additional file 1: Table S4).
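As a concrete illustration of depositing data in a third-party repository that assigns a licence and a DOI, the sketch below uses Zenodo's documented REST deposit API via the Python requests library. This is our illustrative example rather than a workflow described in the paper; the token, file name, and metadata values are placeholders, and endpoint and field names should be checked against Zenodo's current API documentation.

```python
# Sketch: deposit a dataset on Zenodo so that it receives a DOI and an
# explicit licence. Requires a personal access token (placeholder below);
# verify endpoints and metadata fields against Zenodo's API documentation.
import requests

TOKEN = "YOUR-ZENODO-TOKEN"                      # hypothetical placeholder
BASE = "https://zenodo.org/api/deposit/depositions"
params = {"access_token": TOKEN}

# 1. Create an empty deposition.
dep = requests.post(BASE, params=params, json={}).json()

# 2. Upload the data file into the deposition's file bucket.
with open("trial_data.csv", "rb") as fh:         # hypothetical file name
    requests.put(f"{dep['links']['bucket']}/trial_data.csv",
                 data=fh, params=params)

# 3. Attach metadata, including the licence under which the data are shared.
metadata = {"metadata": {
    "title": "Example open dataset",
    "upload_type": "dataset",
    "description": "Anonymised data supporting an example study.",
    "creators": [{"name": "Author, A."}],
    "license": "cc-by-4.0",                      # check Zenodo's licence vocabulary
}}
requests.put(f"{BASE}/{dep['id']}", params=params, json=metadata)

# 4. Publish: Zenodo mints a DOI that can be cited in the data
#    availability statement.
published = requests.post(f"{BASE}/{dep['id']}/actions/publish",
                          params=params).json()
print(published.get("doi"))
```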

Perhaps the most significant and least well-known policy changes concern preprints, which allow scholarly outputs to be disseminated in a faster, more impactful, and more accessible manner. A preprint is a time-stamped, non-peer-reviewed manuscript made freely and publicly accessible via an online server (e.g., PsyArXiv, LawArXiv), typically within 72 hours of submission. Thus, the significant time lag between manuscript submission and publication (a median of 165 days) [94] and often prohibitive journal open access fees [95] do not apply to preprints. Because of this faster and wider dissemination, grantees are increasingly required to deposit preprints, particularly if the funded research is of significant public health benefit (e.g., Bill & Melinda Gates Foundation) [96]. Further, a majority of journals permit preprints to be shared before or during manuscript submission [97] (Additional file 1: Table S4), presumably due to evidence that journal articles linked to preprints have higher citation rates [53, 55]. Influential journals (e.g., BMJ, The Lancet) and funders (e.g., the National Institutes of Health, Wellcome Trust) now explicitly state that preprints can be cited [98, 99]. Preprints can additionally be referenced in researcher track records when applying for funding [96] and included in submissions to the UK's national research assessment exercise, the Research Excellence Framework (REF) [98].

Funding opportunities

The move by funders towards investing in open research is set to gather pace, particularly following the invaluable role open research played in the COVID-19 pandemic [100]. However, identifying and keeping track of open research funding opportunities is challenging. We therefore provide key examples of funding opportunities supporting open research in Table 2, and have additionally curated a list of funding opportunities obtained through data scraping, available at https://lorenzada.github.io/openresearch_funding/. For this list, we selected funding opportunities mentioning keywords related to open research (e.g., replication study, reproducible code, preprint) after scraping the NIH and UKRI funding websites. Of note, websites were selected for scraping only where automated data collection was permitted. For further information, please refer to the open code at https://github.com/LorenzaDA/openresearch_funding. This list not only illustrates the mounting financial commitment of grant funders to open research practices and projects, but will hopefully also encourage senior academics to apply for such funding, or to support applications from early career researchers in their teams.
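To show the kind of keyword filtering this involves, the sketch below is a simplified illustration of the selection step, not the authors' actual pipeline (which is available at the GitHub link above): scraped funding-call records are kept if they mention open research terms. The example records are invented.

```python
# Simplified illustration of keyword-filtering scraped funding calls for
# open research terms. The records below are hypothetical; in practice they
# would be scraped from funder websites that permit automated collection.
import re

OPEN_RESEARCH_TERMS = [
    "replication study", "reproducib", "preprint", "open access",
    "open data", "open source", "pre-registration", "preregistration",
]
pattern = re.compile("|".join(map(re.escape, OPEN_RESEARCH_TERMS)), re.IGNORECASE)

calls = [
    {"funder": "NIH", "title": "Administrative supplements for reproducible code"},
    {"funder": "UKRI", "title": "Responsive mode grants in molecular biology"},
    {"funder": "NIH", "title": "Support for replication study proposals"},
]

open_research_calls = [c for c in calls if pattern.search(c["title"])]
for call in open_research_calls:
    print(f"{call['funder']}: {call['title']}")
```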

Table 2 Examples of funding opportunities supporting or rewarding open research, with accompanying text lifted directly from funders’ websites

Outlook

‘We create our culture, invisible though it may be, and we therefore have it collectively within ourselves to change our culture for the better’ ([118], p. 92).

Academic researchers typically aim to reach the highest standards of research practice but are hampered by perverse incentives and cultural norms. Senior academics in particular face additional, unique challenges, especially prohibitive workloads, that prevent them from supporting or practising open research even when they view it as necessary or worthwhile. This is a problem. The success of policies and grassroots initiatives aiming to normalise open research relies on the collective action of researchers, but only when open research is practised routinely by those in positions of seniority can a positive change in research culture and quality take effect. In this context, we have sought to lower the barriers to entry into open research for senior academics and to highlight that open research is advantageous for research grant capture, productivity, and integrity. More remains to be done, but our short, easy-to-follow, three-step guide will hopefully mark a first step towards wider adoption of open research by many senior academics.

Availability of data and materials

Not applicable.

References

  1. Munafò M, Nosek BA, Bishop DVM, Button KS, Chambers C, du Sert NP, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1(1):1–9.

  2. Baker M, Dolgin E. Cancer reproducibility project releases first results. Nat News. 2017;541(7637):269.

  3. Borregaard MK, Hart EM. Towards a more reproducible ecology. Ecography. 2016;39(4):349–53.

  4. Open Science Collaboration. Estimating the reproducibility of psychological science. Science [Internet]. 2015 Aug 28 [cited 2020 Jul 14];349(6251). Available from: https://science.sciencemag.org/content/349/6251/aac4716.

  5. Drucker DJ. Never waste a good crisis: confronting reproducibility in translational research. Cell Metab. 2016;24(3):348–60.

  6. Fanelli D. Negative results are disappearing from most disciplines and countries. Scientometrics. 2012;90(3):891–904.

  7. Smaldino PE, McElreath R. The natural selection of bad science. R Soc Open Sci. 2016;3(9):160384.

  8. John LK, Loewenstein G, Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol Sci. 2012;23(5):524–32.

  9. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PLOS Biol. 2018;16(3):e2004089.

  10. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong principles for assessing researchers: fostering research integrity. PLOS Biol. 2020;18(7):e3000737.

  11. Rice DB, Raffoul H, Ioannidis JPA, Moher D. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ. 2020;m2081.

  12. Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, et al. Promoting an open research culture. Science. 2015;348(6242):1422–5.

  13. DORA—San Francisco Declaration on Research Assessment (DORA) [Internet]. [cited 2020 Aug 2]. Available from: https://sfdora.org/.

  14. Plan S. Principles and Implementation|Plan S [Internet]. 2020 [cited 2020 Aug 2]. Available from: https://www.coalition-s.org/addendum-to-the-coalition-s-guidance-on-the-implementation-of-plan-s/principles-and-implementation/.

  15. Aczel B, Szaszi B, Sarafoglou A, Kekecs Z, Kucharský Š, Benjamin D, et al. A consensus-based transparency checklist. Nat Hum Behav. 2020;4(1):4–6.

  16. Allen C, Mehler DMA. Open Science challenges, benefits and tips in early career and beyond. 2019 [cited 2019 Apr 23]; Available from: https://osf.io/3czyt.

  17. Button KS, Chambers CD, Lawrence N, Munafò MR. Grassroots training for reproducible science: a consortium-based approach to the empirical dissertation. Psychol Learn Teach. 2020;19(1):77–90.

  18. Crüwell S, van Doorn J, Etz A, Makel MC, Moshontz H, Niebaum JC, et al. Seven easy steps to open science. Z Für Psychol. 2019;227(4):237–48.

  19. DeBruine L, Barr D. Data Skills for Reproducible Science [Internet]. Zenodo; 2019 [cited 2020 Aug 2]. Available from: https://zenodo.org/record/3564555/export/csl#.XyaWvpNKh-W.

  20. Etz A, Gronau QF, Dablander F, Edelsbrunner PA, Baribault B. How to become a Bayesian in eight easy steps: an annotated reading list. Psychon Bull Rev. 2018;25(1):219–34.

  21. Kathawalla U-K, Silverstein P, Syed M. Easing Into Open Science: A Guide for Graduate Students and Their Advisors. 2020 May 8 [cited 2020 Jul 13]; Available from: https://psyarxiv.com/vzjdp/.

  22. Klein O, Hardwicke TE, Aust F, Breuer J, Danielsson H, Mohr AH, et al. A practical guide for transparency in psychological science. Collabra Psychol. 2018;4(1):20.

  23. McKiernan EC, Bourne PE, Brown CT, Buck S, Kenall A, Lin J, et al. How open science helps researchers succeed. eLife [Internet]. 2016 Jul 7 [cited 2019 Apr 23];5. Available from: https://elifesciences.org/articles/16800.

  24. Sarabipour S, Debat HJ, Emmott E, Burgess SJ, Schwessinger B, Hensel Z. On the value of preprints: an early career researcher perspective. PLOS Biol. 2019;17(2):e3000151.

  25. Abele-Brehm AE, Gollwitzer M, Steinberg U, Schönbrodt FD. Attitudes toward Open Science and public data sharing: a survey among members of the German Psychological Society. Soc Psychol. 2019;50(4):252–60.

  26. Ali-Khan SE, Harris LW, Gold ER. Motivating participation in open science by examining researcher incentives. In: Rodgers PA, editor. eLife. 2017;6:e29319.

  27. Houtkoop BL, Chambers C, Macleod M, Bishop DVM, Nichols TE, Wagenmakers E-J. Data sharing in psychology: a survey on barriers and preconditions. Adv Methods Pract Psychol Sci. 2018. https://doi.org/10.1177/2515245917751886.

  28. Chin J, Zeiler K, Dilevski N, Holcombe A, Jeffries RG-, Bishop R, et al. The transparency of quantitative empirical legal research (2018–2020). 2021;38.

  29. Wellcome. What researchers think about the culture they work in [Internet]. 2019 [cited 2020 Jun 3]. Available from: https://wellcome.ac.uk/reports/what-researchers-think-about-research-culture.

  30. Poldrack RA. The costs of reproducibility. Neuron. 2019;101(1):11–4.

  31. Fecher B, Friesike S, Hebing M. What drives academic data sharing? PLoS ONE. 2015;10(2):e0118053.

  32. Levin N, Leonelli S, Weckowska D, Castle D, Dupré J. How do scientists define openness? Exploring the relationship between open science policies and research practice. Bull Sci Technol Soc. 2016;36(2):128–41.

  33. Murray F. The Oncomouse that roared: hybrid exchange strategies as a source of distinction at the boundary of overlapping institutions. Am J Sociol. 2010;116(2):341–88.

  34. Gross K, Bergstrom CT. Contest models highlight inherent inefficiencies of scientific funding competitions. PLoS Biol. 2019;17(1):e3000065.

  35. Herbert DL, Barnett AG, Clarke P, Graves N. On the time spent preparing grant proposals: an observational study of Australian researchers. BMJ Open. 2013;3(5):e002800.

  36. von Hippel T, von Hippel C. To apply or not to apply: a survey analysis of grant writing costs and benefits. PLoS ONE. 2015;10(3):e0118494.

  37. Mayo N. Is paid research time a vanishing privilege for modern academics? [Internet]. Times Higher Education (THE). 2019 [cited 2020 Aug 2]. Available from: https://www.timeshighereducation.com/features/paid-research-time-vanishing-privilege-modern-academics.

  38. Markowetz F. Five selfish reasons to work reproducibly. Genome Biol. 2015;16(1). https://doi.org/10.1186/s13059-015-0850-7.

  39. Wagenmakers E-J, Dutilh G. Seven Selfish Reasons for Preregistration. APS Obs [Internet]. 2016 Oct 31 [cited 2020 Feb 5];29(9). Available from: https://www.psychologicalscience.org/observer/seven-selfish-reasons-for-preregistration.

  40. Hammarfelt B. Recognition and reward in the academy: valuing publication oeuvres in biomedicine, economics and history. Aslib J Inf Manag. 2017;69(5):607–23.

  41. Chambers C, Schönbrodt F. Recognising Open Research Practices in Hiring Policies: Modular Certification Initiative Modular Certification Initiative [Internet]. 2017. Available from: https://osf.io/qb7zm/?revision=5012.

  42. Schönbrodt F, Mellor DT, Bergmann C, Penfold N, Westwood S, Lautarescu A, et al. Academic job offers that mentioned open science. 2018 Jan 18 [cited 2021 Nov 30]; Available from: https://osf.io/7jbnt/.

  43. Open Science Framework. Badges to Acknowledge Open Practices [Internet]. OSF; 2013 [cited 2021 Nov 30]. Available from: https://osf.io/tvyxz/.

  44. Colavizza G, Hrynaszkiewicz I, Staden I, Whitaker K, McGillivray B. The citation advantage of linking publications to research data. Wicherts JM, editor. PLoS ONE. 2020;15(4):e0230416.

  45. Tennant JP, Waldner F, Jacques DC, Masuzzo P, Collister LB, Hartgerink ChrisHJ. The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research [Internet]. 2016 Sep 21 [cited 2020 Aug 3];5. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4837983/.

  46. Boland MR, Karczewski KJ, Tatonetti NP. Ten simple rules to enable multi-site collaborations through data sharing. PLOS Comput Biol. 2017;13(1):e1005278.

  47. Lowndes JSS, Best BD, Scarborough C, Afflerbach JC, Frazier MR, O’Hara CC, et al. Our path to better science in less time using open data science tools. Nat Ecol Evol. 2017;1(6):1–7.

  48. Piwowar HA, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, et al. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ [Internet]. 2018 Feb 13 [cited 2020 Aug 3];6. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5815332/.

  49. Cousijn H, Kenall A, Ganley E, Harrison M, Kernohan D, Lemberger T, et al. A data citation roadmap for scientific publishers. Sci Data. 2018;5(1):180259.

  50. Quintana DS. A synthetic dataset primer for the biobehavioural sciences to promote reproducibility and hypothesis generation. In: Zaidi M, Büchel C, Bishop DVM, editors. eLife. 2020;9:e53275.

  51. Schmidt B, Ross-Hellauer T, van Edig X, Moylan EC. Ten considerations for open peer review. F1000Research [Internet]. 2018 Jun 29 [cited 2020 Aug 4];7. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6073088/.

  52. Johansson MA, Reich NG, Meyers LA, Lipsitch M. Preprints: An underutilized mechanism to accelerate outbreak science. PLoS Med [Internet]. 2018 Apr 3 [cited 2020 Aug 4];15(4). Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5882117/.

  53. Fraser N, Momeni F, Mayr P, Peters I. The effect of bioRxiv preprints on citations and altmetrics. Sci Commun Educ. 2019. https://doi.org/10.1101/673665.

  54. Fu DY, Hughey JJ. Releasing a preprint is associated with more attention and citations for the peer-reviewed article. In: Rodgers P, Amaral O, editors. eLife. 2019;8:e52646.

  55. Learn JR. What bioRxiv’s first 30,000 preprints reveal about biologists. Nature [Internet]. 2019 Jan 22 [cited 2020 Aug 17]; Available from: https://www.nature.com/articles/d41586-019-00199-6.

  56. Stewart SLK, Rinke EM, McGarrigle R, Lynott D, Lautarescu A, Galizzi MM, et al. Pre-registration and registered reports: a primer from UKRN. 2019;5.

  57. Hobson H. Registered reports are an ally to early career researchers. Nat Hum Behav. 2019;3(10):1010.

  58. Nosek BA, Lakens D. Registered reports: a method to increase the credibility of published results. Soc Psychol. 2014;45(3):137–41.

  59. Chambers C. What’s next for registered reports? Nature. 2019;573(7773):187–9.

  60. Hummer LT, Singleton Thorn F, Nosek BA, Errington TM. Evaluating Registered Reports: A Naturalistic Comparative Study of Article Impact. 2017 Dec 4 [cited 2020 Aug 4]; Available from: https://osf.io/5y8w7.

  61. van Dijk D, Manor O, Carey LB. Publication metrics and success on the academic job market. Curr Biol. 2014;24(11):R516–7.

  62. Walker RL, Sykes L, Hemmelgarn BR, Quan H. Authors’ opinions on publication in relation to annual performance assessment. BMC Med Educ. 2010;10(1):21.

  63. Faulkes Z. Resolving authorship disputes by mediation and arbitration. Res Integr Peer Rev. 2018;3(1):12.

  64. Grove J. What can be done to resolve academic authorship disputes? [Internet]. Times Higher Education (THE). 2020 [cited 2020 Aug 4]. Available from: https://www.timeshighereducation.com/features/what-can-be-done-resolve-academic-authorship-disputes.

  65. Wager E, Fiack S, Graf C, Robinson A, Rowlands I. Science journal editors’ views on publication ethics: results of an international survey. J Med Ethics. 2009;35(6):348–53.

  66. Henriques R. Lab leaders must create open and safe spaces to improve research culture | Wellcome [Internet]. 2020 [cited 2020 Aug 18]. Available from: https://wellcome.ac.uk/news/lab-leaders-must-create-open-and-safe-spaces-improve-research-culture.

  67. Leiserson CE, McVinney C. Lifelong learning: science professors need leadership training. Nat News. 2015;523(7560):279.

  68. Noorden RV. Some hard numbers on science’s leadership problems. Nature. 2018;557(7705):294–6.

  69. Marschke G, Nunez A, Weinberg BA, Yu H. Last place? The intersection of ethnicity, gender, and race in biomedical. AEA Pap Proc Am Econ Assoc. 2018;108(5):222–7.

  70. Street JM, Rogers WA, Israel M, Braunack-Mayer AJ. Credit where credit is due? Regulation, research integrity and the attribution of authorship in the health sciences. Soc Sci Med 1982. 2010;70(9):1458–65.

  71. Allen L, O’Connell A, Kiermer V. How can we ensure visibility and diversity in research contributions? How the Contributor Role Taxonomy (CRediT) is helping the shift from authorship to contributorship. Learn Publ. 2019;32(1):71–4.

  72. Borenstein J, Shamoo AE. Rethinking authorship in the era of collaborative research. Account Res. 2015;22(5):267–83.

  73. Brand A, Allen L, Altman M, Hlava M, Scott J. Beyond authorship: attribution, contribution, collaboration, and credit. Learn Publ. 2015;1:28.

  74. Gaeta TJ. Authorship: “Law” and order. Acad Emerg Med. 1999;6(4):297–301.

  75. International Committee of Medical Journal Editors. Defining the Role of Authors and Contributors [Internet]. 2020 [cited 2020 Aug 4]. Available from: http://www.icmje.org/recommendations/browse/roles-and-responsibilities/defining-the-role-of-authors-and-contributors.html.

  76. Holcombe AO. Contributorship, not authorship: use CRediT to indicate who did what. Publications. 2019;7(3):48.

  77. Holcombe AO. Farewell authors, hello contributors. Nature. 2019;571(7764):147–147.

  78. CRediT – Contributor Roles Taxonomy [Internet]. [cited 2021 Nov 30]. Available from: https://credit.niso.org/.

  79. Holcombe AO, Kovacs M, Aust F, Aczel B. Documenting contributions to scholarly articles using CRediT and tenzing. PLoS ONE. 2020;15(12):e0244611.

  80. Holcombe AO, Kovacs M, Aust F, Aczel B. Tenzing: documenting contributorship using CRediT. 2020 Jul 13 [cited 2020 Aug 4]; Available from: https://osf.io/preprints/metaarxiv/b6ywe/.

  81. Couture JL, Blake RE, McDonald G, Ward CL. A funder-imposed data publication requirement seldom inspired data sharing. In: Wicherts JM, editor. PLoS ONE. 2018;13(7):e0199789.

  82. Digital Curation Centre. Overview of funders’ data policies | DCC [Internet]. 2020 [cited 2020 Aug 18]. Available from: https://www.dcc.ac.uk/guidance/policy/overview-funders-data-policies.

  83. Alsheikh-Ali AA, Qureshi W, Al-Mallah MH, Ioannidis JPA. Public availability of published research data in high-impact journals. PLoS ONE. 2011;6(9):e24357.

  84. Chan A-W, Song F, Vickers A, Jefferson T, Dickersin K, Gøtzsche PC, et al. Increasing value and reducing waste: addressing inaccessible research. Lancet Lond Engl. 2014;383(9913):257–66.

  85. Godlee F, Groves T. The new BMJ policy on sharing data from drug and device trials. BMJ [Internet]. 2012 Nov 20 [cited 2020 Aug 4];345. Available from: https://www.bmj.com/content/345/bmj.e7888.

  86. Harvard Dataverse [Internet]. [cited 2021 Dec 18]. Available from: https://dataverse.harvard.edu/.

  87. Scientific Data [Internet]. Nature. [cited 2021 Dec 18]. Available from: https://www.nature.com/sdata/.

  88. Welcome to DataCite [Internet]. [cited 2021 Dec 18]. Available from: https://datacite.org/.

  89. Figshare—credit for all your research [Internet]. [cited 2021 Dec 18]. Available from: https://figshare.com/.

  90. The Dataverse Project—Dataverse.org [Internet]. [cited 2021 Dec 18]. Available from: https://dataverse.org/home.

  91. The Dryad Digital Repository [Internet]. [cited 2021 Dec 18]. Available from: https://datadryad.org/stash/our_mission.

  92. Neurodata Without Borders—The Kavli Foundation [Internet]. [cited 2021 Dec 18]. Available from: https://www.nwb.org/.

  93. Popkin G. Data sharing and how it can benefit your scientific career. Nature. 2019;569(7756):445–7.

  94. Royle S. Same Time Next Year: crunching PubMed data [Internet]. quantixed. 2020 [cited 2020 Aug 17]. Available from: https://quantixed.org/2020/05/08/same-time-next-year-crunching-pubmed-data/.

  95. Van Noorden R. Open access: the true cost of science publishing. Nat News. 2013;495(7442):426.

  96. ASAPbio. Funder policies [Internet]. ASAPbio. 2020 [cited 2020 Aug 18]. Available from: https://asapbio.org/funder-policies.

  97. Sherpa Romeo. Welcome to Sherpa Romeo - v2.sherpa [Internet]. 2020 [cited 2020 Aug 18]. Available from: https://v2.sherpa.ac.uk/romeo/.

  98. ASAPbio. Preprints are valid research outputs for REF2021 [Internet]. 2019 [cited 2020 Aug 4]. Available from: https://asapbio.org/preprints-valid-for-ref2021.

  99. Transpose. Transpose: A database of journal policies on peer review, co-reviewing, and preprinting [Internet]. 2020 [cited 2020 Aug 18]. Available from: https://transpose-publishing.github.io/#/.

  100. Department for Business, Energy & Industrial Strategy. UK Research and Development Roadmap [Internet]. 2020 [cited 2020 Aug 4]. Available from: https://www.gov.uk/government/publications/uk-research-and-development-roadmap/uk-research-and-development-roadmap.

  101. Center for Open Science. Center for Open Science issues 29 grants to develop open tools and services to support scientific research [Internet]. [cited 2021 Dec 18]. Available from: https://www.cos.io/about/news/center-open-science-issues-29-grants-develop-open-tools-and-services-support-scientific-research.

  102. Preregistration Challenge: Plan, Test, Discover. 2015 Apr 20 [cited 2021 Dec 18]; Available from: https://osf.io/x5w7h/.

  103. Open Science (OS) Fund 2020/2021 | NWO [Internet]. [cited 2021 Dec 18]. Available from: https://www.nwo.nl/en/calls/open-science-os-fund-2020/2021.

  104. Award – Einstein Foundation Berlin [Internet]. [cited 2022 Feb 25]. Available from: https://www.einsteinfoundation.de/en/award/.

  105. Fostering Responsible Research Practices - ZonMw [Internet]. [cited 2021 Dec 18]. Available from: https://www.zonmw.nl/en/research-and-results/fundamental-research/programmas/programme-detail/fostering-responsible-research-practices/.

  106. Horizon Europe [Internet]. European Commission - European Commission. [cited 2021 Dec 18]. Available from: https://ec.europa.eu/info/research-and-innovation/funding/funding-opportunities/funding-programmes-and-open-calls/horizon-europe_en.

  107. Open Science Award | Organization for Human Brain Mapping [Internet]. [cited 2021 Dec 18]. Available from: https://www.humanbrainmapping.org/i4a/pages/index.cfm?pageid=3962.

  108. Credibility in neuroscience to be championed through new BNA prize | News | The British Neuroscience Association [Internet]. [cited 2021 Dec 18]. Available from: https://www.bna.org.uk/mediacentre/news/credibility-in-neuroscience-to-be-championed-through-new-bna-prize/.

  109. Leamer-Rosenthal Prizes Nomination Process [Internet]. Berkeley Initiative for Transparency in the Social Sciences. 2015 [cited 2021 Dec 18]. Available from: https://www.bitss.org/lr-prizes/leamer-rosenthal-prizes-nomination-process/.

  110. Mozilla. Seeking Projects at the Intersection of Openness and Science [Internet]. Read, Write, Participate. 2019 [cited 2021 Dec 18]. Available from: https://medium.com/read-write-participate/seeking-projects-at-the-intersection-of-openness-and-science-3f2dd5a1fa00.

  111. Funding Opportunities [Internet]. National Institutes of Health (NIH). [cited 2021 Dec 18]. Available from: https://www.nih.gov/research-training/rigor-reproducibility/funding-opportunities.

  112. Ethical and Responsible Research (ER2)|Beta site for NSF - National Science Foundation [Internet]. [cited 2021 Dec 18]. Available from: https://beta.nsf.gov/funding/opportunities/ethical-and-responsible-research-er2.

  113. Berlin Institute of Health (BIH) at Charité. NULL and Replication - BIH at Charité [Internet]. [cited 2021 Dec 18]. Available from: https://www.bihealth.org/en/translation/innovation-enabler/quest-center/calls-and-awards/quest-calls-and-awards/null-and-replication.

  114. Shuttleworth Foundation—Applications [Internet]. The Shuttleworth Foundation. [cited 2021 Dec 18]. Available from: https://shuttleworthfoundation.org/apply/.

  115. Open access funding and reporting [Internet]. [cited 2021 Dec 18]. Available from: https://www.ukri.org/manage-your-award/publishing-your-research-findings/open-access-funding-and-reporting/.

  116. Research Enrichment – Public Engagement | Grant Funding [Internet]. Wellcome. [cited 2021 Dec 18]. Available from: https://wellcome.org/grant-funding/schemes/research-enrichment-public-engagement.

  117. Wellcome Data Prizes [Internet]. Wellcome. [cited 2021 Dec 18]. Available from: https://wellcome.org/what-we-do/our-work/data-science-and-health-trustworthy-data-science/wellcome-data-prizes.

  118. Munafò M, Chambers C, Collins A, Fortunato L, Macleod M. Research culture and reproducibility. Trends Cogn Sci. 2020;24(2):91–3.


Acknowledgements

The authors would like to thank those who provided helpful feedback on earlier versions of this manuscript, including Marion Criaud and Sheut-Ling Lam.

Funding

OSK is supported by the National Institute for Health Research (NIHR) Biomedical Research Centre at South London and Maudsley NHS Foundation Trust and King’s College London. AL is supported by the UK Medical Research Council (MR/N013700) and King’s College London member of the MRC Doctoral Training Partnership in Biomedical Sciences. EB is supported by the Sophia Children’s Hospital Research Foundation (SSWO) Project #S18-68 and #S20-48. LD’A is supported by the Netherlands Organization for Scientific Research (NWO-ZonMW: 016.VICI.170.200). SJW is supported by the Action Medical Research (GN2426), Garfield Weston Foundation, National Institute for Health Research (NIHR) Biomedical Research Centre at South London and the Maudsley NHS Foundation Trust, and King’s College London. The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication. The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed equally to this paper, hence all authors share co-first authorship, with SJW as senior author since he led the project and writing. Authorship order was randomly allocated to all authors by SJW. Each author made substantial contributions to the conception of the work, has drafted the work, and substantively revised it. All authors have approved the submitted version (and any substantially modified version), have agreed both to be personally accountable for the author's own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. Conceptualization: OSK, AL, EB, LDA and SJW. Data Curation: LDA. Project Administration: SJW. Resources: AL and LDA. Supervision: SJW. Visualization: OSK, AL, EB, LDA and SJW. Writing—original Draft Preparation: OSK, AL, EB, LDA and SJW. Writing—review and editing: OSK, AL, EB, LDA and SJW. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Samuel J. Westwood.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

All authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1

Table S1. Examples of open science practices in university policies for hiring and promotion. Table S2. The CRediT Taxonomy of Roles (adapted from [71] in main text). Table S3. Prospective benefits of CRediT (adapted from [71] in main text). Table S4. List of useful online resources to track funding and journal policies regarding open access, preprints, and open data/materials.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


Cite this article

Kowalczyk, O.S., Lautarescu, A., Blok, E. et al. What senior academics can do to support reproducible and open research: a short, three-step guide. BMC Res Notes 15, 116 (2022). https://doi.org/10.1186/s13104-022-05999-0
