
The failure of success: four lessons learned in five years of research on research integrity and research assessments

Abstract

In the past 5 years, we captured the perspectives of a broad array of research stakeholders to better understand the impact that current approaches to success and research assessment may have on the integrity and the quality of research. Here, we translate our findings into four actions that are urgently needed to foster better research. First, we need to address core research structures to overcome systemic problems of the research enterprise; second, we must realign research assessments to value elements that advance and strengthen science; third, we need to remodel, diversify, and secure research careers; and finally, we need to unite and coordinate efforts for change.

Introduction

To succeed in science, researchers need to maximize the number of grants they receive, the number of articles they publish, and the bibliometric impact of their scientific output. These achievements are generally measured with narrow and decontextualized metrics that fail to capture creativity, innovation, openness, and quality [1]. Demands for high-competition, high-output, and high-impact research can also lead to poor quality research and research waste, inspire breaches of research integrity, and tax the wellbeing of researchers.

Main text

In the past 5 years, we captured the perspectives of a broad array of those with a stake in the production of scientific knowledge to better understand the impact that our current approach to success may have on the integrity and the quality of research. Our findings, based on a thorough literature analysis [2], stakeholder interviews and focus groups [3, 4], and a survey of researchers [1] (see Fig. 1), provide new insights that translate into four recommendations for overcoming the problems that currently plague research: restructuring the organization of research, realigning research assessments, remodeling research careers, and recognizing and coordinating efforts to move from reaction to action (Fig. 2). The full methods, results, and materials used in the different steps of our research are published in separate papers [1,2,3,4], and further material is available on the Open Science Framework [5].

Fig. 1 The three core methodologies used to gather our findings

Fig. 2 Four recommendations to help overcome the current problems of science

Restructure the organization of research

Our project began with an analysis of the literature on research integrity, where we found a disconnect between evidence and practice. While research integrity is most often shown to be undermined by issues within research systems, such as competition, pressure, and incentives, approaches to fostering integrity generally focus on researchers’ awareness and compliance rather than on the systemic problems of academia [2].

Without discrediting the value of integrity training, codes of conduct, whistleblowing protection, and oversight in building a solid culture of integrity among researchers, our findings show that the promotion of research integrity requires interventions that address systemic imperfections of the research enterprise, including, most importantly, the incentives and reward structures of research.

Realign research assessments

Echoing statements such as the San Francisco Declaration on Research Assessment (DORA) [6], the Leiden Manifesto for research metrics [7], The Metric Tide [8], and the Hong Kong Principles for Assessing Researchers [9], our findings show that research assessments must change in a way that values and fosters the integrity and the quality of research. We found, for example, that researchers believe that the indicators now being used to assess research careers do not align with the indicators that are important for advancing science [1]. This finding agrees with a broad body of research that demonstrates the inadequacy of current indicators for capturing social impact [10, 11], innovation [12], replicability [13], and quality [14].

Furthermore, an overemphasis on outputs, quantity, and ground-breaking results discourages high-quality research and overlooks the importance of negative results and the need for replication in research [4]. The increasing prominence of project-based research funding further deepens the problem by exacerbating pressures on individuals and by imposing a short-term mindset on research processes. Our findings provide empirical support for the strong momentum for change visible in ongoing efforts to encourage more responsible use of metrics, broader consideration of diverse research activities, and greater recognition of processes such as quality, openness, and transparency (many of these ongoing initiatives are described in [15, 16]). But realigning research assessments also requires interventions that go beyond changes in research institutions. Indeed, performance-based research funding and university rankings at national and international levels have a powerful influence on perceptions of success and excellence [17, 18], and realigning these high-level assessments with integrity and high-quality research is equally important for improving science.

Diversify and secure research careers

It is also important to consider the person behind the research. At the moment, only ten to twenty percent of PhD students will be able to secure a permanent position in academia, although most aspire to an academic career [19,20,21,22]. This is not a new problem: the issue has been raised for more than twenty years with very little change [23, 24]. One of the strengths of our project was the inclusion of former researchers who have found careers outside academia. In hearing their stories, we understood that leaving academia can leave a lasting wound and a strong feeling of failure [3]. Scarce employment opportunities also increase pressure and competition among early career researchers, isolating them, jeopardizing their mental health [25, 26], and requiring them to outpace their colleagues to survive in an academic career. To move ahead of colleagues, researchers need to focus on outputs and ignore processes that are not rewarded, even though many of these processes are essential for advancing science [4]. Highly selective research careers also limit diversity, not only in terms of the gender and ethnicity of those who succeed, but also in terms of their skills and career profiles. As a result, academic research environments are shaped by a uniform research culture that is highly resistant to change.

There is an urgent need to address research careers and employment insecurity. Research institutions and doctoral schools need to provide early career researchers with better opportunities to develop transferable skills and connect with non-academic sectors. But academic careers themselves would also benefit from greater differentiation, including diverse roles within academia where unique skills and profiles are acknowledged, incentivized, and rewarded, and where collaborative teams and diverse interpretations of success are valued [27].

Recognize disparate voices and coordinate actions

Our project involved a wide array of stakeholders, including policy makers, research funders, research institution leaders, editors and publishers, research integrity office members, early-, mid-, and late-career researchers, research students, laboratory technicians, and researchers who left academia. In hearing the voices of so many different stakeholders, we realized that perspectives on success, integrity, and misconduct differ between individuals, and that the problems and the actions needed are interpreted differently by different stakeholders. We also found that the responsibility for action is often passed from one actor to the next, creating a stagnant system characterized by blame, hopelessness, and inaction [3].

Despite this discouraging picture, the past few years have seen the emergence of working groups and networks of researchers eager to change and to move from discussion to action. With over 20,000 signatories, 2500 of which are organizations, the San Francisco Declaration on Research Assessment is a prime example of this mobilization, a movement that has captured the attention of major funders such as the Wellcome Trust in the UK, the Canadian federal Tri-Agency, and the Australian National Health and Medical Research Council, among many others. The dialogue is also increasingly diverse, merging the voices of different stakeholders who are willing to join forces to make research better [15]. But the voices of former researchers and early career scientists, whose perspectives may be very different from those of researchers who survived and succeeded in the current system, are often missed.

For broad, systemic changes to be operationalized, we need to understand the dynamics and the relationships at play in the current problems as experienced by all actors involved. We need to dig deeper into the spaces and responsibilities that link different actors and that build the foundations of our shared concepts of excellence and integrity. Broad expert groups, such as the European Commission Policy Platforms or expert groups created by scientific societies and academies, provide a venue where the opinions of different actors meet and influence those who make science policy. Ensuring that these platforms include the full diversity of voices is the next logical step toward a proactive dialogue.

From reaction to action

These four recommendations suggest that the very foundations of research systems need to be addressed. Although the task is daunting, we are confident that change is possible. Over the course of the 5 years of this project, much has happened in the field of research integrity, and especially in the area of research assessment. As we were conducting our research, new developments, assessment initiatives, position documents, and influential opinions on the topic emerged nearly every week, giving us great hope for the future. Nevertheless, our research suggests that these initiatives will only realize their full potential and change research culture if they generate broad and coordinated approaches to change. Recent actions in this direction are promising. Hints of global change can be found in the recent ‘Agreement on Reforming Research Assessment’ supported by the European Commission, Science Europe, and the European University Association [28], in the Global Research Council ‘Responsible Research Assessment – Call to Action’ [29], and in statements from wide-reaching multi-stakeholder programs such as the ‘G7 2021 Research Compact’ [30] and the ‘UNESCO Recommendation on Open Science’ [31]. Now is the time to shift from discussing what needs to change to enacting change.

Limitations

The results from the focus groups, interviews, and survey reported in this short Research Note came principally from stakeholders involved in Flemish (Belgian) biomedical research. For this reason, some of the results may be specific to Flemish research or to the biomedical sciences and may not apply to other settings. However, the high compatibility of our findings with current research and policy efforts (see for example [8, 32,33,34,35,36,37,38]) suggests that other settings and disciplines share a perspective on the problems and changes needed similar to that of the participants in our research. Additional and more detailed limitations of the different empirical steps reported in this Research Note are available in the respective papers in which the full findings are reported [1,2,3,4].

Availability of data and materials

The data that support the findings from the survey study are available in the associated paper [1]. The data that support the results from the interview and focus group studies are not publicly available due to the risk of identification of participants, but extensive quotes are available in the papers describing the findings [3, 4]. The survey and the interview and focus group guides used to gather the data are available in their associated papers [1, 3, 4], and additional material, such as consent forms and participant information sheets, is available on the Open Science Framework [5].

Abbreviations

DORA: San Francisco Declaration on Research Assessment

References

  1. Aubert Bonn N, Pinxten W. Advancing science or advancing careers? Researchers’ opinions on success indicators. PLoS ONE. 2021;16(2):e0243664. https://doi.org/10.1371/journal.pone.0243664


  2. Aubert Bonn N, Pinxten W. A decade of empirical research on research integrity: what have we (not) looked at? J Empir Res Hum Res Ethics. 2019;14(4):338–52. https://doi.org/10.1177/1556264619858534


  3. Aubert Bonn N, Pinxten W. Rethinking success, integrity, and culture in research (part 2)—a multi-actor qualitative study on problems of science. Res Integr Peer Rev. 2021;6(1):3. https://doi.org/10.1186/s41073-020-00105-z


  4. Aubert Bonn N, Pinxten W. Rethinking success, integrity, and culture in research (part 1)—a multi-actor qualitative study on success in science. Res Integr Peer Rev. 2021;6(1):1. https://doi.org/10.1186/s41073-020-00104-0


  5. Rethinking success, integrity, and culture in science (Re-SInC). Open Science Framework. https://osf.io/ap4kn/.

  6. American Society for Cell Biology. San Francisco Declaration on Research Assessment. 2013. https://sfdora.org/read/.

  7. Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I. The Leiden Manifesto for research metrics. Nature. 2015;520:429–31. https://doi.org/10.1038/520429a.


  8. Wilsdon J, Allen L, Belfiore E, Campbell P, Curry S, Hill S, et al. The metric tide: report of the independent review of the role of metrics in research assessment and management. HEFCE; 2015. https://re.ukri.org/documents/hefce-documents/metric-tide-2015-pdf/.

  9. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong principles for assessing researchers: fostering research integrity. PloS Biol. 2020;18(7):e3000737. https://doi.org/10.1371/journal.pbio.3000737


  10. Alperin JP, Muñoz Nieves C, Schimanski LA, Fischman GE, Niles MT, McKiernan EC. How significant are the public dimensions of faculty work in review, promotion and tenure documents? Elife. 2019;8:e42254. https://doi.org/10.7554/eLife.42254


  11. Lebel J, McLean R. A better measure of research from the global south. Nature. 2018;559:23–6. https://doi.org/10.1038/d41586-018-05581-4


  12. Schmidt R. The benefits of statistical noise. Behavioral Scientist. 2020. https://behavioralscientist.org/the-benefits-of-statistical-noise/.

  13. Serra-Garcia M, Gneezy U. Nonreplicable publications are cited more than replicable ones. Sci Adv. 2021. https://doi.org/10.1126/sciadv.abd1705.


  14. Brembs B, Button K, Munafò M. Deep impact: unintended consequences of journal rank. Front Hum Neurosci. 2013;7:291. https://doi.org/10.3389/fnhum.2013.00291


  15. Aubert Bonn N, Bouter L. Research assessments should recognize responsible research practices—narrative review of a lively debate and promising developments. MetaArXiv. 2021. https://doi.org/10.31222/osf.io/82rmj.


  16. Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, Goodman SN. Assessing scientists for hiring, promotion, and tenure. PloS Biol. 2018;16(3):e2004089. https://doi.org/10.1371/journal.pbio.2004089


  17. Gadd E. University rankings need a rethink. Nature. 2020;587(7835):523. https://doi.org/10.1038/d41586-020-03312-2


  18. de Rijcke S, Wouters PF, Rushforth AD, Franssen TP, Hammarfelt B. Evaluation practices and effects of indicator use—a literature review. Res Evaluation. 2015;25(2):161–9. https://doi.org/10.1093/reseval/rvv038


  19. Debacker N, Vandevelde K. From PhD to professor in Flanders. ECOOM Brief (no. 11). 2016. https://biblio.ugent.be/publication/8043010.

  20. Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci. 2014;111(16):5773. https://doi.org/10.1073/pnas.1404402111


  21. Sauermann H, Roach M. Science PhD career preferences: levels, changes, and advisor encouragement. PLoS ONE. 2012;7(5):e36307. https://doi.org/10.1371/journal.pone.0036307


  22. van der Weijden I, Teelken C, de Boer M, Drost M. Career satisfaction of postdoctoral researchers in relation to their expectations for the future. High Educ. 2016;72(1):25–40. https://doi.org/10.1007/s10734-015-9936-0


  23. Marincola E, Solomon F. The career structure in biomedical research: implications for training and trainees. The American Society for Cell Biology survey on the state of the profession. Mol Biol Cell. 1998;9(11):3003–6. https://doi.org/10.1091/mbc.9.11.3003


  24. Alberts B. Are our universities producing too many PhDs? Trends Cell Biol. 1999;9(12):M73–5. https://doi.org/10.1016/S0962-8924(99)01686-4


  25. Levecque K, Anseel F, De Beuckelaer A, Van der Heyden J, Gisle L. Work organization and mental health problems in PhD students. Res Policy. 2017;46(4):868–79. https://doi.org/10.1016/j.respol.2017.02.008


  26. Woolston C. Graduate survey: a love–hurt relationship. Nature. 2017;550(7677):549–52. https://doi.org/10.1038/nj7677-549a


  27. Alberts B, Kirschner MW, Tilghman S, Varmus H. Opinion: Addressing systemic problems in the biomedical research enterprise. Proc Natl Acad Sci. 2015;112(7):1912. https://doi.org/10.1073/pnas.1500969112


  28. European University Association, Science Europe, European Commission, Stroobants K. Agreement on reforming research assessment. 2022. https://research-and-innovation.ec.europa.eu/system/files/2022-07/rra-agreement-2022.pdf.

  29. Global Research Council. Responsible research assessment—Call to Action. 2021. https://globalresearchcouncil.org/fileadmin//documents/GRC_Publications/RRA_Call_to_Action/RRA_Call_to_Action_English.pdf.

  30. G7 2021 Research Compact. 2021. https://www.gov.uk/government/publications/g7-2021-research-compact/g7-2021-research-compact.

  31. UNESCO. UNESCO recommendation on open science. Paris; 2021. PCB-SPP/2021/OS/UROS. https://en.unesco.org/science-sustainable-future/open-science/recommendation.

  32. Fraser C, Nienaltowski M-H, Goff KP, Firth C, Sharman B, Bright M, et al. Responsible research assessment—Global Research Council (GRC) conference report 2021. https://www.globalresearchcouncil.org/fileadmin/documents/GRC_Publications/GRC_RRA_Conference_Summary_Report.pdf.

  33. Directorate-General for Research and Innovation (European Commission). Towards a reform of the research assessment system. 2021. https://op.europa.eu/s/vhB7.

  34. Science Europe. Position statement and recommendations on research assessment processes. 2020. https://doi.org/10.5281/zenodo.4916155.

  35. European University Association. EUA roadmap on research assessment in the transition to open science. 2018. https://eua.eu/downloads/publications/eua-roadmap-on-research-assessment-in-the-transition-to-open-science_v20-08-2019.pdf.

  36. Saenen B, Borell-Damián L. EUA briefing—reflections on university research assessment: key concepts, issues and actors. 2019. https://eua.eu/resources/publications/825:reflections-on-university-research-assessment-key-concepts,-issues-and-actors.html.

  37. ISE Task Force on Researchers’ Careers. Position on precarity of academic careers. Initiative for Science in Europe; 2020. https://initiative-se.eu/wp-content/uploads/2021/02/Research-Precarity-ISE-position.pdf.

  38. Latin American Forum for Research Assessment (FOLEC). Towards a transformation of scientific research assessment in Latin America and the Caribbean: evaluating scientific research assessment. Latin American Council of Social Sciences (CLACSO); 2020. https://www.clacso.org/en/una-nueva-evaluacion-academica-para-una-ciencia-con-relevancia-social/.


Acknowledgements

The authors wish to thank all those who participated in the interviews, focus groups, and survey as well as those who contributed to each of these empirical projects and enabled us to build these recommendations. Individual contributors are named in individual articles.

Funding

The different steps leading to the results reported here were funded by Bijzonder Onderzoeksfonds (BOF) through UHasselt University Grant 15NI05 (WP, NAB).

Author information

Contributions

Conceptualization: NAB, RDV, WP; Funding acquisition: WP; Project administration: NAB, WP; Supervision: WP; Writing—original draft: NAB; Writing—review & editing: NAB, RDV, WP. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Noémie Aubert Bonn.

Ethics declarations

Ethics approval and consent to participate

The interviews and focus groups, as well as the survey, were approved by the Medical Ethics Committee of the Faculty of Medicine and Life Sciences of Hasselt University (protocol number CME2016/679 for the interview and focus group study, and protocol number CME2019/035 for the survey study). All participants in the interviews and focus groups provided written consent for participation.

Consent for publication

Not applicable.

Competing interests

The authors have no formal competing interests with the information contained in this article. NAB currently works on several projects concerning research assessment through her employment as a policy advisor at Research England (UKRI) and as a postdoctoral researcher at Amsterdam University Medical Centers, where numerous research integrity activities and projects are taking place, but she did not identify a conflicting interest between this short Research Note and her employment. NAB and WP continue to work on the topic of research assessment at Hasselt University and therefore have an intellectual interest in the topic and in new developments in academia, but they have no formal conflicting interests related to this short Research Note.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Aubert Bonn, N., De Vries, R.G. & Pinxten, W. The failure of success: four lessons learned in five years of research on research integrity and research assessments. BMC Res Notes 15, 309 (2022). https://doi.org/10.1186/s13104-022-06191-0

