
Survey participation among general practitioners: comparison between teaching physicians and a random sample



Health scientists strive for smooth recruitment of physicians for research projects such as surveys. Teaching physicians are an easy-to-approach population that is already affiliated with a university through teaching students in their practices. How do response rates compare between a convenient online survey among teaching physicians and an elaborate postal survey in a random sample of unknown physicians? Data from the TMI-GP study on the use of memory tests in general practice were used.


Physicians in the random sample responded to the postal survey more often than teaching physicians responded to the online survey (59.5% vs. 18.9%; odds ratio 7.06; 95% confidence interval 4.81–10.37; p < 0.001). Although it is unclear whether the sample, the survey mode (online vs. postal) or both account for this effect, it is noteworthy that even in such a convenience sample of known/committed physicians, an adequate response rate could not be reached without a tailored and elaborate survey technique. Responders in the two samples were comparable regarding a content-related item (use of memory tests; Χ2 (df = 1) = 3.07; p = 0.080).


Surveys are an important research method in the health sciences, but poor participation of physicians is an international problem [1,2,3]. Concerning the survey mode, it has been reported that general practitioners (GPs) and other health care professionals prefer postal over online/email surveys, and mixed-method approaches are recommended [4].

All over the world, practicing GPs teach medical students in the discipline of general practice/family medicine in their offices [5,6,7]. This teaching activity is usually rewarded with a small monetary incentive (an estimated 100–200 Euro per week per student in Germany). GP trainers (also called teaching physicians) are in regular contact with a university and could therefore perhaps be recruited for research projects with less effort. Especially in the absence of robust GP-based research networks, university institutes resort to their own teaching GPs for research purposes. The advantages of recruiting teaching GPs for research include the convenience of established contact routines (familiar modes and up-to-date addresses), a personal connection and commitment of GPs to the university, and no need to approach new GPs via address lists from agencies or authorities. Overall, the expectation is that recruiting teaching GPs for research will lead to high participation at little expense. Recruitment of GPs from random samples (without any link to a university and/or research) requires more time and economic resources to reach acceptable participation rates [8]. The advantages of random samples include more generalizable and less biased results. Consequently, it is important for researchers to know whether low-effort recruitment of teaching GPs and high-effort recruitment of random GPs yield different participation rates. We compared these two approaches in a GP survey with two separate samples: a sample of teaching GPs contacted conveniently by fax or email using an efficient online survey mode vs. a random sample approached with an elaborate postal strategy using paper–pencil questionnaires. Data come from a study on the use of tests for memory impairment in general practice (TMI-GP study), itself a complement to the SMI-GP study (subjective memory impairment in general practice) [9].
The TMI-GP survey has not yet been published; an English translation of the questionnaire is attached (see Additional file 1). For the TMI-GP study, the adjunct sample of teaching GPs was used to boost the sample size at lower cost and to obtain appropriate analysis conditions.

Main text


A gender-stratified random sample of 400 GPs was drawn from the database of the Association of Statutory Health Insurance Physicians in the North Rhine Region (KVNO). For the teaching GP sample, four German university institutes of general practice identified all of their teaching GPs in practice: n = 76 Bochum, n = 287 Düsseldorf, n = 140 Münster, n = 86 Rostock.


In the SMI-GP and TMI-GP studies, a questionnaire on the use of memory tests in general practice was constructed as part of a sequential exploratory mixed methods design [10]. Question wording and scaling were in accordance with scientific standards [11] and additionally aided by cognitive interviews [12]. The final questionnaire comprised 11 topics and 68 items (four pages in its paper version).


Both surveys were anonymous, dispatched in early December 2019 and designed according to best practices for invitation, layout, item construction, questionnaire design and reminder strategy [13, 14]. The invitation letter/fax/email was personalised (e.g., individual salutation, photo of the researcher). A reminder letter/fax/email was sent in late January 2020, accompanied by a copy of/link to the (paper/online) questionnaire.

Differences between postal and online survey

Dispatch of the postal survey for the random sample GPs was done with coloured envelopes, high-quality paper, handwritten addressing and stamps. A non-monetary unconditional incentive (pen with a “Many thanks!” print) and an addressed return envelope with stamps were enclosed. Dispatch of the online survey for the teaching GPs was done by employees of the four institutes on behalf of the investigators (“We would like to invite you to a survey of our partner institute of general practice in Düsseldorf […] The survey is conducted by student Flora-Marie Hegerath and accompanied by PD Dr. Michael Pentzek.”). The mode of recruitment was chosen by the institutes according to the way each GP is usually approached by the institute for other purposes (by e-mail 73.5%, by fax 26.5%). No incentivisation was offered. The links in emails and faxes led to the online questionnaire, which was realised on the online survey platform SoSci Survey [15].


A returned questionnaire with ≥ 90% item response was considered participation. The multivariable influence of sample (teaching/random), GP gender and their interaction on response (response/non-response) was estimated by binary logistic regression. A content-related comparison of responders from both samples was made on a central item (use of memory tests yes/no) using a chi-square test.
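To illustrate the kind of comparison described above, the following sketch computes a crude odds ratio with a Wald 95% confidence interval from a 2 × 2 response table. The counts are illustrative approximations derived from the reported rates, not the study's exact data, and a crude OR of this kind will not reproduce the multivariable OR from the logistic regression, which adjusts for gender and an interaction term.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% confidence interval for a
    2x2 table: (a, b) = responders/non-responders in sample 1,
    (c, d) = responders/non-responders in sample 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts (NOT the study's exact data): roughly
# 238 of 400 random-sample GPs vs. 111 of 589 teaching GPs responding.
or_, lo, hi = odds_ratio_ci(a=238, b=162, c=111, d=478)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

With these hypothetical counts the crude OR comes out around 6.3, in the same direction as, but not identical to, the adjusted OR of 7.06 reported in the results.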


Table 1 shows the characteristics of both samples.

Table 1 Description of the two general practitioner samples

On one of the central survey items (use of memory tests yes/no), teaching sample GPs and random sample GPs did not differ from each other [Χ2 (df = 1) = 3.07; p = 0.080].
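For a chi-square statistic with one degree of freedom, the p-value can be recovered with the standard library alone, since P(X ≥ x) = erfc(√(x/2)). This sketch applies that identity to the statistic reported above:

```python
import math

def chi2_sf_df1(x):
    """Survival function (p-value) of the chi-square distribution
    with 1 degree of freedom: P(X >= x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2.0))

# The statistic reported in the text:
p = chi2_sf_df1(3.07)
print(f"p = {p:.3f}")
```

This reproduces the reported p = 0.080; as a sanity check, the familiar critical value 3.841 gives p ≈ 0.05.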

Teaching GPs responded to the online survey less often than the GPs in the random sample responded to the postal survey (see Fig. 1). In the logistic regression analysis, this effect of sample on response rate (odds ratio OR 7.06; 95% confidence interval CI 4.81–10.37; p < 0.001) was independent of the effects of gender (OR 0.78; 95% CI 0.50–1.21; p = 0.264) and the interaction term gender*sample (OR 0.88; 95% CI 0.48–1.61; p = 0.679).

Fig. 1

Response rates of male and female GPs in two samples. GP, general practitioner


A convenient, inexpensive and time-saving online survey strategy in a sample of university-affiliated teaching physicians falls far short of the response rate achieved by an elaborate postal survey strategy in a random physician sample. This finding, especially the extent of the difference in response rates (about 40 percentage points), is surprising and requires a discussion of possible causes.

In this study, we compared not only two different samples, but also two different survey strategies in their entirety: one convenient (known GPs, fax/email invitation, online survey tool, no incentives) and one elaborate (unknown GPs, postal dispatch/return with stamps, paper questionnaires, incentives). Thus, our finding can be attributed to the effect of the sample, the effect of the method, or a combination of both (the strategies as a whole).

A sample effect would have been expected in the other direction: compared with random GP samples, teaching GP samples reportedly yield significantly higher response rates [16]. One substantial reason is the existing affiliation with the research institute and researchers [17]. In our study, the latter effect may have been blunted: although the teaching GPs were recruited by their affiliated institutes, in three institutes this recruitment was done "on behalf" of another institute, which may have weakened the personal link between GP and researcher. Another possible reason for the lower response rate in the sample of teaching physicians reported here could be the existing workload from student teaching, which reduces the commitment to other university-related activities such as research. The importance of work-life balance and the missing incentive for survey participation (as opposed to the remuneration for teaching mentioned above) may have strengthened this effect.

The survey method may also have had an impact on response rates. It is well known that GPs and other health care professionals prefer postal over online/email surveys [13, 18]. The exact reasons for this have not yet been sufficiently investigated [19], which is why the following factors are hypotheses based on unsystematic feedback and personal research experience. Access problems or data protection rules may prevent GPs from accessing the online questionnaire from their office computer. A general aversion to online surveys may have had an impact (e.g., caused by too many requests or by experiences with disreputable providers). Invitation by email or fax in the teaching GP sample (vs. letter in the random sample) may have attenuated the response rate: manually transferring the link from a fax could have been a barrier; emails to the practice address may not be read regularly and/or may be read by assistants rather than the GP him/herself; and faxes and emails may not attract much attention because they are no different from the many other faxes and emails that reach the practice every day. The possibilities of personalisation, appreciation, attracting attention, incentivisation etc. can be implemented better and more tangibly in postal surveys (especially with regard to materials such as high-quality paper, stamps, coloured envelopes, handwritten addressing, and gifts like pens). These features may attract more attention than a fax or an email with few distinctive attributes.


Method beats sample: even in a convenience sample of known/committed physicians, adequate response rates could not be reached without a tailored and elaborate survey technique.


In this study, we did not apply a factorial experimental design with systematically varied factors. As we compared two complex strategies as wholes, we cannot differentiate the impact of single survey features (e.g., sample, invitation, mode, incentive) on response rate. The open questions that remain stem from this limitation and from possible interactions between method and sample: Would the teaching GPs have shown higher response rates when approached with a postal survey? Would the random GPs have shown lower response rates when invited by email or fax? What exactly was the problem with the online survey in the teaching GP sample? What effect would other modes of invitation and survey (e.g., via SMS or messenger services) have had on participation? Would different forms of incentives (e.g., donations to social/environmental charities, vouchers, online money transfers) have enhanced the response rate in the online survey?

In concordance with earlier research [16], the samples of random and teaching GPs in the present study were comparable at the content level, which supports merging such samples and analysing them together. However, this was verified for only one item and in only the < 20% of teaching GPs who responded (possibly those with high interest in the survey topic). More complex analyses of GP sample comparability are important to consider in future research.

Research in general practice uses manifold qualitative, quantitative and mixed methods designs. Our results are limited to participation in a survey; no conclusions can be drawn about the participation of different GP samples in other study designs.

We did not conduct a non-response analysis. Possible differences between responders and non-responders in both samples would have yielded further important information. We also did not ask GPs for their explicit reasons for participation and non-participation.

Availability of data and materials

The dataset used and analysed during the current study is available from the corresponding author on reasonable request.



GP: General practitioner

TMI-GP: Study on tests for memory impairment in general practice

SMI-GP: Study on subjective memory impairment in general practice

SD: Standard deviation

OR: Odds ratio

CI: Confidence interval


1. Sahin D, Yaffe MJ, Sussman T, McCusker J. A mixed studies literature review of family physicians' participation in research. Fam Med. 2014;46:503–14.

2. Brtnikova M, Crane LA, Allison MA, Hurley LP, Beaty BL, Kempe A. A method for achieving high response rates in national surveys of US primary care physicians. PLoS ONE. 2018;13(8):e0202755.

3. Malik RA, Aldinc E, Chan SP, et al. Perceptions of painful diabetic peripheral neuropathy in South-East Asia: results from patient and physician surveys. Adv Ther. 2017;34:1426–37.

4. Sebo P, Maisonneuve H, Cerutti B, Fournier JP, Senn N, Haller DM. Rates, delays, and completeness of general practitioners' responses to a postal versus web-based survey: a randomized trial. J Med Internet Res. 2017;19(3):e83.

5. Michels NRM, Maagaard R, Buchanan J, Scherpbier N. Educational training requirements for general practice/family medicine specialty training: recommendations for trainees, trainers and training institutions. Educ Prim Care. 2018;29(6):322–6.

6. Council on Graduate Medical Education. Advancing primary care (twentieth report). 2010. Accessed 9 Dec 2021.

7. Society of Teachers of Family Medicine. 2021. Accessed 9 Dec 2021.

8. VanGeest JB, Johnson TP, Welch VL. Methodologies for improving response rates in surveys of physicians: a systematic review. Eval Health Prof. 2007;30:303–21.

9. Pentzek M, Leve V, Leucht V. Subjective memory impairment in general practice: short overview and design of a mixed methods study. Z Gerontol Geriatr. 2017;50(Suppl 2):48–54.

10. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 3rd ed. London: Sage Publications; 2018.

11. Gehlbach H, Artino AR Jr. The survey checklist (manifesto). Acad Med. 2018;93:360–6.

12. Willis GB. Cognitive interviewing: a tool for improving questionnaire design. Thousand Oaks: Sage Publications; 2004.

13. Pit SW, Vo T, Pyakurel S. The effectiveness of recruitment strategies on general practitioner's survey response rates—a systematic review. BMC Med Res Methodol. 2014.

14. Dillman DA, Smyth JD, Christian LM. Internet, phone, mail, and mixed-mode surveys: the tailored design method. 4th ed. Hoboken: Wiley; 2014.

15. Leiner DJ. SoSci Survey (version 3.1.06). 2019. Accessed 30 July 2021.

16. Viehmann A, Thielmann A, Gesenhues S, Weltermann BM. Do academic family practices reflect routine primary care? Z Allg Med. 2014;90:354–9.

17. Hummers-Pradier E, Scheidt-Nave C, Martin H, Heinemann S, Kochen MM, Himmel W. Simply no time? Barriers to GPs' participation in primary health care research. Fam Pract. 2008;25:105–12.

18. Meyer VM, Benjamens S, Moumni ME, Lange JFM, Pol RA. Global overview of response rates in patient and health care professional surveys in surgery: a systematic review. Ann Surg. 2020.

19. Klabunde CN, Willis GB, Casalino LP. Facilitators and barriers to survey participation by physicians: a call to action for researchers. Eval Health Prof. 2013;36:279–95.



We thank our colleagues from the following institutes for contacting their teaching GPs: Department of General Practice/Family Medicine, Faculty of Medicine, Ruhr University Bochum (RUB), Germany; Center for General Medicine, Faculty of Medicine, University of Münster, Germany; Institute of General Practice, University Medical Center Rostock, Germany.


Open Access funding enabled and organized by Projekt DEAL. The TMI-GP study was partly funded by the Research Committee of the Medical Faculty, Heinrich Heine University Düsseldorf (Grant No. 43/2015).

Author information




MP and VB designed the study; all authors constructed the questionnaire; all authors planned the data collection; VB and FMH conducted the survey; MP and FMH analysed the data; MP wrote the first draft. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Michael Pentzek.

Ethics declarations

Ethics approval and consent to participate

The TMI-GP study has received a positive ethical vote from the ethics committee of the Medical Faculty of Heinrich Heine University Düsseldorf [amendment from 18th Nov 2019 to ethical vote 4848 (SMI-GP)]. GPs were informed about the study in written form; participation was voluntary and anonymous. Consent was implied on the return of the completed survey.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

TMI-GP questionnaire (Tests for memory impairment in general practice). Items—English translation of German original.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.



Cite this article

Pentzek, M., Baumgart, V. & Hegerath, FM. Survey participation among general practitioners: comparison between teaching physicians and a random sample. BMC Res Notes 15, 9 (2022).



  • Recruitment
  • General practice
  • Research participation
  • Response rate
  • Postal survey
  • Online survey
  • Teaching physician