  • Research article
  • Open access

E-learning programs in oncology: a nationwide experience from 2005 to 2014

Abstract

Background

E-learning is an established concept in oncological education and training. However, there seems to be a scarcity of long-term assessments of E-learning programs in oncology vis-à-vis their structural management and didactic value. This study presents descriptive, nationwide data from 2005 to 2014. E-learning oncology programs in chemotherapy, general oncology, pain management, palliative care, psycho-social oncology, and radiotherapy were reviewed from our databases. Questionnaires on the self-perceived didactic value of the programs were examined for 2008–2014.

Results

The total number of trainees was 4693, representing 3889 individuals. The trainees included medical doctors (MDs; n = 759), registered nurses (RNs; n = 2359), radiation therapy technologists (RTTs; n = 642), and social and health care assistants (SHCAs; n = 933). The E-learning covered 29 different program classifications, comprising 731 recorded presentations and covering 438 themes. A total of 490 programs were completed by the trainees. The European Credit Transfer and Accumulation System (ECTS; 1 ECTS point equals 0.60 US College Credit Hours) points varied across the educational programs from 0.7 to 30.0, corresponding to a duration of full-time studies ranging from 15 to 900 h (0.4–24 weeks) per program. The total number of ECTS points for the trainee cohort was approximately 20,000, corresponding to roughly 530,000 full-time academic hours or 324.0 standard academic working years. The overall drop-out rate, across professions and programs, was 10.6% (499/4693). The lowest drop-out rate was seen for RNs (4.3%; P < 0.0001). Self-reported evaluation questionnaires (2008–2014) were completed by 72.1% (2642/3666) of the trainees. On a 5-category scale (5 = excellent; 1 = very inferior), the programs were overall rated as excellent by 68.6% (MDs: 64.9%; RNs: 66.8%; SHCAs: 77.7%) and as good by 30.6% (MDs: 34.5%; RNs: 32.4%; SHCAs: 21.5%) of the responders.

Conclusions

This descriptive study, spanning a 10-year period, presents high-volume data from multi-professional, oncological E-learning programs. While the E-learning paradigm appears to have been well received across professions, prospective studies benchmarking against traditional training methods are needed to examine the hypothesized didactic value of our E-programs.

Background

E-learning, also called computer-based learning, online learning or web-based learning, is a ubiquitously used technology in higher education [1–3]. E-learning comprises internet-based, interactive and asynchronous teaching and learning tools. A number of open universities, some even ‘mega-universities’ with more than 100,000 students, have adopted these as standard education techniques. The US National Center for Education Statistics, in its 2011 report, states that 30% of all students with bachelor’s degrees (n = 860,000) were enrolled in distance education courses, with 75% of these taking their entire postgraduate program online [4, 5]. Probably, more than 80% of U.S. doctoral/research institutions have some form of online offering, either courses or full programs [6].

A number of advantages of E-learning programs, vis-à-vis traditional education programs, have been hypothesized: uncoupling of education from time and place, standardization of instruction and assessment, ease of documentation of learner behavior, student control of the educational experience, and increased educational cost-effectiveness [7]. A recent meta-analysis [8] indicates that studies comparing internet-based learning with no intervention, using pre- and post-tests, demonstrated effectiveness in relation to the acquisition of knowledge (factual or conceptual understanding) and skills. However, in studies comparing internet-based learning with non-internet-based traditional methods, there only seems to be a small educational benefit from internet-based learning [8, 9]; a likely explanation is the large heterogeneity and variance in the data characterizing these studies.

Oncology is indeed a challenging subject matter not only for any trainee, but also for the educator [2]. The E-learning programs may overcome some of the difficulties seen with traditional learning programs by allowing flexibility in time, place, and pace, for the clinically working trainee and educator [10]. There is a paucity of detailed, descriptive long-term data regarding management and outcome of E-learning programs. This study presents outcome data from our E-learning programs in oncology: chemotherapy; general oncology; pain management; palliative care; psycho-social-oncology; and radiotherapy, covering a span of 10 years, and including various health-care professions: medical doctors; nurses; radiation therapy technologists; and social and health care assistants.

Methods

Ethics

The E-programs and the associated databases complied with Swedish laws and regulations stipulated in the Personal Data Act (1998:204), which aims to prevent the violation of personal integrity in the processing of personal data [11]. Since the study was retrospective and registry-based, an application to the ethical committee was not considered necessary.

Organization behind the E-learning programs

The organization behind the E-learning programs (E-programs) was established in 2002 by the authors (JD, SS), affiliated to the Department of Oncology, Clinical Sciences, Lund University. Initially, the E-programs were developed and run by local specialists: oncologists, radiation therapy technologists, medical physicists and anesthesiologists. After some years of benchmarking, both regional and national teaching resources were successively recruited into the E-programs. The supplier of the E-programs was the Department of Oncology, University Hospital of the Southern Region, in collaboration with the Medical Faculty at Lund University.

Commissioning parties

The primary commissioning parties were local (n = 17) or regional, university-based (n = 7) oncological departments in Sweden.

Financial aspects

The trainees’ tuition fees were reimbursed by their affiliated oncological departments, in turn recompensed by the respective County Council [Sweden comprises 21 County Councils (across six Health Care Regions) responsible for financing and providing health care, overseen by the Ministry of Health and Social Affairs]. In 7 out of 20 E-programs, co-financing with the Swedish Medical Association, the National Board of Health and Welfare, the Association of County Councils, and the National Agency for Higher Vocational Education was agreed upon.

Structure of the education

The structure included four hierarchical levels: Educational Fields, Educational Sub-Fields, Programs, and Modules (Fig. 1). The Educational Fields covered Chemotherapy, Oncology, Radiotherapy and Symptom Therapy. The Educational Sub-Fields were, for Oncology: Basic Oncology and Specialized Oncology; for Radiotherapy: Basic Radiotherapy and Image-Guided Radiotherapy; and for Symptom Therapy: Palliative Care, Pain Management and Psychosocial Oncology. The Educational Field Chemotherapy, however, did not accommodate any Educational Sub-Field (Fig. 1).

Fig. 1

Educational Fields, Educational Sub-Fields, Programs and number of Modules in each E-program. Programs are indicated by name and number of ECTS-points (European Credit Transfer and Accumulation System points or equivalents; 1 ECTS-point corresponds to 25–30 h of study, equivalent to one Swedish University College point or 0.60 US College Credit Hours). The total number of Modules in the Programs is 731. HCA health care assistants, HVE higher vocational education, IGRT image-guided radiotherapy, RT radiotherapy, SHCA social and health care assistants

The Programs included a categorization of topics based on the trainees’ profession, e.g., Basic Oncology for social and health care assistants, Radiotherapy for registered nurses, and Radiotherapy for medical doctors. The trainees’ professions and required proficiency levels across Educational Sub-Fields, and the duration of individual E-programs, are presented in Table 1. A detailed description of a representative E-program, Radiotherapy for medical residents in oncology, is illustrated in Table 2. Illustrations of E-programs from each of the four Educational Fields are presented in Table 3. Finally, Modules constitute the basic educational building blocks, usually aggregated into a number of themes (Table 2).

Table 1 The Educational Fields, trainees’ health care professions, trainees’ proficiency levels, Program duration and ECTS-points [European Credit Transfer and Accumulation System or equivalent scoring system (1 ECTS-point corresponds to 25–30 h of study)], start year of the Programs and number of Programs given (n = 490)
Table 2 A representative sample of an E-program: “Basic radiophysics and radiotherapy for residents in oncology”
Table 3 Various samples of E-programs from each Educational Field in regard to the trainees’ profession, scheduled Program duration, maximum allowed study time, the number of physical meetings and the duration of each meeting

Professions

The health care professions included were registered nurses (RNs), medical doctors (MDs), social and health care assistants (SHCAs; 7% of SHCAs belonged to administrative personnel), and radiation therapy technologists (RTTs). The trainees’ proficiencies were classified into basic, specialist-in-training or specialist levels (Table 1).

Recruitment

Three different recruitment paths were utilized. Most commonly, an educational contract stipulating the educational requirements was established between the trainee’s oncological department, mainly university-based departments, and the supplier of the E-program. Recruitment by advertisements in journals, flyers or websites for health care professionals was also utilized. A substantial number of trainees were also recruited by word-of-mouth and selected from waiting lists. Trainees were affiliated to anesthesiological, medical, oncological or surgical departments.

Tutors and producers

Professions

Tutors were senior ranking clinical oncologists and anesthesiologists, many with an academic affiliation. Producers of the educational material included specialists from various disciplines: advanced practice registered nurses, medical doctors, physicists and psychologists, almost all with senior clinical background in addition to an academic affiliation.

Recruitment

Tutors and producers were recruited nationwide from professional, academic networks [JD (palliative care; psychosocial oncology; radiotherapy), EK (oncology; radiotherapy), MW (pain management)].

Basic structure of the module

Design

The design of the Module was based on theory-derived principles of educational practice, following the recommendations on web-based learning by Cook and Dupras [12]. The goals and objectives of each Module were pre-specified and presented at an early stage to the trainee, the IT-platform was successively tailored to the needs of the trainees and the tutors, appropriate multimedia and hyperlinks were used, and an active learning approach was encouraged (self-assessment, reflection, self-directed learning, problem-based learning, learner interaction, and feedback). The IT-platform underwent substantial changes and improvements during 2005–2014, reflecting the general technical progress in the IT-field. To address the need for real-life, clinical problems, case-based training scenarios were incorporated into the Modules.

Educational material

Depending on their IT proficiency, the specialists produced lectures either by themselves or in collaboration with the IT-technical staff (SS, JD), usually in the format of multimedia presentations (PowerPoint with voiceover speech). Since 2006, a web-based “authors’ tool box” facilitating the production has been available for the import of images and audio files. Each Module is based on one or more presentations (Table 2), additional files (e.g., scientific literature, guidelines, images, animations), links to relevant websites, web-based tests intended to establish performance status (MCQs, short essays), and the trainees’ web-based self-evaluation of the training quality of the Module. In addition, conventional or online educational materials, such as textbook chapters, were sometimes employed by the tutors. From 2005 to 2011 the educational material was CD-based; thereafter, online log-in procedures were employed.

Educational objectives

Where a core curriculum, stipulated by national consensus and based on national guidelines, was available, it was applied for the respective Educational Field (e.g., Radiotherapy) and, correspondingly, for each hierarchical level, i.e., Educational Sub-Field, Program and Module.

E-based interactivity

Interactivity was considered essential in encouraging the trainee’s active learning process [12], including self-assessments, self-directed learning often based on problem-solving issues, and interaction with the tutor. The trainee’s activity on the technical platform was logged, and information on the latest logins and lectures viewed was available to the tutors. In order to augment self-assessment and self-directed learning, a procedure called self-evaluation was used in essay exercises. After submission of the essay, the trainee automatically received a complete essay report pre-fabricated by the tutor, and was then asked to re-submit the report after considering, at their own discretion, necessary changes from the original response. The tutor then reviewed the re-submission and gave individualized feedback on the essay report to the trainee. This simple formative self-evaluation measure seemed to increase the didactic value both for the trainee and the tutor, in addition to decreasing the workload of the tutor. Further, it also helped to pin down the learning objectives, making the trainee aware that the core goals had been achieved. Most importantly, it gave the trainee a learning opportunity and helped the tutor recognize didactic misunderstandings. E-based interactivity between fellow trainees was based on asynchronous fora.

Physical meetings

During the first years of the E-programs, a requirement for physical meetings, vis-à-vis virtual meetings, became apparent, and we empirically incorporated 2–3 compulsory meetings of 1½ days’ duration for every 4–6 months of participation in the E-program. Furthermore, the trainees were required to complete all modules and seminars in order to obtain approval of the course. The rationales behind these meetings were based on practical, didactic and social aspects. First, when the E-programs were initiated in 2003–2005, conventional learning methods were still considered “the gold standard”, at least across the targeted professions and age groups, and therefore a noticeable demand for physical meetings existed. Second, although the general acceptance of E-learning methods has increased dramatically during the last decades [13], it has been our experience that the physical gatherings are still justified since they stimulate the trainees’ active learning process. Third, needless to say, they consolidate social networking, which is important since trainees often live at a considerable distance, sometimes up to a thousand miles, from each other. Fourth, a number of studies seem to indicate that drop-out rates in E-programs decrease with the use of physical meetings.

Technical aspects

Extended E-support for trainees and tutors was instituted from the beginning of the E-programs, and was quickly considered a prerequisite for successful implementation. During the first years, a number of trainees demonstrated a lack of prowess and routine in IT issues, requiring basic support beyond the supplied instruction manuals. During 2005–2011 the bulk of the programs, as previously mentioned, were CD-based, requiring installation routines sometimes associated with technical incompatibility problems across systems and drivers. An IT engineer (SS) and a highly qualified technician (JD) were available at all times, which was particularly important during server malfunctions. The maintenance, development and improvement of the platform were handled by the engineer and an IT assistant.

Outcomes

Learning objectives

After the completion of each Module, the trainee evaluated how well the training had achieved its objective using a number of standardized questions with simple categorical rating scales. At completion of the Program, the trainee was also asked to rate qualitative aspects of the education material. These evaluations were continuously used by the tutors as an important feedback mechanism on the didactic quality of the E-programs.

Other

Drop-out rates, across professions and programs, were calculated from our database as:

$$\text{drop-out rate (\%)} = 100 \times \frac{\text{number registered} - \text{number completed}}{\text{number registered}}$$
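
As a minimal illustration of this definition (a sketch in Python; the figures are the overall counts reported in the Results section, and the helper function is ours, not part of the study’s software):

    def drop_out_rate(registered: int, completed: int) -> float:
        """Drop-out rate in percent, as defined above."""
        return 100.0 * (registered - completed) / registered

    # Overall figures reported in the Results: 499 of 4693 registered trainees dropped out.
    registered = 4693
    completed = registered - 499
    print(f"{drop_out_rate(registered, completed):.1f}%")  # -> 10.6%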

Data collection

Data were collected from our central database, covering January 1, 2005 to December 31, 2014, and containing information on the participants’ initials, profession, zip code, program affiliation (entry and completion dates), tutor affiliation and responses to the questionnaire on learning objectives. In addition, information regarding the tutors’ and producers’ profession, academic education and program affiliation was retrieved from the database. A technical database was accessed to analyze the number of completed Programs and Modules, and the technical structure of the Modules with regard to the number of frames, the duration of the oral presentations, the cumulated frame time of the available lectures and the use of video sequences.

Statistical methods

Normality of continuous data was analyzed by the Kolmogorov–Smirnov test and visual inspection of residual plots, and unpaired comparisons were performed with a t test or the Mann–Whitney test, as appropriate. Categorical data were analyzed by the Chi-squared test or Fisher’s exact test, as appropriate. Linear univariate regression analysis was performed by calculation of Pearson’s correlation coefficient. Statistical evaluations were performed with MedCalc Software (v. 12.07.0.0; Mariakerke, Belgium). Data subjected to multiple comparisons were corrected by the Bonferroni method, in order to decrease the likelihood of type I errors. Statistical significance was assigned at P < 0.05. Parametric and non-parametric data are presented as mean [95% confidence interval (CI)] or median (25–75% interquartile range [IQR]), respectively.
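
For readers wishing to reproduce the categorical comparisons with open-source tools instead of MedCalc, the sketch below shows how a Chi-squared test, Fisher’s exact test and a Bonferroni correction could be applied to one of the 2 × 2 tables reported in the Results (tutors vs. producers among MDs, with vs. without an academic affiliation); the choice of three pairwise comparisons is illustrative only:

    from scipy import stats

    # 2 x 2 table from the Results: MDs with an academic affiliation
    # among tutors (9/28) and producers (22/29).
    table = [[9, 28 - 9],
             [22, 29 - 22]]

    chi2, p_chi2, dof, _ = stats.chi2_contingency(table)
    odds_ratio, p_fisher = stats.fisher_exact(table)

    # Bonferroni correction: multiply by the number of comparisons (illustrative value).
    n_comparisons = 3
    p_adjusted = min(1.0, p_fisher * n_comparisons)

    print(f"Chi-squared P = {p_chi2:.4f}, Fisher P = {p_fisher:.4f}, "
          f"Bonferroni-adjusted P = {p_adjusted:.4f}")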

Results

Educational structure

An overview of the educational structure is presented in Table 1, including: the Educational Fields; the Educational Sub-Fields; the trainees’ health care professions; the trainees’ proficiency levels; the Program duration; the corresponding ECTS-points [European Credit Transfer and Accumulation System points (one ECTS-point corresponds to 25–30 h of full-time study, equivalent to one Swedish University College point or 0.60 US College Credit Hours)]; the starting year of the Program; and the number of completed Programs. The ECTS-points varied across the Programs from 0.7 to 30.0, corresponding to a duration of full-time studies ranging from 15 to 900 h (0.4–24 weeks) per Program.

The 29 different Program classifications contained a total of 731 Modules covering 438 themes (Fig. 1). The trainees completed a total of 490 Programs. Each Module presentation contained a median of 18 frames (IQR 11–26), with a median oral presentation time per frame of 0.68 min (IQR 0.38–1.12), corresponding to a total duration per presentation of 12 min (IQR 7.48–17.68). The cumulated frame time for all available lectures was 10,605 min (177 h). Video presentations were available in 30 Modules, with a total of 64 video sequences.

Demographics

Trainees

The annual number of trainees, across professions, from 2005 to 2014 is presented in Fig. 2. A total of 4693 trainees completed the Programs, while the total number of individuals was 3889, since 20.7% of the trainees participated in more than one Program. The trainees’ geographical distribution across the six Swedish health care regions is illustrated in Fig. 3. The bimodal relationship between the number of trainees (n = 3926) and the duration of courses (ECTS-points) is indicated in Fig. 4. The percentage of trainees attending courses corresponding to ≤4 ECTS-points was 61.9%, and to ≥7.5 ECTS-points 38.1%. The total number of ECTS-points for the trainee cohort was 19,438, corresponding to 534,545 full-time academic hours, equaling 324.0 standard working years (Joint Costing and Pricing Steering Group for Higher Education Institutions, U.K.) [14]. The distribution of the trainees’ professions, across Educational Fields, is illustrated in Table 4.
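
As a back-of-the-envelope check of these conversions, the sketch below reproduces the reported totals; the 27.5 h per ECTS-point (midpoint of the stated 25–30 h range) and the roughly 1650 h per standard academic working year are assumptions inferred from the reported figures rather than values stated explicitly in the text:

    ECTS_TOTAL = 19_438          # total ECTS-points for the trainee cohort (Results)
    HOURS_PER_ECTS = 27.5        # assumed midpoint of the stated 25-30 h per ECTS-point
    HOURS_PER_WORK_YEAR = 1_650  # assumed standard academic working year (TRAC-style)

    total_hours = ECTS_TOTAL * HOURS_PER_ECTS
    work_years = total_hours / HOURS_PER_WORK_YEAR

    print(f"{total_hours:,.0f} h, {work_years:.1f} working years")
    # -> 534,545 h, 324.0 working years, matching the figures above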

Fig. 2

The annual number of trainees (AN; total number = 4693), 2005–2014, across professions: registered nurses (RN; total number = 2359); radiation therapy technologists (RTT; total number = 642); medical doctors (MD; total number = 759); and social and health care assistants (SHCA; total number = 933). The numbers deviate from the total number of individual trainees (n = 3889), since 20.7% of the trainees participated in more than one Program

Fig. 3

The six health care regions in Sweden with populations (%) as per December 31, 2014, and the distribution of the number of trainees (%) in each health care region, from 2005 to 2014. The total population of Sweden at this time point was 9,750,000, and the total number of trainees was 3889 (33 subjects had their residence outside Sweden; map accessed at https://sv.wikipedia.org/wiki/Sjukv%C3%A5rdsregion; May 6, 2015)

Fig. 4

The relationship between number of trainees (n = 3926) and the duration of courses (ECTS-points; 1 ECTS-point corresponds to 25–30 h of full-time study, equivalent to one Swedish University College point or 0.60 US College Credit Hours) from 2005 to 2014

Table 4 Distribution of professions of the trainees in regard to Educational Fields (n = 4693)

Tutors and producers

The professions of the tutors (n = 78), the producers of the educational material (n = 82) and the combined producer-tutors (n = 34) are illustrated in Table 5. The numbers of individuals with an academic affiliation (i.e., a PhD degree) among MDs, RNs and ‘other professions’ were, for tutors, 9/28 (32.1%), 1/35 (2.9%) and 10/15 (66.7%), respectively. The corresponding numbers for producers were 22/29 (75.9%), 3/22 (13.6%) and 18/31 (58.1%). Across professions, comparing tutors with producers, a significantly higher proportion of individuals with an academic affiliation was observed for producers [P = 0.0001 (Chi-squared test)], most evident for MDs [75.9 vs. 32.1%; P < 0.003 (Fisher’s exact test)]. A significantly higher proportion of MDs and ‘other professions’ had an academic affiliation compared to RNs [P < 0.003 (Fisher’s exact test)]. The differences between ‘other professions’ and MDs (Table 5) did not reach significance, either for tutors or for producers [P = 0.052 and P = 0.18, respectively (Fisher’s exact test)].

Table 5 Distribution of professions of the tutors, the producers (producing education material) and the combined producer-tutors

Outcomes

Self-reported evaluation of learning objectives

From January 1, 2008 to December 31, 2014, evaluations based on a written questionnaire were delivered to the trainees upon completion of the program. Questionnaires were available and evaluated from 72.1% (2642/3666) of the trainees. Fully completed questionnaires were obtained from 96.5% of the respondents [mean across programs (95% CI 94.2–98.7%)]. Statistics for the 10 individual questions (A–J) are presented below:

  • A. How would you rate the program as a whole?

The Programs were, across professions, overall rated as excellent by 68.6% and as good by 30.6% of the responders. SHCAs demonstrated significantly higher overall ratings of the Programs than RNs and MDs (Tables 6, 7).

Table 6 Tabular data for self-reported evaluation of learning objectives across professions (total number of trainees for each profession)
Table 7 Statistical comparisons of self-reported, questionnaire-based outcomes across trainees’ professions (Chi-squared tests)
  • B. Will you be able to use what you’ve learnt in everyday clinical practice?

The clinical applicability of the Programs was rated as “to a very high degree” by 45.2% and “to a high degree” by 49.2% of the responders. MDs demonstrated significantly lower ratings compared to RNs, while no differences compared to SHCAs were seen (Tables 6, 7).

  • C. Would you recommend the program to a colleague in a similar situation as yours?

The recommendability of the Programs was rated as “to a very high degree” by 69.5% and “to a high degree” by 26.6% of the responders. Interestingly, RNs and SHCAs rated the recommendation value of the programs significantly higher than the MDs (Tables 6, 7).

  • D. How did you experience the workload during the program?

The workload was experienced as “appropriate” to “very low” by 62.4% and as “very high” to “high” by 37.6%. Interestingly, the workload during the programs was perceived as relatively higher by the RNs and SHCAs, compared to MDs (Tables 6, 7).

  • E. Evaluate the importance of the recommended literature in the program. F. Evaluate the importance of the recommended scientific publications in the program.

The importance of the recommended literature and the scientific publications (e.g., clinical studies, chapters from textbooks) was considered of “very high importance” to “high importance” by 44.8%, and of “moderate importance” by 25.2% of the responders. Both RNs and SHCAs evaluated the relative importance of the recommended literature significantly higher than the MDs (Tables 6, 7). In contrast, there were no statistical differences across professions in regard to the value of the recommended scientific publications (Tables 6, 7).

  • G. Evaluate the importance of the lectures in the program.

Correspondingly, the importance of the recorded lectures was considered of “very high importance” to “high importance” by 89.8%, and of “moderate importance” by 6.6% of the responders. RNs and SHCAs rated the importance of the lectures significantly higher than MDs (Tables 6, 7). The lectures were rated of “very high importance” or “high importance” by 90.0% of RNs and 91.4% of SHCAs, and by 87.1% of the MDs.

  • H. Evaluate the importance of the links in the program. I. Evaluate the importance of the exercises and the feedback in the program. J. Evaluate the importance of the physical meeting(s) in the program.

Correspondingly, the importance of the IT-links (H), the exercises and feedback from the tutor (I) and of the physical meetings (J) were considered of “very high importance” to “high importance” across professions, by 43.8, 83.9 and 91.7%, respectively. The importance of the links in the program was rated significantly higher by the SHCAs than by MDs and RNs. The importance of the exercises and the feedback was rated significantly higher among SHCAs than other professions (Tables 6, 7). Although all the professions rated the physical meeting(s) in the program of “very high importance” or “high importance”, with no statistical significance across professions, the relative ranking was highest for SHCAs (86.6%), followed by RNs (82.5%) and MDs (72.5%) (Tables 6, 7).

Drop-out rates

From January 1, 2005 to December 31, 2014, the overall drop-out rate, across Educational Fields and professions, was 499/4693 (10.6%). The drop-out rates were significantly lower (P < 0.0001; Chi-squared tests) for Programs in Chemotherapy (2.9%) and Oncology (3.6%) compared to Radiotherapy (17.5%) and Symptom Therapy (18.7%). The overall drop-out rates were significantly higher (P < 0.0001; Chi-squared tests) for MDs (18.7%) and SHCAs (21.4%) compared to RNs (4.3%) and RTTs (8.6%). The lowest drop-out rate was seen for RNs (P < 0.0001; Chi-squared tests), compared to the other professions. Interestingly, linear regression analysis demonstrated a highly significant annual increase in the overall, relative drop-out rate [r = 0.77 (95% CI 0.27–0.94); P = 0.0095; Pearson’s correlation].
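
A minimal sketch of this trend analysis, i.e., the Pearson correlation between calendar year and annual drop-out rate, is shown below; the annual rates are hypothetical placeholders for illustration only, since the study’s year-by-year figures are not reproduced here:

    from scipy import stats

    years = list(range(2005, 2015))
    # Hypothetical annual drop-out rates (%) for illustration only.
    annual_dropout_pct = [5.0, 6.0, 7.5, 8.0, 9.5, 10.0, 11.5, 12.5, 14.0, 15.5]

    r, p_value = stats.pearsonr(years, annual_dropout_pct)
    print(f"r = {r:.2f}, P = {p_value:.4f}")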

Discussion

This study presents nationwide experience with multi-professional, educational E-programs in oncology during a 10-year period, including nearly 5000 participants. Self-reported outcomes, assessed at completion of the education, revealed high overall contentment with, and perceived clinical usefulness of, the E-programs across professions. However, the descriptive design of the study does not allow any firm didactic conclusions to be drawn about the E-programs; at best, the study should be considered hypothesis-generating, presenting valuable high-volume data for future, more scientifically rigorous research.

Clinical oncology is an expanding branch of medicine, and includes not only the traditional disciplines such as chemotherapy, hormone therapy, immunotherapy and radiotherapy, but also a number of adjuvant specialties, such as palliative medicine, oncological pain management, psychosocial oncology, supportive care and surgical oncology. Educational measures in clinical oncology thus require both multi-disciplinary and multi-professional approaches. The present learning paradigm reflects this well, mirrored not only in the different professions and proficiency records of the trainees, but also in the professional records of the tutors and producers. The educational material presented in our E-programs benefits from the homogeneous design across professions and disciplines, facilitating clinical cross-professional and cross-disciplinary collaboration. The statistically significant differences in attitudes and perceived utility of the E-programs between the professions (Table 7) are indeed interesting, but the authors have not been able to find any systematic analysis in the literature of differential attitudes towards E-learning between health care professions. Although very pertinent for detailed didactic, scholarly discussions, these observations await further explorative analyses.

The main advantages of the study are the inclusion of a large number of participants (n = 4693) attending a considerable number of E-programs (n = 490), with individual program durations ranging from days to months (full-time: 0.4–24 weeks). The average drop-out rate of 10.6% was considerably lower than previously described for E-programs, ranging between 25 and 60% [15, 16]. A course retention near 90% is an indirect measure of the contentment and perceived didactic value of the E-programs and is thus a critical measure of training efficiency. While the low drop-out rates are likely of multifactorial origin, detailed prospective studies of our E-learning programs are needed to explain these favorable results, particularly regarding the inter-professional differences observed, i.e., significantly lower drop-out rates for RNs and RTTs compared to MDs and SHCAs. However, the interesting, but disappointing, observation of a highly significant linear annual increase in overall, relative drop-out rates is likely explained by the rapid expansion of the programs, going from 150 trainees in 2005 to 700 trainees in 2014 (Fig. 2), perhaps indicating an organization with “growing pains.” The hierarchical structure, considered important during the early development of the E-learning programs, was successively replaced with a more horizontal management structure, due to the increased number of programs, trainees, tutors and producers, also reflected in the wider geographical catchment area.

An important aspect in the development and design of E-learning programs is a tendency towards the use of a “richer” medium over time, progressing from text to a graphical platform, and from audio to video recordings, in an attempt to increase the effectiveness of the programs [17, 18]. One study indicated that a “richer” medium content positively correlated with the concentration efforts of the user, but ambiguous results were obtained for the perceived usefulness of the program [17]. Another study observed that the relationship between the “richness” of the media choice and the effectiveness was moderated by the learning domain of the program and the learning styles of the user [18]. In the design of our E-learning programs from 2005 to 2014, transitions from audio to video recordings were not implemented to any higher degree, since our empirical data indicated that no gain in didactic quality or trainee satisfaction was obtained by use of a “richer” medium. More studies examining the influence of different media for the content of E-learning programs, regarding the principal outcomes of educational efficiency, perceived usefulness, satisfaction scores and cost-efficiency measures, are clearly needed [18].

Limitations

First, an important limitation of this study is that the major outcome, the self-reported evaluations, only contained data from 2008 to 2014, and, further, that these data only comprised 72% of the total number of trainees in this period. Although emphasis on quality assessment was considered essential from the beginning, a number of different evaluation questionnaires were used over the years, impeding the quality of the data sampling. Another reason for the incomplete data collection is that a number of programs were locally managed and supervised, and thus, not infrequently, beyond our quality control. Second, this retrospective study does not include any control group: a meaningful comparison with other programs, although highly relevant, is therefore not possible at this stage of research. This makes it even more difficult to draw conclusions about the potential advantages of E-learning compared to face-to-face lectures or even computer-based instruction. Comparisons between E-learning and traditional classroom teaching have been evaluated in several studies [19–22], but the published experience in oncology is rather limited [23]. However, the systematic review and meta-analysis by Lahti et al. [9] did not demonstrate any statistical differences between E-learning and traditional learning groups regarding knowledge, skills or satisfaction. In a recent randomized controlled trial comparing live lecture, internet-based and computer-based instruction [24], it was observed that, even though interactive internet-based instruction is a difficult and time-consuming teaching method, its integration into medical teaching was recommended. Further, a prospective randomized controlled study [25], including physiotherapy students participating in an oncology course and comparing traditional classroom teaching with E-learning, concluded that the use of E-learning in oncology is a feasible method of teaching. Moreover, Alfieri et al. [10] showed that the use of interactive E-learning for radiation oncology is an effective method to improve the radiologic anatomy knowledge and treatment planning skills of radiation oncology residents. Another advantage, which should be mentioned, is that E-learning may increase cost-efficiency, since the same E-learning program can be transmitted to a larger number of students [26], reducing the demand for classroom teachers and offering more flexibility [22]. Third, only self-reported outcomes, vis-à-vis objective outcomes, are available, and obviously, subjective responses are vulnerable to a number of biases [27]. The printed questionnaire was a structured interview, administered to the trainees during a physical meeting with the tutor(s) upon completion of the program. The questionnaire was answered independently and anonymously by the trainee, and the scoring sheet was collected manually by the tutor. Acquiescence bias, referred to as “yea”-saying or “nay”-saying, i.e., a stereotyped response defined as a tendency to agree with attitude statements regardless of content [27], may have confounded the outcome. In addition, acquiescence bias is likely augmented by our use of similar categorical rating scales among items that are not conceptually related. Extreme-responding bias, a behavior of only selecting anchor values, as well as leniency bias, a behavior of projecting social relationships onto the evaluation, could also have contributed to the skewed distribution seen in the response ratings.
On the other hand, it could be argued that the self-reported evaluations were accumulated from a large number of participants, professions and programs, covering different topics and time spans, securing a broad platform of experience. The long-term impact of E-learning on clinical practice has been examined in a number of randomized controlled studies [28–31]. Some of these studies observed improvements in the knowledge, skills and clinical behavior of the personnel [30, 31], while others only demonstrated a slight, if any, advantage over conventional learning [28, 29]. This is corroborated by two recent systematic reviews, which concluded that there was insufficient evidence regarding the effectiveness of E-learning on healthcare professional behavior or patient outcomes [9, 32]. Fourth, the learning experiences have not yet been tested in a clinical benchmarking scenario against conventional on-location programs.

Conclusions

This descriptive study presents high-volume data covering 10 years of nationwide, multi-professional and multidisciplinary experience with oncological E-learning. The consistently high fulfillment ratings of the learning objectives, the low drop-out rates and the wide geographical catchment area suggest that E-learning programs using contemporary techniques are feasible and complementary pathways to improve pre- and post-graduate training in oncology. Further, given the contemporary demands placed on the health workforce regarding the professional responsibility of maintaining medical competence in practice, E-learning may play an important role in overcoming this challenge [32].

Finally, these hypothesis-generating data demonstrate that our educational paradigm has been well received across disciplines, professions and proficiency levels. However, it is also evident that prospective, high-volume comparative studies, ascertaining the objective didactic value of the E-programs, are needed.

Abbreviations

APRN:

advanced practice registered nurses

BAS:

basic level

CD:

compact disc

ECTS:

European Credit Transfer and Accumulation System

EXP:

experienced level

HCA:

health care assistants

HVE:

higher vocational education

IGRT:

image-guided radiotherapy

IQR:

inter-quartile range

IT:

information technology

MCQ:

multiple choice questions

MD:

medical doctors

RES:

residents (MD)

RN:

registered nurses

RT:

radiotherapy

RTT:

radiation therapy technologists

SHCA:

social and health care assistants

SPEC:

specialists (MD)

VC:

vocational clinical training

References

  1. Lewis KO, Cidon MJ, Seto TL, Chen H, Mahan JD. Leveraging e-learning in medical education. Curr Probl Pediatr Adolesc Health Care. 2014;44:150–63.
  2. Butcher K, Bamford R, Burke D. Innovation in e-learning: learning for all. Ecancermedicalscience. 2014;8:467.
  3. Suemoto CK, Ismail S, Correa PC, Khawaja F, Jerves T, Pesantez L, et al. Five-year review of an international clinical research-training program. Adv Med Educ Pract. 2015;6:249–57.
  4. Kellogg S. Distance learning: online education. Nature. 2011;478:417–8.
  5. National Center for Education Statistics: fast facts: distance learning 2016. https://nces.ed.gov/fastfacts/display.asp?id=80.
  6. Allen IE, Seaman J. Making the grade. Online education in the United States, 2006. Sloan Consortium and Babson Survey Research Group. http://sloanconsortium.org/publications/survey/pdf/Making_the_Grade.pdf.
  7. Cendan J, Lok B. The use of virtual patients in medical school curricula. Adv Physiol Educ. 2012;36:48–53.
  8. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300:1181–96.
  9. Lahti M, Hatonen H, Valimaki M. Impact of e-learning on nurses’ and student nurses’ knowledge, skills, and satisfaction: a systematic review and meta-analysis. Int J Nurs Stud. 2014;51:136–49.
  10. Alfieri J, Portelance L, Souhami L, Steinert Y, McLeod P, Gallant F, et al. Development and impact evaluation of an e-learning radiation oncology module. Int J Radiat Oncol Biol Phys. 2012;82:e573–80.
  11. Swedish Data Protection Authority: The Personal Data Act 2016. http://www.datainspektionen.se/in-english/legislation/the-personal-data-act/.
  12. Cook DA, Dupras DM. A practical guide to developing effective web-based learning. J Gen Intern Med. 2004;19:698–707.
  13. Felder E, Fauler M, Geiler S. Introducing e-learning/teaching in a physiology course for medical students: acceptance by students and subjective effect on learning. Adv Physiol Educ. 2013;37:337–42.
  14. Higher Education Funding Council for England. Joint Costing and Pricing Steering Group: Transparent Approach to Costing (TRAC) guidance. http://www.jcpsg.ac.uk/guidance/.
  15. Levy Y. Comparing dropouts and persistence in e-learning courses. Comput Educ. 2007;48:185–204.
  16. Nistor N, Neubauer K. From participation to dropout: quantitative participation patterns in online university courses. Comput Educ. 2010;55:663–72.
  17. Liu SH, Liao HL, Pratt JA. Impact of media richness and flow on e-learning technology acceptance. Comput Educ. 2009;52:599–607.
  18. Sahasrabudhe V, Kanungo S. Appropriate media choice for e-learning effectiveness: role of learning domain and learning style. Comput Educ. 2014;76:237–49.
  19. Bains M, Reynolds PA, McDonald F, Sherriff M. Effectiveness and acceptability of face-to-face, blended and e-learning: a randomised trial of orthodontic undergraduates. Eur J Dent Educ. 2011;15:110–7.
  20. Al-Riyami S, Moles DR, Leeson R, Cunningham SJ. Comparison of the instructional efficacy of an internet-based temporomandibular joint (TMJ) tutorial with a traditional seminar. Br Dent J. 2010;209:571–6.
  21. Padalino Y, Peres HH. E-learning: a comparative study for knowledge apprehension among nurses. Rev Lat Am Enfermagem. 2007;15:397–403.
  22. Horiuchi S, Yaju Y, Koyo M, Sakyo Y, Nakayama K. Evaluation of a web-based graduate continuing nursing education program in Japan: a randomized controlled trial. Nurse Educ Today. 2009;29:140–9.
  23. Frosch DL, Kaplan RM, Felitti VJ. A randomized controlled trial comparing internet and video to facilitate patient education for men considering the prostate specific antigen test. J Gen Intern Med. 2003;18:781–7.
  24. Mojtahedzadeh R, Mohammadi A, Emami AH, Rahmani S. Comparing live lecture, internet-based & computer-based instruction: a randomized controlled trial. Med J Islam Repub Iran. 2014;28:136.
  25. da Costa Vieira RA, Lopes AH, Sarri AJ, Benedetti ZC, de Oliveira CZ. Oncology E-learning for undergraduate: a prospective randomized controlled trial. J Cancer Educ. 2016. doi:10.1007/s13187-015-0979-9.
  26. Thompson D, Wolf AM. The medical-care cost burden of obesity. Obes Rev. 2001;2:189–97.
  27. Podsakoff PM, MacKenzie SB, Lee JY, Podsakoff NP. Common method biases in behavioral research: a critical review of the literature and recommended remedies. J Appl Psychol. 2003;88:879–903.
  28. Kontio R, Lahti M, Pitkanen A, Joffe G, Putkonen H, Hatonen H, et al. Impact of eLearning course on nurses’ professional competence in seclusion and restraint practices: a randomized controlled study (ISRCTN32869544). J Psychiatr Ment Health Nurs. 2011;18:813–21.
  29. Kontio R, Hatonen H, Joffe G, Pitkanen A, Lahti M, Valimaki M. Impact of eLearning course on nurses’ professional competence in seclusion and restraint practices: 9-month follow-up results of a randomized controlled study (ISRCTN32869544). J Psychiatr Ment Health Nurs. 2013;20:411–8.
  30. Lahti M, Kontio R, Pitkanen A, Valimaki M. Knowledge transfer from an e-learning course to clinical practice. Nurse Educ Today. 2014;34:842–7.
  31. Lahti ME, Kontio RM, Valimaki M. Impact of an e-learning course on clinical practice in psychiatric hospitals: nurse managers’ views. Perspect Psychiatr Care. 2016;52:40–8.
  32. Sinclair PM, Kable A, Levett-Jones T, Booth D. The effectiveness of internet-based e-learning on clinician behaviour and patient outcomes: a systematic review. Int J Nurs Stud. 2016;57:70–81.


Authors’ contributions

Contributed to conception and design of the study (JD, MW, SS, PF, EK), data collection (JD, SS), data processing (JD, SS, MW), manuscript writing (JD, MW) and final approval of the manuscript (JD, SS, PF, EK, MW). All authors read and approved the final manuscript.

Acknowledgements

We would like to express our gratitude to Eriksson AD and Dahl AK, Regional Cancer Center, Gothenburg, Sweden, for help with regional data acquisition. Funding was provided by the Department of Clinical Sciences, Lund University, Sweden.

Competing interests

JD and SS are principal shareholders of LäraNära AB. SS is an employee of LäraNära AB. MUW has received salaries from LäraNära AB for development and production of modules.

Availability of data and materials

The raw data generated/analysed during the study can be found in the Additional file 1.

Ethics approval and consent to participate

The E-programs and the associated data-bases comply with Swedish laws and regulations stipulated in the Personal Data Act (1998:204), aiming to prevent the violation of personal integrity in the processing of personal data [11]. The study therefore did not need formal approval of the committee on health research ethics.

Funding

The study is funded solely by departmental resources.

Author information


Corresponding author

Correspondence to Mads U. Werner.

Additional file

13104_2017_2372_MOESM1_ESM.xlsx

Additional file 1. Individual raw data on groups, chronology, the number of participants, completion rates, evaluation responses, geographical distribution of trainees, ECTS-points and teacher demographics.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Degerfält, J., Sjöstedt, S., Fransson, P. et al. E-learning programs in oncology: a nationwide experience from 2005 to 2014. BMC Res Notes 10, 39 (2017). https://doi.org/10.1186/s13104-017-2372-8

