SonoGames: sounds of the right kind introducing gamification into radiology training

Abstract

Background

Radiology, compared with other fields of medicine, has lagged in incorporating modern training modalities such as gamification and simulation into its teaching curriculum.

Objective

This study aims to evaluate the effectiveness of simulation-based teaching combined with gamification. Bandura's conception of self-efficacy was used to provide a qualitative assessment of participants' learning process through the training event. A modified competitive game-based teaching methodology was used in an experimental study conducted for radiology residents. The workshop was divided into two sessions: the first consisted of three interactive didactic lectures, followed by three competitive rounds. All participants were required to fill out pre- and post-training self-efficacy questionnaires along with an activity evaluation form.

Results

Significant improvements in self-efficacy scores were found for the simulation-based knowledge-assessment and hands-on stations. A significant association was also found between gender and knowledge assessment in communication skills (p = 0.054), professionalism (p = 0.004), and general knowledge (p = 0.018). Similarly, a noteworthy association was found between gender and all hands-on skills. In conclusion, the study reported an overall increase in knowledge, with post-test scores higher than pre-test scores, attributable to the use of gamification in combination with simulation-based teaching, which indicates a positive role in clinical training. However, further work is needed to improve the process of integrating simulation into participants' clinical training.

"By sticking it out through tough times, people emerge from adversity with a stronger sense of efficacy."

–Albert Bandura

Encyclopedia of Human Behavior, 1994

Introduction

Medical education is the dissemination of knowledge to healthcare professionals about the real-world scenarios they might face in their respective fields [1]. Practical training brings with it some dilemmas. One such conundrum is ensuring the safety and wellbeing of patients while providing optimal care; the other side of the coin is that trainees need repeated exposure to better understand and respond to clinical situations [2]. Another factor is the need for doctors to be well versed in teamwork and good communication skills, on top of the basic requirements of knowledge and skill [3, 4].

Medical education cannot and should not lag behind other fields of learning, so incorporating simulation-based training (SBT) into clinical learning is essential. Simulation is a technique that helps replace and/or augment the learning experience gained from real situations. SBT is immersive in character, aiming to draw participants into a task or setting as if they were experiencing it in an actual clinical environment [5, 6].

Clinical SBT is an ideal solution to the dilemma in medical education of patient safety versus the learning and exposure of the doctor, since it diminishes the risk to patients while providing a life-like scenario. Techniques used in SBT serve both training and the evaluation of competencies [7, 8]. SBT may seem novel, but it has long been used in aviation and the military, and within medicine it has been used in anesthesiology [2, 5, 6]. The impact of simulation on how medicine is taught has already led to curricular changes for healthcare providers, giving participants the opportunity to practice, develop, and master skills through a process of trial and repetition [9,10,11]. SBT also allows learners to refresh their skills or to practice unique and uncommon clinical presentations, and to be prepared for when they arise, without putting patients at risk. This depiction of textbook conditions adds a layer of intrigue to the scenario while heightening enthusiasm. Many educationists and pioneers believe that SBT increases efficiency, skill, and knowledge [12,13,14,15].

The use of simulation for teaching and training in radiology dates back at least as far as the case conference, which is part of radiology training. This method introduced two distinct types of simulation, visual and auditory. Images are displayed to participants, who review and assess them and then work towards a differential diagnosis and treatment. This mirrors what a radiologist experiences in a routine day, adding high fidelity to the exercise. With the evolution of technology, mannequins have been used as simulators to augment the training process [15,16,17,18].

In medicine, and in radiology in particular, sifting through images and reports can numb the individual, resulting in a lack of concentration and a disconnection from the knowledge being disseminated. It was therefore identified that a non-conventional method of teaching, gamification, had the potential to be effective for students and residents [19]. Many institutes have also implemented game-based (GB) educational systems, which were enthusiastically received by participants and showed increased understanding of ultrasound in clinical practice alongside improved capabilities [20].

The main obstacle in simulation-based education (SBE) is the evaluation of its outcomes and the assessment of its effectiveness. Hence, Bandura's method of assessing self-efficacy (SE) has been proposed [21]. The world of education has also seen a shift from routine teaching methods towards more hands-on and interactive modalities that incorporate entertaining ways to learn, such as holding competitions and converting the lecture room into a game room, so that students both enjoy and become more engaged in their learning [22].

The Centre for Innovation in Medical Education (CIME) at The Aga Khan University Hospital (AKUH) proposed to incorporate gamification as a teaching modality in a fun and interactive way by holding the first-ever SonoGames (SG) in Pakistan, where radiology residents test their knowledge against each other while making the whole process enjoyable.

Study implication and objective

In Pakistan, GB simulation training is neither widely available nor prevalent. This study provides a motive for healthcare institutions to work on understanding and integrating SBT programs across all health science specialties. The objective of this study is to assess radiology residents' knowledge, hands-on skills, and integration of knowledge into clinical decision making. Furthermore, it aims to evaluate participants' SE as a measure of competency using a GB simulation training program.

Main text

Methodology

Study design, population and setting

An experimental study was conducted to assess the perception, technical skills, knowledge, and SE of SG participants. The target population was College of Physicians and Surgeons Pakistan (CPSP)-registered radiology residents from four hospitals in Karachi. SG was conducted at CIME, AKUH. An exemption was obtained from the institutional ethics review committee.

Sampling method and sample size

Non-probability purposive sampling was used, with a sample size of 30 residents who participated in SG, assuming a 50% prevalence rate of SE with a 95% confidence interval.
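
For context, the conventional single-proportion sample-size formula underlying such assumptions is n = Z^2 p(1 - p) / e^2, where p is the assumed prevalence (0.5 here), Z is about 1.96 for a 95% confidence interval, and e is the margin of error. The margin of error is not reported in the paper; the Python sketch below is purely illustrative and shows that a margin of roughly 18% would correspond to about 30 participants.

    import math

    def sample_size(p: float, z: float, e: float) -> int:
        """Cochran's formula for estimating a single proportion."""
        return math.ceil(z ** 2 * p * (1 - p) / e ** 2)

    # Assumptions from the text: p = 0.5 (50% prevalence), 95% CI (z = 1.96).
    # The margin of error e is not reported; 0.18 is an illustrative value only.
    print(sample_size(p=0.5, z=1.96, e=0.18))  # -> 30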

Inclusion criteria
  • Radiology residents registered with CPSP, who had yet to pass any part of their FCPS Part II examination.

  • Participants who registered for the workshop.

Exclusion criteria
  • Participants who did not attend the lecture and all three rounds, including the briefing, simulation, and debriefing sessions.

Self-efficacy and potential implications

SE is the belief we have in our abilities to meet challenges and complete a task successfully [21]. The tool used to evaluate SE was a pre- and post-training questionnaire using a 0–100 scale [23]. Both questionnaires had similar questions and response options. The teaching design put participants through rigorous sessions of knowledge recall under pressure and in time-sensitive environments.
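
As a minimal, hypothetical sketch of how 0–100 ratings of this kind yield the pre/post difference scores reported in the Results, the Python snippet below pairs each participant's ratings for a single item and computes the mean change; the item and values are invented for illustration and are not data from the study.

    # Hypothetical pre- and post-training self-efficacy ratings (0-100 scale) for one item.
    pre = [40, 55, 60, 35, 50]    # ratings before the workshop
    post = [65, 80, 75, 70, 72]   # ratings after the workshop

    # Per-participant change and the mean difference for the item.
    changes = [b - a for a, b in zip(pre, post)]
    mean_difference = sum(changes) / len(changes)
    print(changes, mean_difference)  # [25, 25, 15, 35, 22] 24.4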

Data collection and analysis

Written consent was obtained from all 30 participants. They were instructed to fill out the pre-training questionnaire assessing their expertise and knowledge before practicing. The questionnaire was validated by the faculty of radiology, who also carried out a psychometric evaluation at their discretion. Groups were subsequently debriefed about their performance.

After completing the event, participants were asked to fill out the post-training self-efficacy questionnaire (SEQ). This helped them reflect on the knowledge they had gained so that they could compare their SE before and after the session, and they also completed an activity evaluation form.

Data were entered into SPSS (Statistical Package for the Social Sciences) version 19.0. Frequencies and percentages were reported for quantitative variables, whereas qualitative variables were reported as statements. Independent and paired t-tests were used to assess the statistical significance of differences in pre- and post-training self-efficacy scores.
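
The analysis itself was run in SPSS; the Python/SciPy sketch below only illustrates the same two tests on hypothetical data: a paired t-test comparing pre- versus post-training scores, and an independent t-test comparing change scores between two groups (for example, by gender).

    from scipy import stats

    # Hypothetical self-efficacy scores (0-100) for eight participants.
    pre = [40, 55, 60, 35, 50, 45, 58, 62]
    post = [65, 80, 75, 70, 72, 68, 79, 85]

    # Paired t-test: did scores change from pre- to post-training?
    t_paired, p_paired = stats.ttest_rel(post, pre)

    # Independent t-test: do change scores differ between two groups (e.g. by gender)?
    change = [b - a for a, b in zip(pre, post)]
    group_a, group_b = change[:4], change[4:]
    t_ind, p_ind = stats.ttest_ind(group_a, group_b)

    print(p_paired, p_ind)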

Planning and preparation

CIME, in collaboration with the Department of Radiology, arranged SG. The majority of the information was drawn from 'SonoGames: effect of an innovative competitive game on education, perception, and use of point-of-care ultrasound' [20] and 'SonoGames: an innovative approach to emergency medicine resident ultrasound education' [24].

A team of five radiologists from the Department of Radiology at AKUH was selected to act as organizer, coordinator, moderator, and judges. The team structured SG as three interactive lectures followed by three rounds conducted over four hours. All competition questions and simulation scenarios were written and reviewed by the team. The organizing radiologists were ably supported by the CIME technical team, who kept the simulators running smoothly, while the media and marketing team promoted the event.

Competition structure

All three rounds were carried out on the same day to remove any bias from teams having extra time to study and gain an unfair advantage. Teams were challenged in timed trials made up of unique and innovative GB rounds testing their skills and knowledge.

At the end, a grand debriefing and feedback session with all participants was conducted. The winning team was awarded medals, and all participants received certificates for 4.00 Accreditation Council for Continuing Medical Education (AACME) credit hours.

Results

Demographic details

Thirty residents took part in the workshop, of whom 17 were female and 13 were male. Eight participants were in their 1st or 2nd year of residency, and 22 were in their 2nd or 3rd year.

SE score in relation to knowledge assessment and hands-on station

A significant association was found for all SE questions, highlighting that SBT together with gamification has a positive influence on participants' SE. Pre- and post-training scores in medical knowledge showed a significant change, with a p-value of < 0.001. Scores for reading ultrasound images and making a provisional diagnosis were also significant, each with a p-value of < 0.001. However, the difference between pre- and post-training scores for reading an X-ray (13) and making a provisional diagnosis (13) was smaller than that for medical knowledge (24).

The second part of the questionnaire covered SE in relation to the activities performed at the hands-on stations. A significant association (p < 0.001) was found for all self-efficacy variables. The largest difference in SE score was for performing a hip ultrasound on a neonate (34), while making a final diagnosis showed the smallest difference (16). Details can be found in Table 1.

Table 1 Self-efficacy score in relation to knowledge and hands-on assessment

SE score of knowledge assessment and hands-on skills in relation to gender

Considering variation between genders in responses to the pre- and post-training SE questionnaire, the parameter of medical knowledge showed SE mean difference scores of 26.1 for males and 22.4 for females. The second parameter, practice-based learning and improvement, gave mean values of 26.9 for males and 19.41 for females. The third variable, interpersonal and communication skills, gave a mean of 20.7 in males and 12.35 in females, with a p-value of 0.054. For professionalism, mean values were 20.8 in men and 7.65 in women, with a p-value of 0.004. The last section, general knowledge, gave mean scores of 20.13 for men and 10.88 for women, with a p-value of 0.018.

At the Fast Chase skills station, the SE mean difference was 27.18 for male participants and 18.33 for females, with a significance of 0.024. The Blind Partner skills station gave a male score of 23.91 and a female score of 10.83, with a significance of 0.001. In communication skills, the SE mean difference was 26.54 for males and 12.65 for females, with a significance of 0.001. Details are given in Table 2.

Table 2 Self-efficacy score of knowledge assessment and hands-on skills in relation to gender

Activity assessment of feedback

Seventeen participants stated that the interactive tutorials were informative, whereas 14 participants said that the simulation activities were very challenging. Further details are reported in Fig. 1.

Fig. 1 Activity assessment of feedback

All participants reported that the program met their expectations, that the sessions were applicable to their job, and that they would recommend the program to others. Further details are reported in Additional file 1.

Discussion

Our results were noteworthy, as we found that participation in SG had a positive effect on residents' perception and understanding across knowledge and clinical skills. Of our participants, 73% reported that SG helped them acquire new knowledge; a similar figure (80% of participants) was reported by a study that recruited Emergency Medicine (EM) residents [20]. A pilot study conducted with EM interns reported similar results to ours, with 81% of participants stating an improvement in ultrasound knowledge [25].

The GB simulation activities conducted in SG were rated as excellent by 53% and very good by 40% of participants. In the EM residents study, 90% of participants said that hands-on games were an effective educational modality [20]. A study conducted at Stanford University states that activities like SG are beneficial as a training platform for those who have just started their residency [25].

Our study also helped residents master the art of communication. A significant association between communication, professionalism, and SE scores was reported for all participants. SG contributed to improving the communication skills of the EM interns in the pilot study; the authors further stated that EM requires efficient communication skills and that this teaching approach helped the interns progress through training [25]. A study conducted in Boston reported similar findings, with radiology residents and fellows showing an increase in post-training mean communication scores. The same study also stated that participants rated the quality of the lecture highly, whereas 56.6% of residents in our study said that the quality of the tutorials was excellent [17].

In our study, the post-training mean knowledge assessment score was higher than the pre-training mean score for all participants, similar to Chen et al., who reported an increase in post-test scores of 10 points on average [26].

Positive feedback was given by all participants. “Event was good, and I thoroughly enjoyed this approach of learning” said one female resident. A participant who worked for a private hospital said “This idea is novel for us as we do not have access of learning through simulation. This course has helped me in increasing my ultrasound skills”.

In conclusion, the study found an overall increase in knowledge, with post-test scores higher than pre-test scores. The use of gamification in combination with SBT plays a positive role in clinical training. However, further work is needed to improve the process of integrating simulation into participants' clinical training.

Limitations

  • Confined to data from one specialty.

  • Not all participants were familiar with SBT and simulators.

  • The number of participants was low.

  • Results cannot be generalized to the target population.

  • Changes in the variables were measured soon after the event.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

SBT: Simulation-based training

SBE: Simulation-based education

SE: Self-efficacy

SG: SonoGames

CIME: Centre for Innovation in Medical Education

AKUH: Aga Khan University Hospital

EM: Emergency medicine

PMDC: Pakistan Medical and Dental Council

CPSP: College of Physicians and Surgeons Pakistan

CHK/DUHS: Civil Hospital Karachi/Dow University of Health Sciences

SUIT: Sindh Institute of Urology and Transplantation

LNH: Liaquat National Hospital

AACME: Accreditation Council for Continuing Medical Education

References

  1. Quintero GA. Medical education and the healthcare system-why does the curriculum need to be reformed? BMC Med. 2014;12(1):1–4.

  2. Jha AK, Duncan BW, Bates DW. Simulator-based training and patient safety. In: Making health care safer: a critical analysis of patient safety practices. 2001. p. 511.

  3. O'Leary KJ, Ritter CD, Wheeler H, Szekendi MK, Brinton TS, Williams MV. Teamwork on inpatient medical units: assessing attitudes and barriers. Qual Saf Health Care. 2010;19(2):117–21.

  4. Scherer YK, Myers J, O'Connor TD, Haskins M. Interprofessional simulation to foster collaboration between nursing and medical students. Clin Simul Nurs. 2013;9(11):e497–505.

  5. Gaba D. The human work environment and anesthesia simulators. In: Anesthesia. 5th ed. 1999. p. 2613–68.

  6. Gaba DM. The future vision of simulation in health care. BMJ Qual Saf. 2004;13(suppl 1):i2–10.

  7. Lateef F. Simulation-based learning: just like the real thing. J Emerg Trauma Shock. 2010;3(4):348.

  8. Al-Elq AH. Simulation-based medical teaching and learning. J Fam Community Med. 2010;17(1):35.

  9. Lateef F. What's new in emergencies, trauma, and shock? Role of simulation and ultrasound in acute care. J Emerg Trauma Shock. 2008;1(1):3.

  10. Shapiro M, Morey J, Small S, Langford V, Kaylor C, Jagminas L, et al. Simulation based teamwork training for emergency department staff: does it improve clinical team performance when added to an existing didactic teamwork curriculum? BMJ Qual Saf. 2004;13(6):417–21.

  11. Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul Gaming. 2001;32(2):175–93.

  12. Grantcharov TP, Kristiansen VB, Bendix J, Bardram L, Rosenberg J, Funch-Jensen P. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91(2):146–50.

  13. Gaba DM, Howard SK, Flanagan B, Smith BE, Fish KJ, Botney R. Assessment of clinical performance during simulated crises using both technical and behavioral ratings. Anesthesiology. 1998;89(1):8–18.

  14. Wang CL, Schopp JG, Petscavage JM, Paladin AM, Richardson ML, Bush WH. Prospective randomized comparison of standard didactic lecture versus high-fidelity simulation for radiology resident contrast reaction management training. Am J Roentgenol. 2011;196(6):1288–95.

  15. Gaca AM, Frush DP, Hohenhaus SM, Luo X, Ancarana A, Pickles A, et al. Enhancing pediatric safety: using simulation to assess radiology resident preparedness for anaphylaxis from intravenous contrast media. Radiology. 2007;245(1):236–44.

  16. Tubbs RJ, Murphy B, Mainiero MB, Shapiro M, Kobayashi L, Lindquist D, et al. High-fidelity medical simulation as an assessment tool for radiology residents' acute contrast reaction management skills. J Am Coll Radiol. 2009;6(8):582–7.

  17. Sica G, Barron D, Blum R, Frenna T, Raemer D. Computerized realistic simulation: a teaching module for crisis management in radiology. AJR Am J Roentgenol. 1999;172(2):301–4.

  18. Tofil NM, White ML, Grant M, Zinkan JL, Patel B, Jenkins L, et al. Severe contrast reaction emergencies: high-fidelity simulation training for radiology residents and technologists in a children's hospital. Acad Radiol. 2010;17(7):934–40.

  19. Hansen E, Pilarski A, Plasner S, Cheaito MA, Epter M, Kazzi A. The osteopathic applicant. J Emerg Med. 2019;56(4):e65–9.

  20. Liteplo AS, Carmody K, Fields MJ, Liu RB, Lewiss RE. SonoGames: effect of an innovative competitive game on the education, perception, and use of point-of-care ultrasound. J Ultrasound Med. 2018;37(11):2491–6.

  21. Akhtar M. What is self-efficacy? Bandura's 4 sources of efficacy beliefs. Posit Psychol UK. 2008.

  22. Aldemir T, Celik B, Kaplan G. A qualitative investigation of student perceptions of game elements in a gamified course. Comput Hum Behav. 2018;78:235–54.

  23. Caruso R, Pittella F, Zaghini F, Fida R, Sili A. Development and validation of the nursing profession self-efficacy scale. Int Nurs Rev. 2016;63(3):455–64.

  24. Lewiss RE, Hayden GE, Murray A, Liu YT, Panebianco N, Liteplo AS. SonoGames: an innovative approach to emergency medicine resident ultrasound education. J Ultrasound Med. 2014;33(10):1843–9.

  25. Lobo V, Stromberg AQ, Rosston P. The sound games: introducing gamification into Stanford's orientation on emergency ultrasound. Cureus. 2017. https://doi.org/10.7759/cureus.1699.

  26. Chen W-L, Hsu C-P, Wu P-H, Chen J-H, Huang C-C, Chung J-Y. Comprehensive residency-based point-of-care ultrasound training program increases ultrasound utilization in the emergency department. Medicine. 2020. https://doi.org/10.1097/MD.0000000000024644.

Acknowledgements

We acknowledge the faculty of the Radiology department for their dedicated time, engagement, and delivery of informative sessions. We also thank the staff of CIME for their support throughout this project, including technical support, logistics, stationery, media coverage, and the hard work and effort needed to organize and execute SG successfully. CIME is the most advanced healthcare teaching and learning facility in Pakistan and plays a major role in innovating medical education at the undergraduate and postgraduate levels.

Funding

No funding was received.

Author information

Authors and Affiliations

Authors

Contributions

The workshop was conducted at CIME with the Department of Radiology. MFA facilitated the workshop with NN and initiated the research. MFA reviewed the literature along with FK, and data were gathered by NNM and GN. FK analyzed and computed the results. All authors wrote different sections of the manuscript and edited it for submission. NN and CD critically reviewed the article, provided intellectual guidance, and approved the final manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Farah Khalid.

Ethics declarations

Ethics approval and consent to participate

The study conducted was an experimental retrospective study with approval from the Ethics Review Committee at The Aga Khan University. All participants gave written consent to participate in the study. The study was conducted in accordance with the ethical standards described in the 1964 Declaration of Helsinki and its later amendments.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Additional information, such as the preparation of SonoGames, the self-efficacy questionnaire used for the activity, and the data for analysis including charts and figures, is reported in the additional file.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Ali, M.F., Nadeem, N., Khalid, F. et al. SonoGames: sounds of the right kind introducing gamification into radiology training. BMC Res Notes 14, 341 (2021). https://doi.org/10.1186/s13104-021-05761-y
