E-learning in graduate medical education: survey of residency program directors

Abstract

Background

E-learning—the use of Internet technologies to enhance knowledge and performance—has become a widely accepted instructional approach, yet little is known about its current use in postgraduate medical education. We aimed to determine the utilization of e-learning by United States internal medicine residency programs, program director (PD) perceptions of e-learning, and associations between e-learning use and residency program characteristics.

Methods

We conducted a national survey in collaboration with the Association of Program Directors in Internal Medicine of all United States internal medicine residency programs.

Results

Of the 368 PDs, 214 (58.2%) completed the e-learning survey. Use of synchronous e-learning at least sometimes, somewhat often, or very often was reported by 85 (39.7%); 153 programs (71.5%) use asynchronous e-learning at least sometimes, somewhat often, or very often. Most programs (168; 79%) do not have a budget to integrate e-learning. Mean (SD) scores for the PD perceptions of e-learning ranged from 3.01 (0.94) to 3.86 (0.72) on a 5-point scale. The odds of synchronous e-learning use were higher in programs with a budget for its implementation (odds ratio, 3.0 [95% CI, 1.04–8.7]; P = .04).

Conclusions

Residency programs could be better resourced to integrate e-learning technologies. Asynchronous e-learning was used more than synchronous e-learning, which may reflect accommodation of busy resident schedules and duty-hour restrictions. PD perceptions of e-learning are relatively moderate; future research should determine whether PD reluctance to adopt e-learning stems from unawareness of the evidence, perceptions that e-learning is expensive, or judgments about its value versus effectiveness.

Background

Computer-based technologies have permeated postgraduate medical education [1,2,3,4]. Electronic learning, or e-learning—the use of Internet technologies to enhance knowledge and performance—is now a conventional instructional approach. E-learning has been shown to be as effective as other educational approaches for the acquisition of knowledge, skills, and behaviors [1, 5], although its effectiveness varies widely across courses [6, 7]. Potential advantages of e-learning include flexibility, learner control over learning activities, and data collection for assessment, course improvement, and adaptive instruction [4].

E-learning may be particularly useful for graduate medical education. Resident shift schedules and work-hour restrictions often interfere with attendance at daily core didactic lectures. One solution is to videotape lectures for later viewing; however, this approach does not address differences in learners’ needs, which vary with experience, interests, and learning pace. E-learning tools may address these practical shortcomings of traditional live lectures.

One uncertainty about e-learning is its true cost [8]. Although evidence suggests net savings compared with face-to-face education [9], other research indicates high development and maintenance costs for e-learning activities [10,11,12]. Although some have claimed a “Net Generation” is requesting greater use of technologies, research suggests no overwhelming demand for e-learning over traditional approaches [13,14,15].

The implementation of e-learning varies widely across residency programs. Surveys of emergency medicine [16, 17] and surgery [18] residents reveal that these trainees commonly use podcasts, online textbooks, and Internet searches. However, little is known about the determinants and frequency of e-learning utilization in graduate medical education. Program directors (PDs) are uniquely positioned to influence the utilization of e-learning in residency programs; yet, we are unaware of studies examining how PDs perceive e-learning. We conducted a national survey, in collaboration with the Association of Program Directors in Internal Medicine (APDIM), to answer the following questions:

  1. What is the utilization of synchronous (live, real-time, simultaneous) and asynchronous (virtual, with learners responsible for self-pacing and instruction) e-learning by internal medicine residency programs in the United States?

  2. How do PDs perceive e-learning with respect to the educational outcome levels of reaction, learning, behavior, and results [19]?

  3. What associations exist between e-learning use and residency program characteristics?

Methods

Study setting and participants

The APDIM administers an annual survey to United States residency programs. The 2015 survey was administered in August 2015 to 368 programs (92.9% of the 396 United States internal medicine residency programs). This anonymous survey was sent by the Mayo Clinic Survey Research Center as an emailed link, using Qualtrics software (Qualtrics LLC, Provo, UT), to each residency program director. The survey collected demographic data, six questions on perceptions of e-learning, and questions regarding e-learning utilization. Nonresponders were identified by the Survey Research Center, but their identities were not known to the study authors. The Mayo Clinic institutional review board approved this study (identification number: 08–007125).

Data collection

The APDIM survey identifies characteristics of the PD (age, gender, academic rank, specialty) and program (number of hospital beds and percentage of training positions filled by international medical graduates). To provide a common understanding, the survey instructions defined “e-learning” as use of Internet technologies to enhance knowledge and performance, “synchronous e-learning” as live and real-time, where all residents receive information simultaneously, and “asynchronous e-learning” as virtual and not simultaneous, where residents are responsible for pacing and self-instruction. PDs were queried on the frequency of using synchronous and asynchronous teaching methods (scale: “never,” “very rarely,” “somewhat rarely,” “sometimes,” “somewhat often,” and “very often”). The e-learning survey collected information on types of electronic media used, resources for e-learning, provision of faculty development, and program e-learning budget. PDs indicated their perception of the effectiveness of e-learning for improving the outcomes of reaction, learning, behavior, and results. Survey responses were coupled with publicly available data including program type, region, American Board of Internal Medicine (ABIM) 2012–2014 3-year rolling board pass rates, number of Accreditation Council for Graduate Medical Education (ACGME)-approved training positions, and PD tenure [20,21,22,23].

Item content regarding PDs’ perceptions of e-learning was derived from literature on educational outcomes [19, 24,25,26]. Six items were created to reflect Kirkpatrick’s [19] outcome levels of reaction, learning (further subdivided using Bloom’s [24] taxonomy: cognitive/knowledge, psychomotor/skills, and affective/attitudes), behavior, and results (Additional file 1: Table S1). These items were structured on 5-point scales (1, strongly disagree; 5, strongly agree).

Data analysis

To assess representativeness of the programs sampled, characteristics of survey responders were compared with survey nonresponders for 5 publicly available variables using the Fisher exact test or Welch t test.
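
To make the comparison concrete, here is a minimal sketch in Python; the published analysis used SAS 9.4, and all counts and values below are hypothetical placeholders rather than survey data.

```python
# Sketch of the responder vs. nonresponder comparison (hypothetical data).
# Categorical variables (e.g., program type) use the Fisher exact test;
# continuous ones (e.g., approved positions) use the Welch t test, which
# does not assume equal variances between groups.
from scipy import stats

# 2x2 contingency table: rows = responders/nonresponders,
# columns = university-based yes/no (hypothetical counts)
table = [[90, 124], [60, 94]]
odds_ratio, p_cat = stats.fisher_exact(table)

# Hypothetical ACGME-approved position counts for each group
responders = [45, 60, 38, 72, 55]
nonresponders = [40, 52, 61, 48, 39]
t_stat, p_cont = stats.ttest_ind(responders, nonresponders, equal_var=False)

print(f"Fisher exact P = {p_cat:.3f}; Welch t test P = {p_cont:.3f}")
```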

Factor analysis was performed on PDs’ perceptions of e-learning items. Factors were extracted using minimal proportion criteria and confirmed with the scree plot. Item loadings of 0.50 or higher were retained. Internal consistency reliability was calculated using the Cronbach α, with α > 0.7 considered acceptable [27].
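
The Cronbach α has a simple closed form and can be verified directly. A minimal sketch, assuming hypothetical 5-point item ratings in place of the actual survey responses:

```python
# Cronbach alpha for a respondents x items matrix of Likert scores.
# The ratings below are hypothetical, not the survey data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses from 6 PDs to the 6 perception items
ratings = np.array([
    [4, 4, 3, 4, 5, 4],
    [3, 3, 2, 3, 3, 3],
    [5, 4, 4, 5, 4, 5],
    [2, 3, 2, 2, 3, 2],
    [4, 5, 4, 4, 4, 4],
    [3, 3, 3, 4, 3, 3],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # alpha > 0.7 considered acceptable
```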

Associations between PD characteristics and PD perceptions of e-learning scores were assessed using a multiple analysis of variance (ANOVA) model. Military-based programs were excluded from the associations analysis because of few respondents (4 of 214) but were included in other analyses. The numeric-valued variables of age and tenure were assessed separately for a possible linear relationship with PD perception of e-learning using simple linear regression models and were then dichotomized using their medians for inclusion in the ANOVA model. A multiple logistic regression model was used to generate odds ratios (ORs) and test associations between program characteristics and the regular use (“somewhat often” or “very often”) of synchronous and asynchronous e-learning. For the continuous predictors of ACGME-approved positions, ABIM 2012–2014 3-year rolling pass rate, percentage of positions filled by international medical graduates, number of hospital beds, and mean PD perception of e-learning score, the adequacy of bivariate models assuming linearity of the log odds was checked using Hosmer-Lemeshow goodness-of-fit tests. The threshold for statistical significance was set at P < .05. Statistical analyses were conducted using SAS version 9.4 (SAS Institute Inc).
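
As an illustration of how such a model yields odds ratios, the sketch below fits a logistic regression on simulated data; the variable names are hypothetical stand-ins, and the published model (fit in SAS) included additional covariates.

```python
# Logistic regression sketch for "regular" synchronous e-learning use.
# Exponentiating the coefficients and their confidence limits gives
# odds ratios with 95% CIs, as reported in the Results.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "has_budget": rng.integers(0, 2, n),   # program e-learning budget (0/1)
    "img_pct": rng.uniform(0, 100, n),     # % positions filled by IMGs
    "uses_sync": rng.integers(0, 2, n),    # regular synchronous use (0/1)
})

X = sm.add_constant(df[["has_budget", "img_pct"]])
model = sm.Logit(df["uses_sync"], X).fit(disp=0)

ors = np.exp(model.params)                                # odds ratios
ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([ors.rename("OR"), ci], axis=1))
```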

Results

E-learning use by US internal medicine residency programs

Among the 368 internal medicine residency program PDs surveyed, 227 (61.7%) responded to the APDIM survey and 214 (58.2%) completed the e-learning section. There were no significant differences in publicly available variables between the responding and nonresponding programs (Table 1). Specific characteristics for responding programs are listed in Table 1.

Table 1 Characteristics of responders and nonresponders to the 2015 Association of Program Directors in Internal Medicine national survey (N = 368)

Among the 214 programs with PDs who responded, 85 (39.7%) use synchronous e-learning at least sometimes, somewhat often, or very often (Fig. 1). In contrast to synchronous e-learning, more programs (153; 71.5%) use asynchronous e-learning at least sometimes, somewhat often, or very often. The most commonly reported e-learning approaches were locally developed PowerPoint slide shows with narration (147; 68.7%) and online modules from professional organizations (144; 67.3%).

Fig. 1 Use of synchronous or asynchronous electronic learning (e-learning) by internal medicine residency programs (N = 214)

Regarding the resources available to support e-learning, 97 programs (45%) do not provide mobile devices to their residents (Table 2). The devices provided by programs are shown in Table 2. Most programs (119; 56%) do not have faculty development for e-learning, and many PDs (64; 30%) reported that their program’s faculty development for e-learning is insufficient. Most programs (168; 79%) do not have a budget to integrate e-learning into their educational curricula.

Table 2 Resources for electronic learning implementation (N = 214 responders)

The electronic tools or approaches used to engage residents in more active learning included audience response via clickers (146; 68.2%), audience response via text messaging (34; 15.9%), and interactive whiteboard technology (18; 8.4%). Electronic tools supported various assessment and tracking activities, including procedure logs (127; 59.3%), medical knowledge examinations (117; 54.7%), clinical performance assessments (108; 50.5%), and direct observation assessments (94; 43.9%).

PD perceptions of e-learning effectiveness

PDs’ perceptions of the effectiveness of e-learning in promoting the attainment of various educational outcomes are shown in Additional file 1: Table S1. Factor analysis showed a unidimensional model (factor loadings ranging from 0.51 to 0.75) and overall internal consistency reliability (Cronbach α) of 0.82. Mean (SD) scores ranged from 3.01 (0.94) to 3.86 (0.72) on a 5-point scale, indicating slightly more positive than neutral feelings (Additional file 1: Table S1). The item “Electronic learning is useful to teach interpersonal skills” was rated lowest by PDs.

Using multiple ANOVA to adjust for all 5 PD characteristics simultaneously, overall PD perception of e-learning scores were higher among women than men (mean [SD], 3.59 [0.48] vs 3.36 [0.55]; P = .003; Table 3).

Table 3 Associations between PD perceptions of electronic learning and PD characteristics (N = 214)

Associations between e-learning use and program characteristics

University-based programs were more likely than community-based, university-affiliated programs to use synchronous e-learning (OR, 6.8 [95% CI, 1.9–24.7]; P = .01) (Table 4). Both community-based (OR, 3.96 [95% CI, 1.12–14.08]) and university-based (OR, 3.3 [95% CI, 1.3–9.0]; P = .01) programs were more likely to use asynchronous e-learning than community-based programs with university affiliation. The odds of asynchronous e-learning use were higher in programs in the Midwest than in the Northeast region (OR, 3.3 [95% CI, 1.3–8.1]; P = .01). The odds of synchronous e-learning use were higher in programs with a budget for implementation (OR, 3.00 [95% CI, 1.04–8.68]; P = .04) and in programs with a higher percentage of international medical graduates (OR, 1.018 per percentage point [95% CI, 1.001–1.035]; P = .04). The PD’s perceived effectiveness score was associated with use of asynchronous e-learning (OR, 3.78 [95% CI, 1.80–7.96]; P < .001).

Table 4 Odds of using electronic learning by internal medicine residency programs
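
To put the odds ratio for international medical graduates on a more intuitive scale: assuming the reported OR of 1.018 is per 1-percentage-point increase, and because the model is linear in the log odds, the odds ratio compounds multiplicatively across units. The arithmetic below is our own illustration, not a result reported in the study.

```latex
\mathrm{OR}_{+10\ \text{points}} = \left(1.018\right)^{10} = e^{10\,\ln 1.018} \approx 1.20
```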

Discussion

Our survey of US internal medicine residency PDs revealed that most programs use e-learning, although more programs use asynchronous than synchronous e-learning. Utilization of locally developed and externally developed e-learning resources is similar. Approximately half the programs provide residents with mobile devices. Most programs are underresourced for e-learning integration, and having a budget was associated with higher odds of e-learning use. PD perceptions of e-learning are lowest for teaching interpersonal skills. E-learning was more frequently used by programs that were university based, located in the Midwest, and led by PDs with more positive perceptions of e-learning.

We identified widespread use of e-learning (especially asynchronous) among US internal medicine residency programs. These findings are consistent with research in residency education showing prevalent e-learning adoption among surgical and emergency medicine residents [16,17,18] and identifying online textbooks as the most common patient care resource among radiology residents. Among US medical schools, educational software utilization has increased since 1998 [28], which may reflect the growing popularity of portable electronic devices. The current study provides novel information about US internal medicine residency programs, including detail regarding the types of e-learning used, e-learning resources, and associations of program and PD characteristics with specific elements of e-learning. Future research could explore utilization of e-learning techniques more broadly, including the interactivity of the delivery format.

PDs perceived that e-learning is better for teaching medical knowledge than interpersonal skills. According to a meta-analysis, Internet-based curricula in the health professions outperformed no intervention for teaching knowledge, behaviors, and patient care outcomes [5]. Another meta-analysis concluded that virtual patient interventions, versus no intervention, positively affected knowledge, clinical reasoning, and skill outcomes [29]. PDs may perceive that e-learning provides a poor approximation of face-to-face interactions with patients and colleagues. However, research suggests that e-learning curricula may have some utility in teaching and reinforcing patient care outcomes [5, 29].

Female PDs had more favorable perceptions of e-learning than male PDs, and programs were more likely to use e-learning if their PDs perceived it more favorably. This finding is consistent with previous research showing that female internal medicine PDs had more favorable perceptions of flipped classrooms, which often use e-learning [30]. Future studies should further address gender differences in e-learning use in graduate medical education.

This study has limitations. Although the use of e-learning by the nonresponding programs is unknown, they did not differ significantly from responding programs on publicly available variables. The study focused on internal medicine residency programs, so the findings may not generalize to other specialties. All outcomes reflect PD perceptions of e-learning and recollection of facts about their programs, which are potential sources of error. Because only program directors were surveyed, resident and chief resident perceptions of e-learning remain unknown. Questions about e-learning at the bedside or through an electronic health record may have strengthened the study. Strengths of this study include the use of a professional survey research center, a national sample, and a relatively high survey response rate.

Conclusions

This study has important implications. First, our survey suggests that modern US residency programs could be better resourced to integrate e-learning technologies. Second, the study provides insights into e-learning modalities used in a large cohort of residency training programs. Asynchronous e-learning was used more than synchronous, which may reflect busy resident schedules and duty-hour restrictions. Programs should consider portability and accessibility when implementing e-learning for residents. Third, programs use locally and externally developed e-learning resources with similar frequency. Professional societies and academic institutions involved with residency training should consider developing e-learning content that is widely available, which could reduce the cost of e-learning and allow programs with limited resources to participate. Fourth, in addition to online modules, e-learning includes support for live teaching activities. Therefore, programs should look beyond “modules” and “videos” and consider all possible e-learning applications. Finally, PD perceptions of e-learning are relatively moderate, despite evidence showing that e-learning positively affects knowledge acquisition and is approximately as effective as textbooks or lectures [5]. Future research should determine whether PD reluctance to adopt e-learning is based on unawareness of the evidence, perceptions that e-learning is expensive, or judgments about value versus effectiveness.

Abbreviations

ABIM:

American Board of Internal Medicine

ACGME:

Accreditation Council for Graduate Medical Education

ANOVA:

Analysis of variance

APDIM:

Association of Program Directors in Internal Medicine

OR:

Odds ratio

PD:

Program Director

References

  1. Means B, Toyama Y, Murphy R, Bakia M, Jones K. Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies. Washington, DC: US Department of Education; 2009. [revised 2010 Sept]

  2. Pappano L. The year of the MOOC. New York: New York Times; 2012. p. ED26.

  3. Instructional Technology Council. [Internet] 2015 distance learning survey results. Available from: http://www.itcnetwork.org. Cited 16 Sep 2016.

  4. Cook DA, Triola MM. What is the role of e-learning? Looking past the hype. Med Educ. 2014;48(9):930–7.

  5. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA. 2008;300(10):1181–96.

  6. Cook DA, Levinson AJ, Garside S. Time and learning efficiency in internet-based learning: a systematic review and meta-analysis. Adv Health Sci Educ Theory Pract. 2010;15(5):755–70.

  7. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(5):909–22.

  8. Sandars J. Cost-effective e-learning in medical education. In: Walsh K, editor. Cost effectiveness in medical education. Oxon: Radcliffe; 2010. p. 40–6.

  9. Maloney S, Nicklen P, Rivers G, Foo J, Ooi YY, Reeves S, et al. A cost-effectiveness analysis of blended versus face-to-face delivery of evidence-based medicine to medical students. J Med Internet Res. 2015;17(7):e182.

  10. Cook DA. The value of online learning and MRI: finding a niche for expensive technologies. Med Teach. 2014;36(11):965–72.

  11. Stotzer RL, Fujikawa K, Sur J, Arnsberger P. Cost analysis of a distance education MSW program. J Teach Soc Work. 2013;33:357–68.

  12. Delgaty L. A critical examination of the time and workload involved in the design and delivery of an e-module in postgraduate clinical education. Med Teach. 2013;35(5):e1173–80.

  13. Bennett S, Maton K. Beyond the “digital natives” debate: towards a more nuanced understanding of students’ technology experiences. J Comput Assist Learn. 2010;26(5):321–31.

  14. Jones C, Shao B. The net generation and digital natives: implications for higher education. York: Higher Education Academy; 2011.

  15. Oh E, Reeves TC. Generational differences and the integration of technology in learning, instruction, and performance. In: Spector JM, Merrill MD, Elen J, Bishop MJ, editors. Handbook of research on educational communications and technology. 4th ed. New York: Springer; 2014. p. 819–28.

  16. Mallin M, Schlein S, Doctor S, Stroud S, Dawson M, Fix M. A survey of the current utilization of asynchronous education among emergency medicine residents in the United States. Acad Med. 2014;89(4):598–601.

  17. Purdy E, Thoma B, Bednarczyk J, Migneault D, Sherbino J. The use of free online educational resources by Canadian emergency medicine residents and program directors. CJEM. 2015;17(2):101–6.

  18. Glass NE, Kulaylat AN, Zheng F, Glarner CE, Economopoulos KP, Hamed OH, et al. A national survey of educational resources utilized by the resident and associate society of the American College of surgeons membership. Am J Surg. 2015;209(1):59–64.

  19. Kirkpatrick D. Great ideas revisited: techniques for evaluating training programs: revisiting Kirkpatrick’s four-level model. Train Dev. 1996;50(1):54–9.

  20. Census regions and divisions of the United States. [Internet] US Census Bureau. Available from: http://www2.census.gov/geo/pdfs/maps-data/maps/reference/us_regdiv.pdf. Cited 5 Dec 2016.

  21. Exam pass rates. [Internet] American Board of Internal Medicine; c2004–2016. Available from: http://www.abim.org/about/statistics-data/exam-pass-rates.aspx. Cited 5 Dec 2016.

  22. AMA residency & fellowship database. [Internet] American Medical Association; c1995–2016. Available from: https://www.ama-assn.org/life-career/search-ama-residency-fellowship-database. Cited 5 Dec 2016.

  23. Accreditation Council for Graduate Medical Education (ACGME)–Public. [Internet] ACGME; c2016. Available from: https://apps.acgme.org/ads/public. Cited 5 Dec 2016.

  24. Bloom BS. Taxonomy of educational objectives: the classification of educational goals: handbook 1: cognitive domain. New York: David McKay; 1971.

  25. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–96.

  26. Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ. 2002;36(9):800–4.

  27. DeVellis RF. Scale development: theory and applications. Newbury Park: Sage; 1991. p. 120.

  28. Kim S. The future of e-learning in medical education: current trend and future opportunity. J Educ Eval Health Prof. 2006;3:3.

  29. Cook DA, Erwin PJ, Triola MM. Computerized virtual patients in health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(10):1589–602.

  30. Wittich CM, Agrawal A, Wang AT, Halvorsen AJ, Mandrekar J, Chaudhry S, et al. Flipped classrooms in graduate medical education: a national survey of residency program directors. Acad Med. In press

Acknowledgments

We are grateful for the support of the Association of Program Directors in Internal Medicine, members of the Survey Committee, and the residency PDs who completed this survey.

Funding

The authors report no external funding source for this study.

Availability of data and materials

The datasets generated and/or analyzed during the current study are available on the Association of Program Directors in Internal Medicine website, http://www.im.org/page/surveys-and-data.

Author information

Authors and Affiliations

Authors

Contributions

All authors made substantial contributions to the design, data collection, analysis, drafting, and final approval of the manuscript. All authors were involved with revising the manuscript for critically important intellectual content. All authors gave final approval of the version to be published. Dr. Wittich has agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Corresponding author

Correspondence to Christopher M. Wittich.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed by the Mayo Clinic Institutional Review Board (Identification number: 08–007125).

Consent for publication

Participation in the APDIM survey includes a statement of intent to publish.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional file

Additional file 1: Table S1.

Program Director Perceptions of E-Learning Survey Items. (DOCX 12 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Cite this article

Wittich, C.M., Agrawal, A., Cook, D.A. et al. E-learning in graduate medical education: survey of residency program directors. BMC Med Educ 17, 114 (2017). https://doi.org/10.1186/s12909-017-0953-9
