Author(s)
Harrison M. Thompson, MD
Gabriel G. Sobczak, MD
Sarah Toti, MD
Aditya Bhatt
Brett T. Comer, MD
Elisa A. Illing, MD
Affiliation(s)
Indiana University School of Medicine; University of Kentucky School of Medicine
Abstract:
Educational Objective: At the conclusion of this presentation, participants should be able to articulate how a standardized didactic program can affect residents' perspectives on achieving educational outcomes.
Objectives: This study aimed to assess resident perceptions and evaluate multiple domains of learner experience during implementation of an otolaryngology core curriculum (OCC).
Study Design: Mixed methods prospective cohort study.
Methods: A 28-item questionnaire covering curriculum objectives, content, implementation, and learner evaluation was administered to 17 residents across two institutions. Pre- and post-OCC survey results were compared with Student's two-sample t-test; significance was set at α = 0.05. Focus group discussions were held within 3 months of OCC implementation and again at 6-9 months post-implementation. Discussions were transcribed and qualitatively analyzed using inductive and deductive coding followed by thematic analysis.
Results: Survey results indicated that curriculum objectives were clearer for learners with the OCC than with existing didactic programs. Content was more relevant, better organized, and delivered more effectively, and the OCC evaluation methodology was favored by learners. Didactic duration, timing, and instructor communication improved significantly after OCC implementation. Achievement of learning objectives and availability of content relevant to learners at multiple levels did not change significantly. Thematic analysis identified four major themes in learner experience and preferences regarding didactic programs: 1) learner-directed education, 2) expert oversight, 3) program structure, and 4) practice-applicable learning.
Conclusions: In this small cohort, OCC implementation generally yielded positive educational outcomes from the learner's perspective and allowed modification to fit each program's and learner's needs. Further evaluation using question-bank or national examination scores would strengthen analysis of implementation outcomes.