Gaming used as an informal instructional technique: effects on learner knowledge and satisfaction.
Jeopardy!, Concentration, quiz bowls, and other gaming formats have been incorporated into health sciences classroom and online education. However, there is limited information about the impact of these strategies on learner engagement and outcomes. To address this gap, we hypothesized that gaming would lead to a significant increase in retained short- and long-term medical knowledge with high learner session satisfaction.
To teach geriatrics using the Jeopardy! game show model as the primary instructional technique, 8 PGY2 General Surgery residents were divided into 2 teams that competed to provide the "question" to each stated "answer" during 5 protected block curriculum units (1 hour per unit). A surgical faculty facilitator acted as the game host, providing feedback and brief elaboration of quiz answers/questions as necessary. Each quiz session contained two 25-question rounds. Paper-based pretests and posttests contained questions on all core curriculum unit topics, with 5 geriatric gaming questions per test. Residents completed the pretests 3 days before the session and a delayed posttest of geriatric topics an average of 9.2 weeks (range, 5-12 weeks) after the instructional session. The cumulative average percent correct was compared between pretests and posttests using the Student t test. Residents completed Likert-scale session evaluation forms after each gaming session and each protected curriculum block to assess educational value.
A total of 25 identical geriatric preunit and delayed postunit questions were administered across the instructional sessions. The combined pretest average score across all 8 residents was 51.5% for geriatric topics compared with 59.5% (p = 0.12) for all other unit topics. Delayed posttest geriatric scores demonstrated a statistically significant increase in retained medical knowledge, with an average of 82.6% (p = 0.02). The difference between delayed posttest geriatric scores and posttest scores for all other unit topics was not significant. Residents reported a high level of satisfaction with the gaming sessions: the average session content rating was 4.9 compared with the overall block content rating of 4.6 (scale, 1-5, with 5 = Outstanding).
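The pretest-versus-delayed-posttest comparison above rests on a Student t statistic over per-resident percent-correct scores. A minimal sketch of the paired form of that test follows; the abstract does not state whether a paired or unpaired test was used, and the scores below are hypothetical (only cohort averages are reported), chosen so the pretest mean matches the reported 51.5%.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic and degrees of freedom for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)   # standard error of the mean difference
    return mean(diffs) / se, n - 1

# Hypothetical percent-correct geriatric scores for 8 residents (illustrative only;
# the study reports cohort averages, not individual raw data).
pre  = [40, 60, 50, 45, 55, 60, 50, 52]   # mean 51.5, matching the reported pretest average
post = [80, 90, 85, 75, 80, 85, 80, 86]

t, df = paired_t(pre, post)
print(f"t = {t:.2f} on {df} df")
```

The p value would then be read from the t distribution with n - 1 degrees of freedom; the stdlib has no t-distribution CDF, which is why only the statistic is computed here.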
Quiz-type, competitive gaming sessions can be used as a primary instructional technique, leading to significant improvements on delayed posttests of medical knowledge and high resident satisfaction with their educational value. Knowledge gains appear to be sustained, given the interval between the intervention and the delayed testing at which gains were recorded.
Webb TP, Simpson D, Denson S, Duthie E Jr
Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship.
Preparatory training for new trainees beginning residency has been used by a variety of programs across the country. To improve the clinical orientation process for our new postgraduate year (PGY)-1 residents, we developed an intensive preparatory training curriculum of cognitive and procedural skills training activities considered essential for early PGY-1 clinical management. We define our surgical PGY-1 Boot Camp as preparatory simulation-based training implemented at the onset of internship to introduce the skills necessary for basic surgical patient problem assessment and management. This orientation process includes exposure to simulated patient care encounters and technical skills training essential to new resident education. We report educational results of 4 successive years of Boot Camp training. Results were analyzed to determine whether performance at the onset of training was predictive of later educational outcomes.
Learners were PGY-1 residents, in both categorical and preliminary positions, at our medium-sized surgical residency program. Over a 4-year period, from July 2007 to July 2010, all 30 PGY-1 residents starting surgical residency at our institution underwent specific preparatory didactic and skills training over a 9-week period. This consisted of mandatory weekly 1-hour and 3-hour sessions in the Simulation Center, representing a 4-fold increase in simulation laboratory training time compared with the remainder of the year. Training occurred in 8 procedural skills areas (instrument use, knot-tying, suturing, laparoscopic skills, airway management, cardiopulmonary resuscitation, central venous catheter, and chest tube insertion) and in simulated patient care (shock, surgical emergencies, and respiratory, cardiac, and trauma management) using a variety of high- and low-tech simulation platforms. Faculty and senior residents served as instructors. All educational activities were structured to include preparatory materials, pretraining briefing sessions, and immediate in-training or post-training review and debriefing. Baseline cognitive skills were assessed with written tests on basic patient management. Post-Boot Camp tests similarly evaluated cognitive skills. Technical skills were assessed using a variety of task-specific instruments and expressed as a mean score for all activities for each resident. All measurements were expressed as a percentage of the best possible score. Cognitive and technical performance in Boot Camp was compared with subsequent clinical and core curriculum evaluations, including weekly quiz scores, annual American Board of Surgery In-Training Examination (ABSITE) scores, program in-training evaluations (New Innovations, Uniontown, Ohio), and operative assessment instrument scores (OP-Rate, Baystate Medical Center, Springfield, Massachusetts) for the remainder of the PGY-1 year.
Performance data were available for 30 PGY-1 residents over 4 years. Baseline cognitive skills were lower for the first year of Boot Camp than for subsequent years (71 ± 13, 83 ± 9, 84 ± 11, and 86 ± 6, respectively; p = 0.028, analysis of variance [ANOVA]). Performance improved between pretests and final testing (81 ± 11 vs 89 ± 7; p < 0.001, paired t test). There was a statistically significant correlation between Boot Camp final cognitive test results and American Board of Surgery In-Training Examination scores (p = 0.01; n = 22); correlations with weekly curriculum quiz scores (p = 0.055; n = 22) and New Innovations cognitive assessments (p = 0.09; n = 25) did not reach significance. Statistically significant correlation was also noted between Boot Camp mean overall skills scores and both New Innovations technical skills assessments (p = 0.002; n = 25) and OP-Rate assessments (p = 0.01; n = 12).
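The correlations reported above are simple bivariate associations between Boot Camp scores and later evaluation scores. A minimal Pearson r sketch is shown below on hypothetical paired scores; the pairing of Boot Camp cognitive scores with ABSITE scores mirrors the comparison named in the abstract, but the values are invented for illustration, since the abstract reports only p values and sample sizes, not r or raw data.

```python
import math
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired scores (Boot Camp final cognitive test vs ABSITE),
# invented for illustration only.
boot_camp = [75, 80, 82, 85, 88, 90, 92, 95]
absite    = [30, 35, 45, 40, 55, 60, 65, 70]

r = pearson_r(boot_camp, absite)
print(f"r = {r:.2f}")
```

Significance of r would again be tested against a t distribution with n - 2 degrees of freedom, which is how the p values in the abstract would be obtained.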
Individual simulation-based Boot Camp performance scores for cognitive and procedural skills assessments in PGY-1 residents correlate with subjective and objective clinical performance evaluations. This concurrent correlation with multiple traditional evaluation methods used to express competency in our residency program supports the use of Boot Camp performance measures as needs assessment tools as well as adjuncts to cumulative resident evaluation data.
Fernandez GL, Page DW, Coe NP, Lee PC, Patterson LA, Skylizard L, St Louis M, Amaral MH, Wait RB, Seymour NE