Implementation of an objective structured clinical exam (OSCE) into orthopedic surgery residency training.
While the musculoskeletal (MSK) physical examination (PE) is an essential part of a patient encounter, we believe it is an underemphasized component of orthopedic residency education and that resident PE skills may be lacking. The purpose of this investigation was to (1) assess attitudes regarding PE teaching in orthopedic residencies today and (2) develop an MSK objective structured clinical examination (OSCE) to assess the MSK PE knowledge and skills of our orthopedic residents.
Design: Prospective, uncontrolled, observational study.
Setting: A major Midwestern tertiary referral center and academic medical center.
Participants: The orthopedic surgery residents in our program; 22 of 24 completed the OSCE.
Surveys showed that residents agreed that, although learning the PE is important, there is not enough time in clinic to observe and critique a resident examining a patient. For the 22 residents (postgraduate year [PGY] 2-5) who participated in the OSCE, the overall score was 66%. Scores were significantly better for the trauma scenario (78%; p < 0.05) than for the shoulder (67%), spine (64%), and knee (59%) encounters. The overall scores for each component of the OSCE were (1) history, 53%; (2) PE, 60%; (3) 5-question posttest, 64%; and (4) communication skills, 90%.
We have exposed a deficiency in the PE knowledge and skills of our residents. Clinic time alone may be insufficient to both teach and learn the MSK PE. The use of an MSK OSCE, while novel in orthopedics, will allow more direct observation of our residents' MSK PE skills and also allow us to follow resident skills longitudinally through their training. We hope that our efforts will encourage other programs to assess their PE curricula and perhaps prompt change.
Griesser MJ, Beran MC, Flanigan DC, Quackenbush M, Van Hoff C, Bishop JY
Using Objective Structured Clinical Examinations to Assess Intern Orthopaedic Physical Examination Skills: A Multimodal Didactic Comparison.
Patient care is 1 of the 6 core competencies defined by the Accreditation Council for Graduate Medical Education (ACGME). The physical examination (PE) is a fundamental skill for evaluating patients and making an accurate diagnosis. The purpose of this study was to investigate 3 different methods of teaching PE skills and to assess the ability to perform a complete PE in a simulated patient encounter.
Design: Prospective, uncontrolled, observational study.
Setting: Northeastern academic medical center.
Participants: A total of 32 orthopedic surgery residents participated and were divided into 3 didactic groups: Group 1 (n = 12), live interactive lectures, demonstrations on standardized patients, and textbook reading; Group 2 (n = 11), video recordings of the lectures given to Group 1 and textbook reading alone; Group 3 (n = 9), 90-minute modules taught by residents to interns in a near-peer format and textbook reading.
The overall score for the objective structured clinical examinations across the combined groups was 66%. There was a trend toward more complete PEs in Group 1, taught via live lectures and demonstrations, compared with Group 2, which relied on video recordings. Near-peer-taught residents from Group 3 significantly outperformed Group 2 residents overall (p = 0.02) and showed a trend toward outperforming Group 1 residents as well, with significantly higher scores in the ankle (p = 0.02) and shoulder (p = 0.02) PE cases.
This study found that orthopedic interns taught musculoskeletal PE skills by near-peers outperformed the other groups overall. An overall score of 66% for the combined didactic groups suggests a baseline deficit in first-year residents' musculoskeletal PE skills. The PE should continue to be taught and objectively assessed throughout residency to confirm that budding surgeons have mastered these fundamental skills before entering practice.
Phillips D, Pean CA, Allen K, Zuckerman J, Egol K
Boot cAMP: educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship.
Preparatory training for new trainees beginning residency has been used by a variety of programs across the country. To improve the clinical orientation process for our new postgraduate year (PGY)-1 residents, we developed an intensive preparatory training curriculum of cognitive and procedural skills training activities considered essential for early PGY-1 clinical management. We define our surgical PGY-1 Boot Camp as preparatory simulation-based training implemented at the onset of internship to introduce the skills necessary for basic surgical patient problem assessment and management. This orientation process includes exposure to simulated patient care encounters and technical skills training essential to new resident education. We report the educational results of 4 successive years of Boot Camp training. Results were analyzed to determine whether performance evidenced at the onset of training was predictive of later educational outcomes.
Learners were PGY-1 residents, in both categorical and preliminary positions, at our medium-sized surgical residency program. Over a 4-year period, from July 2007 to July 2010, all 30 PGY-1 residents starting surgical residency at our institution underwent specific preparatory didactic and skills training over a 9-week period. This consisted of mandatory weekly 1-hour and 3-hour sessions in the Simulation Center, representing a 4-fold increase in simulation laboratory training time compared with the remainder of the year. Training occurred in 8 procedural skills areas (instrument use, knot-tying, suturing, laparoscopic skills, airway management, cardiopulmonary resuscitation, central venous catheter insertion, and chest tube insertion) and in simulated patient care (shock, surgical emergencies, and respiratory, cardiac, and trauma management) using a variety of high- and low-tech simulation platforms. Faculty and senior residents served as instructors. All educational activities were structured to include preparatory materials, pretraining briefing sessions, and immediate in-training or post-training review and debriefing. Baseline cognitive skills were assessed with written tests on basic patient management. Post-Boot Camp tests similarly evaluated cognitive skills. Technical skills were assessed using a variety of task-specific instruments and expressed as a mean score for all activities for each resident. All measurements were expressed as a percentage (%) of the best possible score. Cognitive and technical performance in Boot Camp was compared with subsequent clinical and core curriculum evaluations, including weekly quiz scores, annual American Board of Surgery In-Training Examination (ABSITE) scores, program in-training evaluations (New Innovations, Uniontown, Ohio), and operative assessment instrument scores (OP-Rate, Baystate Medical Center, Springfield, Massachusetts) for the remainder of the PGY-1 year.
Performance data were available for 30 PGY-1 residents over 4 years. Baseline cognitive skills were lower for the first year of Boot Camp than for subsequent years (71 ± 13 vs 83 ± 9, 84 ± 11, and 86 ± 6, respectively; p = 0.028, analysis of variance [ANOVA]). Performance improved between pretests and final testing (81 ± 11 vs 89 ± 7; p < 0.001, paired t test). There was a statistically significant correlation between Boot Camp final cognitive test results and American Board of Surgery In-Training Examination scores (p = 0.01; n = 22), but the correlations with weekly curriculum quiz scores (p = 0.055; n = 22) and New Innovations cognitive assessments (p = 0.09; n = 25) did not reach significance. Statistically significant correlations were also noted between Boot Camp mean overall skills scores and both New Innovations technical skills assessments (p = 0.002; n = 25) and OP-Rate assessments (p = 0.01; n = 12).
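For readers less familiar with these tests, the comparisons reported above follow a standard pattern: a one-way ANOVA across the 4 annual cohorts, a paired t test on pretest versus posttest scores for the same residents, and correlations between Boot Camp scores and later evaluations. The sketch below is a minimal, illustrative reconstruction of that analysis pipeline in Python with scipy; the score arrays are hypothetical placeholders (not the study data), and the use of Pearson correlation is an assumption, since the abstract does not specify which correlation statistic was used.

```python
# Illustrative sketch of the statistical comparisons described above.
# All score arrays are hypothetical placeholders, NOT the study data.
from scipy import stats

# Baseline cognitive scores (% of best possible) for 4 annual cohorts
cohort_2007 = [55, 68, 72, 80, 77, 74, 71]
cohort_2008 = [78, 85, 90, 82, 88, 76, 83]
cohort_2009 = [80, 72, 95, 88, 86, 83, 84]
cohort_2010 = [82, 88, 90, 79, 92, 85, 86]

# One-way ANOVA: do baseline scores differ across cohort years?
f_stat, p_anova = stats.f_oneway(cohort_2007, cohort_2008, cohort_2009, cohort_2010)

# Paired t test: pretest vs final (post-Boot Camp) cognitive scores per resident
pretest = [70, 75, 82, 88, 79, 85, 77, 90]
posttest = [80, 84, 90, 93, 86, 92, 85, 95]
t_stat, p_paired = stats.ttest_rel(pretest, posttest)

# Correlation (Pearson assumed) between Boot Camp final test and ABSITE scores
bootcamp_final = [80, 84, 90, 93, 86, 92, 85, 95]
absite = [35, 42, 60, 75, 50, 68, 48, 80]
r, p_corr = stats.pearsonr(bootcamp_final, absite)

print(f"ANOVA across cohorts:   F = {f_stat:.2f}, p = {p_anova:.3f}")
print(f"Paired pre/post t test: t = {t_stat:.2f}, p = {p_paired:.4f}")
print(f"Boot Camp vs ABSITE:    r = {r:.2f}, p = {p_corr:.3f}")
```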
Individual simulation-based Boot Camp performance scores for cognitive and procedural skills assessments in PGY-1 residents correlate with subjective and objective clinical performance evaluations. This concurrent correlation with multiple traditional evaluation methods used to express competency in our residency program supports the use of Boot Camp performance measures as needs assessment tools as well as adjuncts to cumulative resident evaluation data.
Fernandez GL, Page DW, Coe NP, Lee PC, Patterson LA, Skylizard L, St Louis M, Amaral MH, Wait RB, Seymour NE
Examination to assess the clinical examination and documentation of spine pathology among orthopedic residents.
The Accreditation Council for Graduate Medical Education (ACGME) guidelines require residency programs to teach and evaluate residents in six overarching "core competencies" and to document progress through educational milestones. To assess orthopedic interns' skills in performing a history and physical examination and documenting the encounter for a standardized patient with spinal stenosis, an objective structured clinical examination (OSCE) was conducted for 13 orthopedic interns following a 1-month boot camp that included a curriculum in communication skills and in the history and physical examination. Interns were objectively scored on their performance of the physical examination, their communication skills, the completeness and accuracy of their electronic medical record (EMR) documentation, and the diagnostic conclusions gleaned from the patient encounter.
The purpose of this study was to meaningfully assess the clinical skills of orthopedic post-graduate year (PGY)-1 interns. The findings can be used to develop a standardized curriculum for documenting patient encounters and to highlight common areas of weakness among orthopedic interns with regard to the spine history and physical examination and to complete, accurate clinical documentation.
Setting: A major orthopedic specialty hospital and academic medical center.
Thirteen PGY-1 orthopedic residents participated in the OSCE with the same standardized patient presenting with symptoms and radiographs consistent with spinal stenosis. Videos of the encounters were independently viewed and objectively evaluated by one investigator in the study. This evaluation focused on the completeness of the history and the performance and completeness of the physical examination. The standardized patient evaluated the communication skills of each intern with a separate objective evaluation. Interns completed these same scoring guides to evaluate their own performance in history taking, physical examination, and communication skills. The interns' documentation in the EMR was then scored for completeness, internal consistency, and inaccuracies.
The independent review revealed objective deficits in the orthopedic interns' history taking and physical examination, and it highlighted trends of inaccurate and incomplete documentation in the corresponding medical record. Communication skills with the patient did not meet expectations. Further, interns tended to overscore themselves, especially with regard to their performance on the physical examination (p < .0005). Inconsistencies, omissions, and inaccuracies were common in the corresponding medical notes when compared with the events of the patient encounter. Nine of the 13 interns (69.2%) documented at least one finding that was not assessed or tested in the clinical encounter, and four of the 13 interns (30.8%) included inaccuracies in the medical record that contradicted the information collected at the time of the encounter.
The results of this study highlight significant shortcomings in the completeness of the interns' spine history and physical examination and in the accuracy and completeness of their EMR notes. The study provides a valuable exercise for evaluating residents in a multifaceted, multi-milestone manner that more accurately documents residents' clinical strengths and weaknesses. The study demonstrates that orthopedic residents require further instruction on the complexities of the spinal examination, and it points to a need for increased systemic support for improving resident documentation through comprehensive education and evaluation modules.
Haglin JM, Zeller JL, Egol KA, Phillips DP