Abstract
Background: Between 2011 and 2013, medical students at a large, tertiary academic hospital reported a lower-than-expected perception of direct observation and feedback during their third- and fourth-year clinical clerkships. The anesthesiology clerkship is a team-based care model that involves an anesthesiologist, resident or anesthetist, and student. This model allows for direct supervision of all patient interactions and procedures. Despite this structure, medical students reported an acceptable but lower-than-anticipated perception of direct observation and feedback taking place during a 2-week anesthesiology clerkship.
Methods: Interventions were proposed to improve student awareness of the supervision, teaching, and feedback taking place. A skills checklist for intravenous (IV) line placement that an anesthesia provider completed while observing the student was chosen as a meaningful intervention to improve the students' perception of observation and feedback. This checklist required direct observation of the IV line placement clinical skill, and the evaluator was directed to give oral feedback to the student. Students were surveyed regarding their perceptions of direct observation and feedback during a 4-year period, 2 years prior to and 2 years after implementation of the IV checklist.
Results: No statistically significant difference was noted between the preintervention and postintervention groups.
Conclusion: While formal observation of and feedback on IV placement did not change student perception, the results suggest that a more in-depth analysis of the “educational alliance” desired during an anesthesiology clerkship is warranted, especially as medical education continues to evolve.
INTRODUCTION
At a large academic center, medical students complete a 2-week required clerkship in anesthesiology during their third or fourth year of medical school. Emphasis is placed on preoperative workup, intravenous (IV) line placement, basics of airway management, pharmacology, and pain management. Students often work with a new anesthesia team each day because of the large number of providers, subspecialty exposure, and call schedule variability, making the development of an “educational alliance” challenging.1 On postclerkship surveys administered through the medical school, students were in agreement but not strong agreement that direct observation of and feedback on clinical skills were taking place in all clerkships, including the anesthesiology clerkship. While student ratings of the anesthesiology clerkship were acceptable overall, we noted a slight 2-year decline in “strong agreement” responses that faculty observation and feedback delivery were taking place.
Observation and feedback are cornerstones of teaching clinical skills, but a disconnect exists between student perception of feedback and efforts made to improve the feedback process.2 Boehler et al found that students were less satisfied with specific feedback than with compliments, even though their performance was worse after compliments than after feedback.3 Quantifying the level of feedback needed to improve the perception of feedback is difficult,1,3 just as it is difficult to quantify the amount or quality of feedback needed to improve or harm performance.1 Medical education efforts involving the Objective Structured Assessment of Technical Skills (OSATS),4,5 the Mini-Clinical Evaluation Exercise (mini-CEX),6 and the Direct Observation of Procedural Skills (DOPS)7 have had some success in observing and assessing student performance of clinical or technical skills. However, the ideal feedback delivery vehicle remains an issue.1
During the anesthesiology clerkship, the educators' role includes teaching the Association of American Medical Colleges Core Entrustable Professional Activities of bag-mask ventilation and IV placement.8 The anesthesia team directly observes the performance of these core student skills daily in the operating room and during additional simulation sessions. Students participate in 2-hour simulation sessions for both airway management and IV placement on the first day of the rotation before patient contact. These simulations consist of repeated cycles of demonstration of the skill, observation of student performance, and feedback on performance. Despite the time spent teaching these activities, students were not in strong agreement that observation and feedback were taking place.
Most anesthesiology clerkships at academic centers have similarly structured clinical teaching environments and workflows. Therefore, our institution's experience regarding student perception of observation and feedback despite existing educational efforts is likely generalizable to other anesthesiology departments at large academic centers.1 Because of institution-wide initiatives to improve observation and feedback, coupled with a slight decrease in clerkship evaluations, we proactively intervened.
Multiple interventions were proposed. One proposal was to use an OSATS-style tool to grade an existing clinical simulation for managing bronchospasm, hypotension, and hypoglycemia.4 This simulation was an ungraded exercise that paired 2 to 4 students at a time, rather than assigning the tasks to an individual student. Changing the simulation to an OSATS evaluation would be difficult without additional investments in personnel and time. Another suggestion was to develop an Objective Structured Clinical Examination (OSCE). An OSCE would provide a formal grading exercise that could focus feedback on specific areas or skills. This type of exercise may be more appropriate for procedural skills than a simulation debriefing that tends toward generalizable concepts. However, an OSCE approach would also require a significant increase in time and resources for implementation but could be revisited in coordination with new medical school competency-based initiatives in the future.9 Another possible solution was to require daily feedback sessions with faculty, residents, or anesthetists and then to formalize the feedback on a paper evaluation. Multiple providers felt that this solution would be burdensome.
We favored a solution that would facilitate assessment at the bedside and delivery of feedback during or immediately after the skill or action was performed. This thinking led to the development of an IV placement checklist in the framework of the OSATS task-specific checklist but without the global rating portion.4,5 We hypothesized that student perception would improve with the implementation of a formal IV placement assessment during the anesthesiology clerkship because no formal skill assessment had previously existed.
METHODS
After institutional review board review and exemption, an objective, transparent, and simple checklist was designed to facilitate direct clinical observation of, and feedback to, medical students on the anesthesiology rotation (Figure). Given that the OSATS5 and mini-CEX6 are validated, reliable, clinical skill–based, procedure-focused tools in the medical education literature, we adapted a task-specific checklist because of its objectivity, low time burden, and acceptable reliability.10,11 Removing the global scoring from each skill made the checklist easier to use and required less training for evaluators.12,13 Students were instructed to ask an evaluator to complete the IV checklist during at least one IV placement attempt in any patient encounter.
Evaluators were faculty attending anesthesiologists, resident anesthesiologists, and anesthesiology assistants or certified registered nurse anesthetists. Prior to implementation of the checklist, directions for completing the tool were sent to all evaluators. Additionally, an in-service meeting was held to emphasize the importance of the checklist and to improve evaluators' familiarity with it. To ensure compliance with the activity, students were required to turn in at least one evaluation prior to taking the end-of-clerkship examination, but they could ask evaluators to complete the checklist multiple times if they desired additional observation and feedback.12
We presumed that this intervention would improve students' perception of observation and feedback during the clerkship and would also result in the delivery of timely evaluator feedback to the student. Such feedback is essential for commenting on the nuances of the student's performance, areas that a checklist cannot qualitatively assess.14
Students were surveyed at the completion of the anesthesiology clerkship through the medical school in the 2 years prior to and the 2 years after implementation of the IV checklist. The postclerkship surveys from the postimplementation group evaluated the impact of this intervention. Three questions investigated the students' perceptions: (1) This clerkship provided adequate opportunities to practice my patient care skills; (2) Clerkship faculty personally observed me performing core clinical skills during the clerkship; and (3) This clerkship provided helpful feedback about my performance. Survey responses were on a 1 to 7 scale, with 1 corresponding to “strong agreement” and 7 corresponding to “strong disagreement” with each statement.14
The preintervention group (students in 2011-2012 and 2012-2013) and the postintervention group (students in 2013-2014 and 2014-2015) were compared with analysis of variance using SPSS v.17 (IBM Corp.). P≤0.05 was considered significant. The study groups consisted of all medical students who took the anesthesiology clerkship from the 2011-2012 through 2014-2015 academic years and completed postclerkship surveys.
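For illustration only, the sketch below shows how such a two-group comparison of Likert-scale responses could be reproduced with a one-way analysis of variance; the study analysis was performed in SPSS v.17, and the response values here are hypothetical placeholders rather than study data.

# Illustrative sketch only; synthetic placeholder data, not the study dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical 1-7 Likert ratings (1 = strong agreement, 7 = strong disagreement)
# for one survey question; group sizes match the study (305 pre, 298 post).
pre = rng.integers(1, 8, size=305)
post = rng.integers(1, 8, size=298)

# One-way ANOVA comparing preintervention vs postintervention responses.
f_stat, p_value = stats.f_oneway(pre, post)
print(f"F = {f_stat:.2f}, P = {p_value:.3f}")  # P <= 0.05 considered significant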
RESULTS
The preintervention group consisted of 305 students, and the postintervention group consisted of 298 students. Comparison of the group prior to intervention (2011-2012, 2012-2013) with the group after intervention (2013-2014, 2014-2015) showed no statistically significant difference in any of the 3 applicable survey questions (Table). Standard deviations and confidence intervals were consistent between groups.14
DISCUSSION
Efforts to implement a checklist for IV placement corresponded with a medical school–wide initiative for increased direct observation and feedback. The checklist was designed to facilitate this goal by requiring an evaluator to observe IV placement, complete an evaluation and return it to the student, and initiate a feedback conversation. However, after implementation of the IV checklist, students did not perceive that they were being directly observed or given feedback more than previously.15
Checklists for airway management and an anesthesia-focused history and physical were piloted concurrently during the postintervention years but had not yet been used for formal assessment. Based on the results from the IV checklist implementation, however, additional formal opportunities for assessment and feedback are not likely the solution for changing student perception.
Weinstein's commentary regarding the shift in physician education “from a loosely planned clinical immersion to a curriculum-based experience linked to achievement of specific competencies” is telling.2 As student learning environments and goals change, educators must adopt new strategies to ensure that instruction, observation, and feedback are received. Future efforts should focus on faculty development to provide teachers with the tools to succeed.1 Assessments will need to focus on competencies, increasing the reliance on OSCE-style examinations and simulations. Improved scheduling processes may aid in developing supervisor-trainee relationships during the brief anesthesiology clerkship and could perhaps lead to increased learner reception and satisfaction with teaching, observation, and feedback.1 Investigation into faculty attributes that trainees positively identify with may help guide faculty development efforts and create the proposed “educational alliance” relationship between faculty and trainees.1
CONCLUSION
While this intervention provided formal assessment of a specific skill during the anesthesia clerkship, it was unsuccessful at improving student perception of direct observation and feedback. Similar assessments of other clinical skills are likely of little utility in improving feedback perception. With medical education evolving into competency-based teaching and assessment, teaching strategies must also evolve to best meet these goals.
This article meets the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties Maintenance of Certification competencies for Medical Knowledge and Systems-Based Practice.
ACKNOWLEDGMENTS
Dr Alaa Abd-Elsayed is a consultant for Halyard, Medtronic, Axsome, Solis, Ultimaxx Health, and SpineLoop. Otherwise, the authors have no financial or proprietary interest in the subject matter of this article.
© Academic Division of Ochsner Clinic Foundation