Abstract
Background Process improvement (PI) science is relatively new to healthcare and has only recently been introduced to medical education. Most residency faculty lack training or experience in PI science activities. We assessed the impact of PI science education on the knowledge and attitudes of a group of residency and fellowship program directors and associate program directors using their respective Accreditation Council for Graduate Medical Education annual program evaluations (APEs) as an experiential object.
Methods For this pre/post study, 16 program directors and 7 associate program directors were surveyed before and after 4 didactic modules. The APEs from the 2 years prior to the intervention and from the fall after the intervention were analyzed. Mentoring in the use of these skills during APE preparation was provided.
Results The participants demonstrated improved knowledge in some areas and increased awareness of deficits in other areas. APE quality did not show consistent improvement following the intervention.
Conclusion The PI science knowledge and skill gaps of program directors and associate program directors are likely to impact the content and success of residency curricula. The designed PI science curriculum was slightly effective. Using the APE as the experiential object was convenient, but the APE was not the best project for a PI exercise. New, effective strategies and interventions to develop expertise in PI science are important as programs grapple with meeting new requirements, ensuring quality programs, and preparing residents and fellows for practice.
INTRODUCTION
Process improvement (PI) science originated with Shewhart and was further developed by Deming.1 In recent decades, these concepts and tools have been introduced to healthcare.1-3 Medical education has begun to recognize PI science expertise as critical to 21st century medical practice.
The Accreditation Council for Graduate Medical Education (ACGME) now mandates that residents be taught process and quality improvement, and the instruction must include experiential learning opportunities.4-12 Berwick and Finkelstein assert the need for residency programs to teach PI science and to ensure residents' participation in team-based improvement of real-world health systems.13 Most residency and fellowship program directors and core faculty, however, lack education or experience in PI science.14
Graduate medical education (GME) programs are also charged with faculty and resident requirements for scholarly activity. Both the opportunities for and challenges of using process and quality improvement projects as scholarly and research activities have been described.15-17 Additionally, the need to transition the ACGME-required annual program evaluation (APE) from a static snapshot to the documentation of a dynamic process has been described.18-21
The purpose of this study was to assess the impact of a PI science curriculum for GME program directors and associate program directors at a 645-bed tertiary care hospital in Park Ridge, IL, by measuring preintervention and postintervention knowledge, attitudes, and APE quality.
METHODS
This study utilized a pre/post design. The study population consisted of the 18 residency and fellowship program directors and associate program directors of our hospital's 11 sponsored residencies and fellowships and 2 affiliate programs. The authors developed and conducted this study as participants in the Alliance of Independent Academic Medical Centers (AIAMC) National Initiative III. The system's institutional review board approved the study.
Program directors and associate program directors were recruited via email. The email described the aim of the study, included the link to the survey, and was sent by the hospital's GME office to indicate institutional sponsorship and encourage participation.
The intervention consisted of 4 educational modules. For the participants' convenience, the 30- to 45-minute sessions immediately followed the standing April, June, and October program directors' meetings and the July GME Committee meeting in 2012. Continuing medical education credit was offered to attendees on completion of all 4 sessions. Two authors (JAG and PH) presented the modules. Module 1 introduced the program and the improvement target, the APE. Module 2 provided an introduction to PI science. Module 3 addressed the application of PI science to the APE. Module 4 connected PI work to scholarship and publication and introduced the SQUIRE (Standards for QUality Improvement Reporting Excellence) guidelines.
Assignments followed the first 3 modules. After module 1, participants were to review their program's 2011 APE. Following module 2, participants were to create a process map or fishbone diagram of their APE preparation process. After the third module, participants prepared and submitted their APEs. To support the learners, a SharePoint (Microsoft Corporation) site was created and served as a repository for module slides, copies of relevant articles, PI tools, the Advocate Lutheran General Hospital APE template, and samples of past APEs. An online forum for participant discussion was also provided.
The study investigators designed and piloted 2 instruments to measure the impact of the intervention. The first was a confidential survey assessing participants' knowledge and attitudes that was distributed via SurveyMonkey (www.surveymonkey.com). Participants provided electronic consent prior to accessing the survey. The preintervention survey was administered during March and April 2012. The postintervention survey was distributed and completed between mid-October and mid-November 2012.
The second instrument, developed and refined through pilot testing, was designed to measure the quality of the APEs. Five independent reviewers were recruited and trained by one author (MAC) in the rationale and recommended template for the APE and in the use of the quality evaluation tool. To establish reliability among the reviewers, a pilot evaluation of 5 APEs was conducted prior to the evaluation of the preintervention (2010 and 2011) and postintervention (2012) APEs.
All results are presented as numbers and percentages (when applicable). No pre/post inferential comparisons were performed for the survey and APE results because of the small sample size and the lack of paired data for several participants. Descriptive analyses were performed using SPSS Statistics v.19.0 for Windows (IBM Corporation).
RESULTS
Table 1 describes the participants' demographics. Eight program directors and 2 associate program directors completed the preintervention survey (n=10), and 8 program directors and 5 associate program directors completed the postintervention survey (n=13). Most participants in the preintervention and postintervention surveys reported being in their roles <5 years. Half of the respondents in the preintervention phase and 30.8% in the postintervention phase reported being in their role for 1-2 years.
The preintervention and postintervention survey results are presented in Table 2. Respondents gave inconsistent answers before and after the intervention regarding the number of program meetings held to evaluate their curriculum. Opinions about the effectiveness and productivity of program meetings differed widely. After the intervention, more participants reported that better knowledge and processes for the APE would add value to their program (5 vs 1), while fewer participants reported having some experience with preparing an APE (3 vs 5). When asked about their knowledge of and experience with the Plan-Do-Study-Act approach to performance improvement, more participants reported having some experience applying the approach postintervention (8 vs 5). Postintervention, more participants reported having no idea what a fishbone diagram is (7 vs 2) or what a process map is (8 vs 4), yet more reported having some experience with process mapping (5 vs 1). After the intervention, more participants reported being unsure about what an AIM statement is (6 vs 1); however, none reported not knowing what an AIM statement is (0 vs 3). More participants responded that performance improvement measures are a balance of process and outcome measures (7 vs 4). Also, more participants reported that better knowledge and processes for PI research projects would add value to their program (4 vs 1) and knew that PI projects can be research projects (6 vs 2). However, fewer participants reported having experience with PI research projects (2 vs 5) postintervention.
Information about the quality of the APEs is presented in Table 3. In the postintervention APEs, the reviewers noted no consistent changes in the documentation of the different required elements (eg, chair, program director, and core faculty) or in the clarity of the program descriptions of 2 elements: the program measures itself against external norms, and the required review is a value-added quality improvement activity. However, the evaluations noted improved clarity in the program descriptions of the following: continuous PI; ongoing and effective faculty development; attention to individual performance and learning needs; and setting appropriate learning goals.
DISCUSSION
The AIAMC National Initiative III focused on the need for and challenges of faculty development in process and quality improvement. Many informal communications confirmed the importance of scholarly work in this area. This study supports the relevance, importance, and national context of such work. GME is in a period of great change, and PI science is both a required curricular component and an excellent approach for ensuring that curricular efforts and changes accomplish the necessary goals.
Our program directors and associate program directors lacked the knowledge and skills in PI science needed to direct and evaluate educational experiences in this domain. Also, the scholarly activity opportunities available to trainees and faculty in PI science had not been identified. Our subjects' PI science knowledge, attitudes, and skills varied widely at baseline. The postintervention results demonstrated slightly improved PI science knowledge, attitudes, and skills among the participants. However, the small numbers, variable attendance, leadership changes in some programs, or a combination of these factors precluded statistical analysis of the effectiveness of the intervention. The increased number of “no idea” responses after the intervention to the questions about fishbone diagrams and process mapping may reflect the change in the respondent cohort or may indicate that the intervention showed participants that they did not know what they thought they knew.
This study has several limitations that need to be carefully considered when interpreting the findings. The cohort available for our study was small, and leadership changes during the 6-month study period probably impacted our results. We had no opportunity to compare attendance at the educational sessions with knowledge reported on the postintervention survey or with scoring of the final APE. In designing the intervention, we chose the mandatory APE as a convenient assigned focus; the time frame precluded using a group process to select an alternative improvement target. Knowledge of and attitudes toward the APE were instructive but not surprising: the APE was perceived more as a task to be accomplished or a burden than as an opportunity for assessing and improving programs. Mentoring in the use of PI science during APE preparation was available but not mandated. Each program used a very different process to develop its APE. The uncertain future of the APE in the ACGME Next Accreditation System (NAS) was also identified as a challenge.
Attendance and engagement varied. Identifying convenient times and compelling motivations for GME leadership and core faculty posed a challenge. We were not able to measure either the utilization or the efficacy of the online forum. Institutional leadership provided passive support. Low levels of commitment were a challenge throughout the 6-month intervention period, although the sessions were scheduled to coincide with and follow existing meetings. The 6-month span might have been too long for optimal learning and engagement, and perhaps the 2-month interval between educational sessions adversely impacted momentum.
Alternative hypotheses for the study results must be considered. During the study time frame, the hospital's GME Committee emphasized improving program information form documentation rather than the APE process and documentation. The APEs were filed, but feedback was not provided to individual programs, and APE content was not shared among programs. No forum was available for disseminating best practices. The issue of interrater reliability among APE reviewers was addressed through training, but it is unlikely that this concern was eliminated.
Despite these limitations, this study addresses important and evolving areas in GME. Teaching teachers to facilitate resident and fellow learning about quality improvement and PI science is a recognized need. The opportunity and need for process and quality improvement scholarship in GME are immense and encompass clinical, educational, and administrative arenas. This study highlighted the following opportunities for our institution: (1) standardization of approaches to GME documentation requirements, such as the APE; (2) communication that transparently shares lessons learned in our GME programs; and (3) documentation of the need for faculty development in process and quality improvement. Our results have been shared with the local and system GME leadership with the recommendation to refine the educational modules and disseminate them to more faculty at our hospital and to GME faculty at our system's teaching hospitals.
Further study is needed to identify effective pedagogic approaches for this material. An extended session, possibly a full day or a half-day, might mitigate the attendance problems we identified. Increasing the variety of educational experiences, including group learning, one-on-one learning, and learner presentations, might increase attention and engagement. Acknowledging and factoring in the variation in baseline knowledge and experience are important, but whether it is better to group participants by skill level or to intentionally combine mixed skill levels is unknown. Information about colearning with residents is not available. PI science education requires learning by doing. Opportunities to involve faculty and residents in PI science projects should be identified and evaluated. Projects must be meaningful, feasible, time limited, and evaluable. This project highlights the need for more faculty development efforts in PI science and calls for multisite explorations of both successful and unsuccessful interventions.
CONCLUSION
Our independent academic medical center's GME leadership acknowledges gaps in the PI science knowledge base, skill sets, and attitudes. The knowledge factor is most readily addressed, but skills develop as PI science tools are applied. A robust curriculum that engages residents and fellows and fosters an environment of active projects requires faculty knowledge and expertise. Development of a culture that transforms PI science projects into scholarly products is the ultimate goal. Although our intervention demonstrated a modest improvement in knowledge about the APE, PI science, and process/quality improvement scholarship, it highlighted the need for faculty development in PI science. GME programs must develop, incorporate, and evaluate their PI science efforts and their efficacy in PI science scholarship. The ACGME Milestones, Clinical Learning Environment Review visits, and NAS call for and can be enhanced by a strong PI science culture. As the practice environment of medicine transforms rapidly and radically, physicians will need PI science skills to provide care for patients and populations. System leadership will be crucial for developing and evaluating GME faculty in PI science. We must intentionally and systematically interrogate our work as educators to meet the tasks of preparing residents, faculty, and designated institutional officials for changing institutional climates and evolving ACGME directives.
This article meets the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties Maintenance of Certification competencies for Medical Knowledge, Interpersonal and Communication Skills, Systems-Based Practice, and Practice-Based Learning and Improvement.
ACKNOWLEDGMENTS
The Alliance of Independent Academic Medical Centers National Initiative III provided the impetus and team support for this project.
The authors gratefully acknowledge the contributions of the annual program evaluation reviewers: Cathy Canfield-Jepsen, MBA, Director of Medical Education, University of Illinois at Chicago; Lora Ferraro, MBA, Research Department, Northwestern University; Loreta Krutulis, MEd, Director of Medical Education, Advocate Christ Medical Center; Rebecca Mammoser, MBA, Director of Contracts and Finance in Medical Education, Advocate Health Care; and Barbara White, MHA, Director of Medical Education, Advocate Illinois Masonic Medical Center. The authors also thank Dorothy Mefom, MHA, Data Design and Management Analyst, Advocate Health Care, for data entry.
Footnotes
The authors have no financial or proprietary interest in the subject matter of this article.
© Academic Division of Ochsner Clinic Foundation