Abstract
Background: Changes in the Accreditation Council for Graduate Medical Education (ACGME) duty hour requirements have created significant monitoring responsibilities for institutions. This study explored the types of tracking systems used and determined, for each type of system, the number of violations identified and the number of ACGME citations issued.
Methods: An 8-question, anonymous, electronic survey was sent to 3,275 residency program coordinators across 24 ACGME-accredited specialties nationwide. The survey was developed by the study investigators to gather data on the type of system used by programs, perceived advantages and disadvantages of the system, the number and types of violations identified, and subsequent ACGME citations for duty hour noncompliance.
Results: Of the 889 responses (27.1% response rate), 780 (87.7%) reported using an electronic system, while 94 (10.6%) used a manual system. Programs found electronic systems significantly superior on most characteristics, including accuracy, effectiveness, ease of use, reliability, reporting variety, and time investment (all P<0.001). Electronic systems identified significantly more violations than their manual counterparts; however, violation identification did not correlate with an increase in ACGME duty hour citations for programs using electronic systems (all P>0.05).
Conclusion: Although a relationship was seen between the tracking system and the number of violations identified, no significant relationship was detected between the system used and the number of citations issued by the ACGME. While programs have invested considerable time, effort, and expense in systems to track duty hours, the real meaning of the data collected and their value to programs, residents, the ACGME, and the healthcare system remain unclear.
INTRODUCTION
In July 2003, the Accreditation Council for Graduate Medical Education (ACGME) implemented common duty hour standards across all residency programs in an effort to promote high-quality education and safe patient care.1,2 The duty hour standards were further reduced in 2011 at the recommendation of the Institute of Medicine.3,4 Both sets of standards prompted many changes by programs and sponsoring institutions in clinical training, patient care activities, and the mechanism for duty hour monitoring and oversight.
The ACGME regularly monitors compliance with duty hour standards using a range of tools, including annual resident surveys;5 individual follow-up with programs whose surveys indicate substantial noncompliance; document reviews; interviews with residents, faculty, and program leadership during accreditation site visits; response to complaints about duty hour violations; and vesting responsibility to the sponsoring institution for monitoring and oversight of duty hour compliance.2 Various methodologies are used to track compliance with ACGME duty hour rules. One method––only accepted for specialties with a low likelihood of exceeding duty hour limits such as dermatology, allergy and immunology, ophthalmology, diagnostic radiology, psychiatry (after the first year), and preventive medicine––is periodic sampling.2 More commonly, resident duty hours are tracked using retrospective self-reported time cards, entered manually or through a computer program.6 In some cases, residency programs have adopted a real-time duty hour tracking method in an attempt to improve accuracy and resident compliance and to decrease administrative burden and costs.7 One way to implement real-time duty hour tracking is through residents' interactions with the electronic medical record.8 One surgical residency program used text messaging to track resident duty hours, and the method was associated with high levels of duty hour compliance and resident satisfaction.9
While each of these systems has its advantages and drawbacks, electronic or computerized systems have yielded greater accuracy compared to manual systems in the complex task of computing the metrics needed to identify violations of resident duty hours: calculation of weekly work hours averaged over 4 weeks, length of rest periods between duty hours, continuous duty hours worked, allowance of limited additional hours for continuity of care and education, and the total number of days free from patient care and educational obligations averaged over a 4-week time period.
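To make the bookkeeping concrete, the checks described above can be sketched in a few lines of code. This is a minimal illustration only, not any vendor's actual system: the function name, data layout, and the exact thresholds used (80-hour weekly average over 4 weeks, 10-hour rest, 30-hour continuous limit, 4 days free per 28-day block) are our own simplifications of the standards.

```python
from datetime import datetime, timedelta

# Illustrative thresholds based on the 2003 common duty hour standards
MAX_WEEKLY_AVG = 80   # hours/week, averaged over 4 weeks
MIN_REST = 10         # hours of rest between shifts
MAX_CONTINUOUS = 30   # 24 hours plus up to 6 for continuity/education
MIN_DAYS_FREE = 4     # 1 day off in 7, averaged over a 4-week block

def find_violations(shifts):
    """shifts: chronologically sorted (start, end) datetimes covering one
    28-day block. Returns a list of human-readable violation descriptions."""
    violations = []
    # Weekly hours averaged over the 4-week block
    total = sum((end - start).total_seconds() / 3600 for start, end in shifts)
    if total / 4 > MAX_WEEKLY_AVG:
        violations.append(f"80-hour weekly average exceeded: {total / 4:.1f} h/wk")
    # Rest period between consecutive shifts
    for (_, e1), (s2, _) in zip(shifts, shifts[1:]):
        if (s2 - e1).total_seconds() / 3600 < MIN_REST:
            violations.append(f"<10 h rest before shift starting {s2}")
    # Continuous duty length
    for start, end in shifts:
        if (end - start).total_seconds() / 3600 > MAX_CONTINUOUS:
            violations.append(f">30 h continuous duty starting {start}")
    # Days free of duty in the 28-day block
    worked_days = {start.date() + timedelta(days=i)
                   for start, end in shifts
                   for i in range((end.date() - start.date()).days + 1)}
    if 28 - len(worked_days) < MIN_DAYS_FREE:
        violations.append("fewer than 4 days free in the 28-day block")
    return violations
```

Even this toy version shows why manual computation is error prone: the averaging windows, overnight shifts that span calendar days, and pairwise rest checks all interact, and a real system must also handle rotations, leave, and cross-program assignments.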
The ACGME tracks and publicly reports several metrics related to resident duty hours. For 2003-2008, the years for which data are available, the ACGME Summary of Achievements shows that the 24+6-hour limit on continuous duty was the most frequent duty hour citation for programs, followed by the 80-hour weekly work limit. The specialties with the most duty hour citations during this 5-year period were anesthesiology, thoracic surgery, and surgery. The number of programs operating under a duty hour exception decreased from 68 programs in 2004-2005 to 39 programs in 2007-2008.
The ACGME does not, however, monitor or provide a standard for the type of tracking system that programs should use, nor does it specify the frequency with which specialties and programs should document duty hours and assess for the presence of resident duty hour violations. Given the lack of a standard methodology for collecting resident duty hours, we explored the relationship between the type of duty hour tracking system utilized by individual training programs and the occurrence of internally identified violations, as well as ACGME-generated citations for duty hour violations.
METHODS
In spring 2011, a voluntary and confidential web-based survey developed in SurveyMonkey (www.surveymonkey.com) was distributed to 3,275 coordinators who manage ACGME-accredited programs nationwide (Figure). The study investigators designed an 8-question survey to gather data on primary specialty, type of system used to collect and analyze resident duty hour data, frequency with which duty hours are documented by residents, perceived advantages and disadvantages of the type of system, number and type of violations identified, whether a site visit had occurred since implementation of the current system, and subsequent ACGME citations.
The results for each type of system were analyzed. Because only a small proportion (1.7%) of respondents indicated use of an electronic real-time system, data related to this type of system were excluded, and the analysis compared data between manual and electronic systems. The sole respondent identifier was primary specialty.
Descriptive statistics for categorical data were reported as number and percent. The chi-square test, or the Fisher exact test when necessary, was used to assess differences between duty hour tracking systems (manual vs electronic). A two-tailed α level of 0.05 was considered statistically significant in all analyses. Analyses were performed with SPSS v19.0 software (IBM Corp.).
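For readers unfamiliar with these tests, the comparison reduces to a contingency table of system type against outcome. The sketch below shows the general approach with SciPy rather than SPSS; the counts are invented for illustration and are not taken from this study.

```python
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact

# Hypothetical 2x2 table: tracking system (rows) vs. violation reported (cols).
# These counts are illustrative only, not the study's data.
table = np.array([[120, 660],   # electronic: violation / no violation
                  [  8,  86]])  # manual:     violation / no violation

chi2, p, dof, expected = chi2_contingency(table)
# Fall back to the Fisher exact test when any expected cell count is small,
# the usual circumstance in which the chi-square approximation is unreliable
if (expected < 5).any():
    odds_ratio, p = fisher_exact(table)

print(f"p = {p:.4f}, significant at alpha = 0.05: {p < 0.05}")
```

The same pattern, with one table per violation type or citation type, reproduces the style of comparison reported in Tables 2 through 5.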
RESULTS
A total of 3,275 surveys were sent to residency program coordinators across 24 ACGME-accredited specialties nationwide. Of the 889 (27.1%) responses to the survey, 780 (87.7%) of the programs used an electronic system, while 94 (10.6%) manually logged and tracked duty hours. Table 1 presents the number and percentage of respondents by specialty.
Table 2 displays perceived advantages from a program management perspective for electronic vs manual duty hour tracking systems. Significantly more respondents using electronic systems perceived data accuracy/integrity, effectiveness, ease of use, reliability, variety of reports, and time investment as advantages of their system (all P<0.001) compared to respondents using manual systems. However, significantly more respondents using manual systems perceived cost as an advantage of their system (P=0.010). No significant difference between the two groups was found concerning resident compliance with reporting duty hours (P=0.322).
Table 3 presents the comparison of duty hour violations and citations by system type. A significantly greater percentage of programs using electronic systems reported violations of the maximum 30-hour shift limit, the minimum 10-hour rest period between shifts, and the minimum of 1 day off in 7 (all P<0.05). Citation frequency was not significantly different based on type of reporting system (all P>0.05). As displayed in Table 4, programs tracking hours more frequently (daily/weekly) reported more violations than those tracking monthly/quarterly, 903 vs 225, respectively. Within the group tracking hours daily/weekly, programs using electronic systems reported significantly more violations for 3 of the 5 categories (maximum 30-hour shift, minimum 10 hours between shifts, and minimum 1 day off in 7; all P<0.05). Table 5 reveals that reported citations were slightly higher for programs using electronic systems, although the number of citations did not significantly differ between the two systems (all P>0.05).
Free-text comments to survey questions were analyzed to identify common themes. Many responses to the question “What are the advantages and disadvantages of using this system from the program management's perspective?” noted the difficulty of getting residents to log data and/or expressed skepticism about the accuracy or meaningfulness of the data. The prevailing response to the question “Since implementation of your current tracking system, have you had any of the following duty hour violations?” was that violations were rare and were corrected immediately. Several responses attributed violations to specialty, size of the program, rotations to other specialties, care of patients, or residents' answer choices.
DISCUSSION
Among respondents to our nationwide electronic survey, most programs use electronic tracking systems. These systems are favored by coordinators for data accuracy/integrity, effectiveness, ease of use, reliability, reporting capabilities, and time investment. The data confirmed our hypothesis that electronic systems are significantly more likely to identify duty hour violations. Contrary to our hypothesis, however, programs using electronic systems were not more likely to receive program citations for duty hour violations. The programs' sampling methods and frequencies varied widely, regardless of tracking system used.
While many studies have evaluated the consequences of the ACGME change in standards, most have assessed only the effects on patient safety, resident education, and resident quality of life.10,11 To our knowledge, no other study has evaluated the systems used to track resident duty hours. Our findings therefore provide an initial overview of the processes and outcomes of duty hour tracking systems.
Program coordinators play a key role in assessing the effort and difficulties involved in tracking duty hours, the reliability of the data generated, and the effects of tracking on their programs, residents, and fellows. Evaluating their perspectives is therefore vital to the successful implementation of these systems.
The ACGME and the residency review committees allow wide variation in the frequency of tracking and in the tracking systems used by training institutions. Both the ACGME and institutional graduate medical education committees may wish to know whether the costs of more sophisticated systems––and the burdens of more frequent data entry by residents and fellows––will further advance the goals of the rule changes or produce more noise and whether such noise increases the risk of program citations.
This study has several limitations. One major limitation is that the survey was not validated; respondents could have interpreted the questions differently from each other and from the way the authors intended. Moreover, the survey did not collect demographic data that could have provided insight into the responses. The study did not explore the severity of violations and citations or the point at which a citation was issued. While all specialties are represented in the sample, the response rates for specialties such as diagnostic radiology, general surgery, family medicine, and obstetrics/gynecology were much lower than those for internal medicine, neurology, and psychiatry, raising questions about the generalizability of the study findings to all ACGME specialties. Despite these limitations, this study provides an important depiction of the experience of programs tracking duty hours. Further research is needed to demonstrate that the costs and burdens of alternative systems produce a better educational experience, improved resident quality of life, and enhanced patient safety.
CONCLUSION
Although a significant relationship was seen between the tracking system used and the number of violations identified, no significant relationship was detected between the tracking system and number of citations issued. Duty hour monitoring systems and methodologies appear quite inconsistent across programs. A loose approach could undermine the goals of the duty hour rule revisions, but an overly rigid approach could create an unnecessary burden for residents, increased cost, and the potential for unfair repercussions. While programs have invested considerable time, effort, and expense into tracking duty hours, further research is needed to determine the meaningfulness of the data collected and its value to programs, residents, the ACGME, and the healthcare system.
This article meets the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties Maintenance of Certification competencies for Patient Care, Medical Knowledge, Professionalism, and Systems-Based Practice.
ACKNOWLEDGMENTS
The authors have no financial or proprietary interest in the subject matter of this article.
© Academic Division of Ochsner Clinic Foundation