Abstract
Success in lung transplantation (LT) has been attributed to proper patient and donor selection, better preservation and surgical techniques, and experience in postoperative management. In 1995, we refined our perioperative management by implementing newer strategies with critical pathways and reduced our use of cardiopulmonary bypass (CPB), thereby improving survival after LT. To analyze these changes, we compared survival, use of CPB, intubation time, and intensive care unit (ICU) and hospital lengths of stay between cohorts transplanted before 1995 (PRE) and after (POST). The 1- and 3-year survival rates were 57% and 29% for PRE, and 86% and 62% for POST, p < 0.01. Intubation time and ICU and hospital lengths of stay were significantly reduced in the POST cohort, and the need for CPB was reduced by about 40%.
Introduction
Since the early 1980s, nearly 10,000 patients with a variety of end-stage cardiopulmonary diseases have successfully undergone heart-lung (HLT), single-lung (SLT), double-lung (DLT), or bilateral sequential lung (BLT) transplantation (1). Approximately 70% of these procedures in the last decade have been isolated lung transplants (LT). The introduction of cyclosporine (CYA) and refinements in recipient and donor selection, organ preservation, and surgical implantation techniques, along with accumulated experience, have allowed LT to emerge as an acceptable therapeutic option for patients with irreversible lung disease.
During the early 1990s, a number of institutions worldwide began offering LT to patients suffering from pulmonary fibrosis, emphysema, cystic fibrosis, pulmonary hypertension, and other lung diseases. LT is the newest of the solid-organ transplant procedures, and its results remain sobering. The 1-, 3-, and 5-year survival rates for isolated LT (n = 7021) are 71%, 55%, and 43%, respectively (1) (Figure 1). This survival curve declines steeply in the first year, in marked contrast to the curves for kidney, liver, or heart transplantation. Beyond the first postoperative year, however, the decline is slow and approximates that seen in the other solid-organ transplant groups.
Although 100 other programs throughout the world began performing LT after 1988, we were the first to successfully perform LT in Louisiana in 1990 (2). Since the inception of our program, a number of advances have occurred, including better patient and donor selection, further improvements in preservation and surgical techniques, and optimization of intraoperative, immediate postoperative, and late postoperative management. In 1995, we implemented these advances and developed critical pathways in an effort to lessen the sharp decline in survival observed in the first postoperative year.
Methods
From November 1990 to September 1998, 48 BLT, 39 SLT, and 1 HLT procedures were performed in 86 patients. The mean age ± standard deviation (SD) was 42 ± 15 years (range 8-70). Our population was 86% Caucasian and 14% African-American. The gender mix was 57% female and 43% male. The mean weight ± SD was 61 ± 15 kg (range 25-100).
One of the changes instituted beginning in 1995 concerned recipient selection. Suitable candidates were placed on the national transplant waiting list, and the selection criteria used reflect the recently published international guidelines (3). Basic donor selection criteria have been published elsewhere (4). The median waiting time from listing to transplantation was 36 days (range 1-324). The indications for LT are categorized by type and shown in Table 1.
Preservation and Surgical Technique
The surgical changes that have been implemented include modifications in preservation, refinements in the surgical techniques, and improvements in perioperative management. The donor lung preservation technique was modified to decrease injury to the new lung during or immediately after implantation. Following our standard rapid infusion of cold Euro-Collins crystalloid perfusate (EC), we administered an equal volume of the more viscid University of Wisconsin preservation solution (UW) (5).
In contrast to other forms of solid-organ transplantation, LT offers several options for lung replacement, including HLT, BLT, SLT, and living-related lobar transplantation. All LT procedures were performed with cardiopulmonary bypass (CPB) available on standby. For each procedure, the lung allograft was removed from the donor, preserved and cooled, transported to the recipient, and implanted and rewarmed, a sequence during which significant allograft injury can occur. A variety of strategies have evolved that allow us to minimize allograft injury during implantation and in the immediate postoperative period. These include the introduction of specialized pulmonary artery catheters capable of continuous cardiac output monitoring, the use of pulmonary vasodilators (i.e., prostaglandin E1 and nitric oxide), the alpha-agonist phenylephrine, and inoconstrictors such as epinephrine and norepinephrine. These measures permitted less fluid administration, limiting alveolar flooding from pulmonary capillary endothelial disruption. Another important change was the use of low-volume, pressure-limited ventilation to minimize alveolar disruption by reducing dynamic hyperinflation. These advances were employed simultaneously intraoperatively to optimize tissue oxygenation and right ventricular perfusion and thereby reduce the need for CPB.
Rejection and Infection Prophylaxis
Other significant changes were the development of immunosuppressant and antimicrobial prophylaxis protocols. The standard triple-drug immunosuppressive regimen of cyclosporine, azathioprine, and steroids was used with equine anti-thymocyte globulin for induction. Perioperative antibiotic use varied according to the recipient's underlying disease and culture results of donor airway secretions sampled at implantation. Cefazolin was used alone preoperatively and intraoperatively in recipients without suppurative lung disease, and vancomycin was used in those with a history of immediate allergic reactions to beta-lactams. Two antipseudomonal agents, including an aminoglycoside (most commonly tobramycin), were used perioperatively in all of the recipients with suppurative lung disease. The choice of agents was based on results from their most recent sputum culture. Duration of antibiotics varied according to the early postoperative course. All of the following were used for opportunistic prophylaxis after LT: trimethoprim/sulfamethoxazole against Pneumocystis carinii pneumonia, nystatin or clotrimazole against mucosal candidiasis, and ganciclovir for those with either donor or recipient seropositivity for cytomegalovirus. Those with donor and recipient seronegativity for cytomegalovirus received acyclovir.
Critical Pathway and Postoperative Management
A critical pathway beginning on postoperative day (POD) 1 and ending on POD 10 was established and implemented in 1995 (Table 2). Goals were set, and any significant deviation from them was subject to review. Outpatient follow-up began with twice-weekly clinic visits, then progressed to weekly, twice-monthly, every-third-week, and monthly visits, with intervals lengthening according to the recipient's overall stability. Nearly all recipients were seen in the outpatient clinic every 2-3 months beyond the sixth postoperative month. At each clinic visit, a history and physical examination, a complete blood count, a basic metabolic panel, and a 12-hr trough cyclosporine level measured by the TDx cyclosporine monoclonal whole-blood immunofluorescence assay (Abbott Laboratories, North Chicago, IL) were obtained. Simple spirometry, with resting and exercise pulse oximetry, was performed according to American Thoracic Society guidelines at each clinic visit for surveillance of allograft function (6). Bronchoscopy with bronchoalveolar lavage and transbronchial lung biopsies was not performed in a surveillance fashion. Bronchoscopy was indicated when a 10% decline in the forced expiratory volume in 1 second (FEV1) during a forced vital capacity maneuver from a previously established baseline was seen and persisted over 2 consecutive clinic visits, as illustrated in the sketch below.
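As a concrete illustration of this trigger, the short sketch below (written in Python with hypothetical function and variable names of our own, not part of the original pathway documents) flags the need for bronchoscopy when the two most recent clinic FEV1 values are both at least 10% below the established baseline.

```python
# Illustrative sketch of the surveillance rule described above; names are hypothetical.
def needs_bronchoscopy(baseline_fev1_l: float, clinic_fev1_l: list[float]) -> bool:
    """True when the last two clinic FEV1 values are both >= 10% below baseline."""
    if len(clinic_fev1_l) < 2:
        return False
    threshold = 0.90 * baseline_fev1_l
    return all(v <= threshold for v in clinic_fev1_l[-2:])

# Example: baseline FEV1 of 2.8 L; the last two visits measured 2.40 L and 2.45 L,
# a decline of more than 10% persisting over 2 visits, so bronchoscopy is indicated.
print(needs_bronchoscopy(2.8, [2.70, 2.40, 2.45]))  # True
```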
Outcome Measures
Our database has been established in a prospective fashion and divided into eras before (PRE) and after (POST) the implementation of the changes described above on January 1, 1995. The differences between these 2 eras in survival, use of cardiopulmonary bypass, and postoperative time spent mechanically ventilated in the intensive care unit (ICU) and in the hospital were reviewed.
Statistical Analysis
All analyses were performed using Statview SE + Graph software (Abacus Concepts, Inc, Berkeley, CA) for the Macintosh computer. Continuous data were compared by the Mann-Whitney test and categorical data were analyzed by Fisher's exact test. Survival was determined by the Kaplan-Meier product-limit method and compared with the Breslow-Gehan-Wilcoxon test. Significance was defined as p < 0.05. Data are denoted as mean ± SEM, unless otherwise indicated.
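For readers who wish to reproduce this style of analysis outside Statview, the sketch below runs the same named tests in Python on a small, made-up dataset. The column names, values, and the use of the lifelines and scipy packages are our assumptions for illustration, not part of the original analysis.

```python
# Minimal sketch of the analyses described above on hypothetical per-patient data.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test
from scipy.stats import mannwhitneyu, fisher_exact

df = pd.DataFrame({
    "months":   [3, 12, 36, 60, 1, 24, 48, 6],   # follow-up time after LT
    "died":     [1, 1, 0, 0, 1, 0, 0, 1],        # 1 = death observed, 0 = censored
    "era":      ["PRE", "PRE", "POST", "POST", "PRE", "POST", "POST", "PRE"],
    "icu_days": [9, 14, 3, 2, 21, 4, 3, 11],
    "cpb":      [1, 1, 0, 0, 1, 0, 1, 0],        # 1 = CPB required at implantation
})
pre, post = df[df.era == "PRE"], df[df.era == "POST"]

# Kaplan-Meier product-limit survival estimate for each era
km_pre = KaplanMeierFitter().fit(pre["months"], event_observed=pre["died"], label="PRE")
km_post = KaplanMeierFitter().fit(post["months"], event_observed=post["died"], label="POST")

# Breslow-Gehan-Wilcoxon comparison of the two survival curves
curve_test = logrank_test(pre["months"], post["months"],
                          event_observed_A=pre["died"], event_observed_B=post["died"],
                          weightings="wilcoxon")

# Mann-Whitney for continuous data, Fisher's exact for categorical data
_, icu_p = mannwhitneyu(pre["icu_days"], post["icu_days"], alternative="two-sided")
_, cpb_p = fisher_exact(pd.crosstab(df["era"], df["cpb"]).values)

print(curve_test.p_value, icu_p, cpb_p)  # each compared against p < 0.05
```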
Results
Between 1991 and 1997, data from 6,077 BLT and SLT procedures were submitted to the registry of the International Society for Heart and Lung Transplantation (1). These data were separated into two eras, 1991-1993 and 1994-1997, and the estimated survival and half-life of LT recipients are shown in Figure 1. The estimated survival and half-life of the LT recipients from our program are shown in Figure 2. The mean follow-up was 22 ± 2.1 months (range 1-72 months). No patients were lost to follow-up. Our group was divided into two eras as described in the Methods, PRE (n = 21) and POST (n = 65); the estimated survival and half-life of these eras are shown in Figure 2. The POST group had significantly better estimated survival than the PRE group, p < 0.01. More than half of the deaths occurred in the first 90 days; the causes of death are detailed in Table 3. By POD 90, 38% of the PRE cohort and 11% of the POST cohort had died. All deaths beyond 90 days were due to either obliterative bronchiolitis or infection.
Table 4 compares the need for CPB during implantation and the time spent on a ventilator, in the ICU, and in the hospital. There was a significant reduction in the use of CPB in the POST era. More striking, there was a significant decrease in the number of days on a ventilator and in the ICU between the PRE and POST groups, p < 0.01. Moreover, the median hospital length of stay was nearly halved, from 25 to 13 days, between the two groups, p = 0.02. The survival curve for LT recipients who required CPB (CPB+) was also compared with that for those who did not (CPB-). The 1- and 5-year survival rates of 63% and 39% in the CPB+ cohort were significantly lower than the 83% and 62% in the CPB- cohort, p = 0.03.
Discussion
Careful inspection of the survival curves from the registry (Figure 1) shows a precipitous decline in survival in the first year, especially in the first 3 months after LT. These results are disappointing when compared with the 1-year survival rates in other forms of solid-organ transplantation. What is more disturbing is the lack of improvement in survival between the 1994-1997 and 1991-1993 eras. Through the evolution of LT, a number of explanations have been entertained for the lack of initial success, including improper donor and recipient selection, inadequate preservation techniques, unrefined operative approaches, the nature and degree of immunosuppression, and the inability to detect rejection and infection or to distinguish one from the other. Addressing these issues has clearly resulted in reasonable outcomes for properly selected patients who would otherwise have succumbed to their underlying cardiopulmonary disorder. However, little progress has been made in decreasing the mortality observed in the first year, when most deaths are centered on perioperative complications. The analysis of our two eras clearly demonstrates striking differences in the use of CPB, time spent on a ventilator, time in the ICU, and hospital length of stay. These outcome measures appear to have had an impact on our survival curves. But what factors influence perioperative mortality?
Criteria for determining whether a potential donor is suitable for LT are well recognized and standardized (4). In fact, the donor selection criteria have not been altered since the inception of our program. However, the lack of improved survival between the 1991-1993 and 1994-1997 eras reported by the registry may be explained by the relaxation of some donor selection criteria, particularly donor age, which has been identified as an independent predictor of mortality (1). Other independent predictors of 1- and 5-year mortality after LT by multivariate logistic regression include ventilator support, retransplantation, a diagnosis other than emphysema, and recipient age. Some of these factors might explain the superimposition of the survival curves presented by the registry. Interestingly, only 35% of our recipients had emphysema, even though emphysema is the leading indication for LT worldwide.
Proper selection of candidates for LT has been considered one of the major reasons LT became a therapeutic option. A landmark paper from the Stanford group provided the framework for determining when a potential candidate was sick enough, well enough, and psychosocially capable of enduring such a heroic endeavor as LT (7). Most recently, an expert panel of physicians and surgeons has provided guidelines for selecting the most appropriate candidates for LT, with emphasis on resource limitations and the importance of ensuring optimal outcomes (3).
Acute allograft dysfunction, characterized by pulmonary edema and hypoxemia, prolongs the requirement for ventilatory support, contributes to postoperative morbidity, and increases ICU and hospital length of stay; it would therefore appear to contribute to perioperative mortality. We instituted the sequential use of EC and UW solutions as our donor lung preservation technique in an effort to stabilize capillary permeability, decrease lung water, and reduce pulmonary vascular resistance during procurement, and we expected these effects to be maintained during transport, implantation, and reperfusion. Adoption of this technique appears to have reduced the incidence and severity of postoperative allograft reperfusion injury and the subsequent need for CPB or prolonged intubation. This may be one of the major explanations for the improvement in survival in our POST cohort, and it may also account for our outcomes exceeding those reported by the registry. Recent research suggests that further progress in allograft preservation may be even more promising (8).
The use of partial CPB during LT is indicated, and can be life-saving, when severe hypoxemia or shock, with or without severe pulmonary arterial hypertension, is encountered and cannot be stabilized quickly during single-lung ventilation or after pulmonary artery clamping. The heart-lung machine mandates a fluid prime, which imposes an increased volume load on the patient and predisposes to the sequelae of reperfusion pulmonary edema as mentioned above (9, 10). Another advance we implemented, once the postoperative morbidity related to CPB was recognized, was an aggressive protocol involving prostaglandin E1, phenylephrine, and epinephrine in an effort to avoid CPB and excessive fluid administration intraoperatively. This effort was one of the measures that distinguished the PRE cohort from the POST cohort, and we believe it had the greatest impact on our results: in effect, there was a 40% reduction in the use of CPB. The impact of CPB on survival is evident when comparing the outcomes of the CPB+ and CPB- groups. By inference, decreasing the use of CPB may have been the dominant measure favorably influencing our recent outcomes.
There are conflicting reports in the literature with regard to CPB, all of which are retrospective reviews. One report found that CPB was not associated with increased time on mechanical ventilation, in the ICU, or on supplemental oxygen (11). On the other hand, the Pittsburgh group showed a higher 1-month mortality rate in their CPB cohort (9). However, a more recent report from the same group revealed worse oxygenation, more severe pulmonary infiltrates, a higher incidence of diffuse alveolar damage, and longer intubation times in their CPB cohort, with no difference in 1-year survival (12). A limitation of that study was that over two-thirds of the patients requiring CPB had pulmonary hypertension. The same issue could flaw our study, but a subset analysis of our data, excluding those transplanted for pulmonary vascular disease (n = 5), revealed similar results, with significant differences between the PRE and POST cohorts (Table 4). Essentially all patients with pulmonary vascular disease who underwent LT were placed on CPB. Whether CPB is the main cause of early deaths in our program or simply a marker of intraoperative instability is uncertain. Certainly, CPB appears to be a major contributor to perioperative complications and has had an apparent impact on our survival rates.
Bronchial anastomotic complications continue to be a major problem after LT. These anastomotic complications have been annoying to us and quite distressing to our recipients, with an incidence at one point as high as 21%. We have been dissatisfied with this incidence and have attempted to reduce our bronchial complication rate. We have abandoned our interrupted suture and bronchial telescoping techniques in favor of an end-to-end anastomosis with a continuous suture technique as has been suggested elsewhere (13). We have also further modified our technique by fashioning a very short donor bronchus. There have been no bronchial complications such as airway stenosis, dehiscence, fistula formation, or angulation since these changes have been implemented.
The immunosuppressant and infection prophylaxis regimens used were implemented in conjunction with the critical pathways. The rationale for these protocols was drawn from the existing literature on the management of rejection and infection (14-20), the only difference being that surveillance transbronchial biopsy was not performed. The development of a critical pathway for LT is novel, particularly when it includes and influences some of the nonsurgical intraoperative management. Another recent advance has been our use of the cytochrome P-450 inhibitor clarithromycin to decrease the metabolism of cyclosporine, in an effort to lower cyclosporine dosing and thus cut costs (21).
In summary, standardizing proper recipient and donor selection, refining preservation and surgical techniques, and establishing effective rejection and infection protocols are crucial to the success of LT. Minimizing CPB, incorporating critical pathways, and particular attention to the perioperative management and ongoing auditing of any deviations from expected outcomes may have been substantial contributors to our outstanding results. Nevertheless, our recipients are still plagued with the infectious complications and side effects of the medications required for LT as well as the possibility of developing chronic allograft injury, namely obliterative bronchiolitis, the leading cause of late mortality. On the horizon are a number of newer agents that have become available and appear to be promising for minimizing the morbidity and mortality rates in LT. They include tacrolimus (FK506), sirolimus (rapamycin), mycophenolate mofetil, and leflunomide.
Academic Division of Ochsner Clinic Foundation
References