History of Liver and Other Splanchnic Organ Transplantation



Fig. 1
The complex of intraabdominal viscera that has been transplanted as a unit (center) or as its separate components: a, liver; b, pancreas; c, liver and intestine; d, intestine; and e, liver and pancreas (From Starzl et al. 1993a)




Table 1
Historical milestones of liver transplantation

Year | Description | Citation
1955 | First article in the literature on auxiliary liver transplantation | Welch 1955
1956 | First article on orthotopic liver transplantation (Vittorio Staudacher) | Busuttil 2012
1958–1960 | Formal research programs on liver replacement at Harvard and Northwestern | Starzl 1960; Moore 1960
1960 | Multivisceral transplantation described, the forerunner of composite grafts | Starzl 1960, 1962, 1991a
1963 | Development of the azathioprine-prednisone cocktail (kidneys first, then livers) | Starzl 1963a, 1964, 1969
1963 | First human liver transplantation trial (University of Colorado) | Starzl 1963c
1964 | Confirmation of the portal venous blood hepatotrophic effect; defined the problem of auxiliary liver transplantation | Starzl 1964; Marchioro 1965
1963–1966 | Improvements in preservation, in situ and ex vivo | Brettschneider 1968a; Marchioro 1963
1966 | Introduction of antilymphocyte globulin (ALG) (kidneys, then livers) | Starzl 1967
1967 | First long survival of human liver recipients (1967–1968), treated with azathioprine, prednisone, and antilymphocyte globulin | Starzl 1968
1973–1976 | Principal portal venous hepatotrophic substance identified as insulin | Starzl 1973, 1976
1976 | Improved liver preservation (5–8 h) permitting long-distance procurement | Wall 1977; Benichou 1977
1979 | Systematic use of arterial and venous grafts for vascular reconstruction | Starzl 1979c
1979 | Cyclosporine introduced for kidneys and liver | Calne 1979
1980 | Cyclosporine-steroid cocktail introduced for kidneys | Starzl 1980
1980 | Cyclosporine-steroid cocktail introduced for livers | Starzl 1980, 1981
1983 | Pump-driven venovenous bypass without anticoagulation | Denmark 1983; Shaw 1984; Griffith 1985
1984 | Standardization of multiple organ procurement techniques | Starzl 1984, 1987
1987 | University of Wisconsin (UW) solution for improved preservation | Jamieson 1988; Kalayoglu 1988; Todo 1989
1989 | FK-506-steroid immunosuppression | Starzl 1989b
1992 | Discovery of chimerism as explanation of hepatic tolerogenicity | Starzl 1992, 1993b, c, 1996, 2015
1992–2014 | Maturation of liver transplantation into the category of “conventional treatment” | Starzl 1989d, e




Fig. 2
Three early approaches to liver transplantation. (a) Welch’s auxiliary liver transplantation in a dog. (b) Complete liver replacement in dogs. The fact that the recipient was a dog rather than a human was identifiable only by the multiple lobar appearance of the canine liver. (c) Organs (green) of a multivisceral graft in dogs or humans. Illustration by Jon Coulter, M.A., C.M.I.


Orthotopic Liver Transplantation – Liver replacement (Fig. 2b) was first attempted in dogs in Milan, Italy, by Professor Vittorio Staudacher in 1952. His original report in the Italian journal La Riforma Medica was rescued from obscurity 60 years later by the scholarship of Ron Busuttil and still-surviving members of Staudacher’s original research team (Busuttil et al. 2012). None of Staudacher’s dogs survived the operation. Neither this work nor any other mention of liver replacement can be found in Woodruff’s massive compendium of the entire field of transplantation published in 1959 (Woodruff 1960). By this time, however, important independent investigations of liver replacement (orthotopic transplantation) had been completed in dogs. The studies began in the summer of 1958 at Northwestern University in Chicago (Starzl et al. 1960, 1961) and at the Peter Bent Brigham Hospital in Boston (Moore et al. 1959, 1960; McBride et al. 1962).

The Boston effort under the direction of Francis D. Moore was a natural extension of an immunologically oriented commitment to organ transplantation at the Brigham that was focused primarily on the kidney (Moore 1964). In contrast, the Northwestern initiative stemmed from questions about the functional interrelationships of the pancreas and the liver (Meyer and Starzl 1959a, b; Starzl 1992a). These ultimately led to a new field called hepatotrophic physiology (Starzl et al. 1973, 1983). To facilitate the metabolic investigations, a new technique of total hepatectomy was developed (Starzl et al. 1959). In July 1958, the second step of inserting an allograft into the vacated hepatic fossa was taken. From the outset, there was evidence that portal venous blood had superior liver-supporting qualities relative to systemic venous blood (Starzl et al. 1960, 1961). However, almost 20 years passed before the principal portal hepatotrophic factor was shown to be insulin.

Despite the absence of effective immunosuppression at that time, a solid basis for the future clinical use of orthotopic liver transplantation was laid throughout 1958 and 1959. At the April 1960 meeting of the American Surgical Association, Moore reported 31 canine experiments with 7 survivors of 4–12 days (Moore et al. 1960). In a published discussion of this paper, Starzl described his experience with more than 80 canine liver transplantations at Northwestern University (Starzl 1960); 18 of these animals had lived 4 to 20-1/2 days (Starzl et al. 1960, 1961). In both the Boston and Chicago series, rejection was present after 5–6 days and was usually the principal explanation for death. A few years later, Groth et al. (1968) demonstrated that a drastic reduction in hepatic blood flow was an integral part of the rejection process. The consequent ischemia made the liver a target for infection (Brettschneider et al. 1968b; Starzl 1969b).

Preservation of the transplanted liver was accomplished in experiments with intraportal infusion of chilled electrolyte solutions in much the same way as is practiced clinically today (Starzl et al. 1960, 1961). Improved infusates in the succeeding years (Wall et al. 1977; Benichou et al. 1977) eventually replaced the original lactated Ringer’s and saline solutions. Until 1987, however, the safe preservation time for human hepatic allografts was only 5–6 h. Since then, the University of Wisconsin solution (Jamieson et al. 1988) and other solutions have permitted reliable and safe refrigeration of human livers for 18–24 h (Kalayoglu et al. 1988; Todo et al. 1989).

In dogs, survival during recipient hepatectomy and installation of the transplanted liver (Starzl et al. 1960; Moore et al. 1960) required the use of external venous bypasses that passively redirected blood from the occluded splanchnic and systemic venous beds to the superior vena cava. Such venous decompression was later shown to be expendable in dogs submitted to common bile duct ligation several weeks in advance of liver replacement. The obvious safety factor was the development of venous collaterals secondary to the biliary obstruction through which the blocked portal blood could be decompressed (Picache et al. 1970).

It ultimately was recognized that venovenous bypasses were not absolutely essential in most human liver recipients who had chronic liver disease provided the transplants were done by experienced surgeons (Starzl et al. 1982). Nevertheless, the introduction of pump-driven venovenous bypasses in the 1980s (Fig. 3), first with (Starzl et al. 1982; Cutropia et al. 1972) and then without (Denmark et al. 1983; Shaw et al. 1984; Griffith et al. 1985) anticoagulation, made human liver transplantation a less stressful operation and placed it well within the grasp of most competent general and vascular surgeons (Starzl et al. 1989d, e).



Fig. 3
Pump-driven venovenous bypass, which allows decompression of the splanchnic and systemic venous beds without the need for heparinization



Intestine-Only Model


Alexis Carrel (later working with C.C. Guthrie) was the first to describe canine intestinal transplantation (Carrel 1902). More than half a century passed before Richard Lillehei and his coworkers replaced almost the entire small intestine in unmodified dogs after immersing the graft in iced saline for preservation (Lillehei et al. 1959). The clinical application of intestinal transplantation languished even after it was demonstrated in Toronto (Craddock et al. 1983), London (Ontario) (Grant et al. 1988), and Pittsburgh (Diliz-Perez et al. 1984) that the gut could be successfully replaced in animals under long-term immunosuppression. Isolated examples of successful human intestinal transplantation were not accomplished until the late 1980s (Deltz et al. 1986; Ricour et al. 1983; Goulet et al. 1992; Todo et al. 1992).


Liver Plus Intestine Combinations


At the same time as isolated canine liver transplantation was perfected in 1959, the more radical procedure of multiple organ engraftment (including the liver) was shown to be feasible (Starzl and Kaupp 1960; Starzl et al. 1962) (Fig. 2c). This multivisceral allograft was viewed as a grape cluster with a double arterial stem consisting of the celiac axis and superior mesenteric artery (Fig. 1, center). In clinical variations of the operation used nearly 30 years later, the grapes, or individual organs, were removed or retained according to the surgical objectives (Fig. 1, periphery). Both sources of arterial blood were always preserved if possible (Starzl et al. 1991a).

Observations in the original canine multivisceral experiments of 1959 have been verified in human recipients. First, rejection of the organs making up the composite graft is less severe than after transplantation of the individual organs alone (Starzl et al. 1962). In 1969, Calne and colleagues (1969) confirmed and extended this principle in pig experiments showing that kidney and skin grafts were protected from rejection by a cotransplanted liver. The hepatic protective effect also has been confirmed in rats (Kamada 1985) by the Japanese surgeon Naoshi Kamada and by many others. Most recently, Valdivia et al. (1993) demonstrated the cross-species protection of hamster heart and skin xenografts in rats by the simultaneous or prior xenotransplantation of a hamster liver.


The Risk of Graft-Versus-Host Disease


The specter of graft-versus-host disease (GVHD) was raised by the transplantation of multivisceral grafts. The features of GVHD had been described by Billingham and Brent (1956) and Trentin (1956) as early as 1956. However, their observations had been almost exclusively based on bone marrow or splenocyte (not whole organ) transplantation. Histopathological evidence of GVHD was found in canine multivisceral recipients of 1959 (Starzl et al. 1962) but without physiological manifestations.

By 1965, however, it was realized that the classical GVHD defined by Billingham and Brent could be caused either by the liver or by the intestine. In addition, a humoral variety of GVHD typified by hemolysis was observed, first in canine liver recipients (Starzl et al. 1965) and later in humans (Ramsey et al. 1984). Although GVHD posed an obvious threat to human intestinal or multivisceral recipients, studies by Monchik and Russell (1971) in mice greatly overestimated this risk. The first example of long survival (>6 months) of a functioning human intestinal graft was provided by a multivisceral recipient (Starzl et al. 1989a). The fact that this child had no evidence of GVHD at any post-transplant time provided a strong incentive to move forward with the development of the Pittsburgh Intestinal Transplantation Program.


The Pancreatic and Other Hepatotrophic Factors


Transplantation of the pancreas alone (Houssay 1929; DeJode and Howard 1962; Idezuki et al. 1968; Kelly et al. 1967) will not be considered in these historical notes because this procedure is performed clinically only for endocrine objectives. However, the importance of first-pass delivery of endogenous insulin to the liver is a vital concern in the design of all liver transplant procedures and of all pancreas transplant operations.

Welch’s belief that rejection of his auxiliary canine liver grafts (Welch 1955; Goodrich et al. 1956) was the explanation for their rapid atrophy (see earlier) was based on the long-standing belief that the source of portal venous blood was of no importance in the maintenance of “liver health” (Mann 1944; Child et al. 1953; Fisher et al. 1954; Bollman 1961). Although Welch’s view could not have been more wrong, he had unwittingly created an experimental model of great power, the principle of which was the coexistence in the same animal of competing livers (Starzl et al. 1964, 1973; Marchioro et al. 1965, 1967).

The competing liver principle was applied in nontransplant models by simply dividing the dog’s own liver into two parts, each of which was vascularized with portal venous inflow from different regions of the body (Marchioro et al. 1967; Starzl et al. 1973; Putnam et al. 1976) (Figs. 4 and 5). The key observation was that the liver fragment supplied with normal portal blood (see Fig. 4) flourished while the fragment given equal or greater quantities of substitute venous blood underwent acute atrophy. With a variety of double liver models (Figs. 4 and 5), the source of the hepatotrophic substances was localized first to the upper abdominal viscera and ultimately to the pancreas. Insulin and other hepatotrophic molecules were removed so completely with a single pass through the hepatic sinusoidal bed that little or none was left for the competing fragment. The deprived hepatocytes underwent dramatic atrophy within 4 days (Fig. 6). In crucial experiments, insulin, when infused continuously into the tied-off portal vein after portacaval shunt (Fig. 7), prevented most of the atrophy and other adverse consequences to the liver caused by portal blood deprivation (Starzl et al. 1976, 1979a; Francavilla 1991).



Fig. 4
The operation of partial (split) transposition in dogs. Note that one of the main portal veins (left in a, right in b) retains the natural splanchnic flow and that the other one receives the total input of the suprarenal inferior vena cava. RV, renal vein (a and b from Marchioro et al. 1967)




Fig. 5
Splanchnic division experiments. In these dogs, the right liver lobes received venous return from the pancreaticogastroduodenosplenic region, and the left liver lobes received venous blood from the intestines. (a) Nondiabetic dogs. (b) Alloxan-induced diabetic dogs. (c) Dogs with total pancreatectomy (a–c from Starzl et al. 1975a. By permission of Surgery, Gynecology and Obstetrics)




Fig. 6
Hepatocyte shadows traced during histopathological examination of liver biopsy specimens from the experiments shown in Figs. 4 and 5. These tracings were later cut out on standard paper and weighed as an index of hepatocyte size. The lobes with the large hepatic cells received venous blood from the pancreas, stomach, duodenum, and spleen. The relatively shrunken left lobes, with the small hepatocytes, received intestinal blood (From Starzl et al. 1973. By permission of Surgery, Gynecology and Obstetrics)




Fig. 7
Experiments in which postoperative infusions of insulin or other candidate hepatotrophic molecules are made into the left portal vein after performance of Eck’s fistula (From Starzl et al. 1976, © by The Lancet Ltd, 1976)

Insulin was, in fact, only the first identified member of a diverse family of eight molecules, the others of which perfectly mimicked its hepatotrophic effects (Table 2) (Francavilla et al. 1994a). Although none of these “hepatotrophic factors” enhanced hepatocyte proliferation when infused into intact animals, all eight augmented preexisting hyperplasia. The second of these eight factors to be discovered, then called hepatic stimulatory substance (HSS), was demonstrated in 1979 in a cytosolic extract from regenerating dog livers (Starzl et al. 1979a) and was later renamed “augmenter of liver regeneration” (ALR) (Francavilla et al. 1994a).


Table 2
Hepatotrophic/anti-hepatotrophic factors (by 1994)

Hepatotrophic
  Hormones: Insulin
  Hepatic growth factors: Augmenter of liver regeneration (ALR); Insulin-like growth factor II (IGF-II); Transforming growth factor α (TGF-α); Hepatocyte growth factor (HGF)
  Immunosuppressants: Cyclosporine; Tacrolimus
  Immunophilins: FK-binding protein 12 (FKBP12)

Anti-hepatotrophic
  Growth factors: Transforming growth factor β (TGF-β)
  Immunosuppressant: Rapamycin

From Francavilla et al. (1994a)

After a 14-year search for the identity of ALR, its molecular structure and expression in the rat, mouse, and human were elucidated (Hagiya et al. 1994). The mammalian DNA of ALR has 40–50 % homology with the dual-function nuclear gene scERV1 of baker’s yeast (Saccharomyces cerevisiae) (Giorda et al. 1996). The yeast gene contributes to the mitochondrial respiratory chain and also plays a critical role in cell replication. In the mouse, knockout of the ERV1 gene during embryogenesis is lethal. However, a study of mice with liver-specific conditional deletion of ALR showed that this peptide is required for mitochondrial function and for liver-dependent lipid homeostasis (Gandhi et al. 2015).

In addition to the diverse family of eight hepatotrophic factors, two molecules with specific anti-hepatotrophic qualities were identified (Table 2): transforming growth factor β and the immunosuppressant rapamycin (Francavilla et al. 1994a). These discoveries expanded hepatotrophic physiology into multiple research areas of metabolism and regenerative medicine. The laboratory research had immediate clinical implications.

After the demonstration that portal diversion severely damages the liver, the use of portacaval shunt to treat the complications of portal hypertension in humans was greatly reduced. However, a new use emerged: the altered liver function caused by the procedure was exploited to palliate human glycogen, cholesterol, or alpha-1-antitrypsin storage diseases. In turn, such palliation identified heritable storage disorders that could be effectively treated with liver replacement (Starzl and Fung 2010).

Another dimension of hepatotrophic physiology was the liver regeneration that follows partial hepatectomy. No matter how much is taken out, the portion of liver that remains is restored to the original size within 3 weeks in humans, and far more rapidly in animals. In transplant-specific studies, it was shown in rodents (Francavilla et al. 1994b), dogs (Kam et al. 1987), and ultimately humans that a “small (or large) for recipient size” liver allograft promptly normalizes its volume to that appropriate for the individual recipient. What initiates this liver regrowth, or alternatively downsizes the liver, and then stops the adjustment at just the right volume has remained an enigmatic question of “hepatotrophic physiology.”


Immunosuppression


After the demonstration by Medawar in 1944 that rejection is an immunological event (Medawar 1944, 1945), the deliberate weakening of the immune system was shown to ameliorate the rejection of skin grafts in rodents and renal grafts in dogs. Such immunosuppression was accomplished in animals with total body irradiation (Dempster et al. 1950), adrenal corticosteroids (Billingham et al. 1951; Morgan 1951), and much later the thiopurine compounds 6-mercaptopurine and azathioprine (Meeker et al. 1959; Schwartz and Dameshek 1960; Calne 1960; Zukoski et al. 1960; Calne and Murray 1961). However, the avoidance of rejection with a single modality was rarely achieved without lethal side effects (Murray et al. 1960, 1962, 1963; Woodruff et al. 1963; Goodwin and Martin 1963; Groth 1972; Hamburger et al. 1962; Kuss et al. 1962).

This discouraging picture changed dramatically during 1962 and 1963 at the University of Colorado, where the synergism of properly timed azathioprine and prednisone was discovered in animal investigations (Marchioro et al. 1964). When these two drugs were used together in Denver to treat human kidney transplant recipients (Starzl et al. 1963a; Starzl 1964), the results precipitated a revolution in clinical transplantation. The key observations were that organ rejection could usually be reversed with prednisone and then that the amount of drugs required often lessened with time (Starzl et al. 1963a, 1990; Starzl 1964; Hume et al. 1963).

The reversibility of kidney rejection and an apparent but unexplained change in host–graft relationship were eventually verified with all other transplanted organs, beginning with the liver (Starzl et al. 1965; Starzl 1969c). Although immunosuppression has improved, the central therapeutic strategy for whole organ transplantation that had emerged by 1963 (Starzl et al. 1963a; Starzl 1964) has changed very little in over 30 years. The strategy calls for daily treatment with one or two baseline drugs and further immunomodulation with the highly dose-maneuverable adrenocortical steroids (or other secondary or tertiary agents) to whatever level is required to maintain stable graft function. Every organ recipient goes through an algorithmic experience of trial and potential error as drug dosages are modified to achieve the desired maintenance levels. The principal baseline drugs used clinically with this format have been azathioprine, cyclosporine, and tacrolimus (Starzl 1964, 1969c, d; Starzl et al. 1963b, 1967, 1971, 1979b, 1980, 1989b, 1990; Hume et al. 1963; Franksson 1984; Strober et al. 1979; Najarian et al. 1982; Calne et al. 1979).


Clinical Liver Transplantation


Phase 1: The Failed First Cases – Once the effectiveness of the azathioprine-prednisone cocktail for kidney grafting had been established, a decision was taken at the University of Colorado to move on to the liver (Starzl et al. 1963c; Starzl 1992b). The first recipient was a 3-year-old boy with biliary atresia who had had multiple previous operations. The transplantation could not be completed because of a fatal hemorrhage from venous collaterals and an uncontrollable coagulopathy (prothrombin time infinity, platelet count <10,000/mm³). Even for a team that had been fully prepared for technical vicissitudes by hundreds of animal operations, the exsanguination of this child was a terrible shock.

Two more liver transplantations were carried out in the next 4 months. In both, the procedures seemed satisfactory, but the recipients died after 22 and 7 days, respectively (Starzl et al. 1963c; Starzl 1992b). Promotion of coagulation (fresh blood or blood products and ε-aminocaproic acid to treat fibrinolysis) had a delayed backfire. During the time when the livers were sewn in, the plastic external bypasses were used to reroute venous blood around the area of the liver in the same way as had been worked out in dogs.

Clots formed in the bypass tubing and passed to the lungs of the recipients. Abscesses and other lung damage contributed to or caused delayed death in all four of these patients (Starzl et al. 1963c, 1964). By this time, isolated attempts at liver replacement made in Boston (Moore et al. 1964) and Paris (Demirleau et al. 1964) had also been unsuccessful (Table 3). A pall settled over liver transplantation, and a self-imposed moratorium followed that lasted more than 3 years.


Table 3
The first seven attempts of clinical orthotopic liver transplantation

Number | Location | Age (years) | Disease | Survival (days) | Main cause of death
1 | Denver (Starzl et al. 1963c) | 3 | Extrahepatic biliary atresia | 0 | Hemorrhage
2 | Denver (Starzl et al. 1963c) | 48 | Hepatocellular cancer, cirrhosis | 22 | Pulmonary emboli, sepsis
3 | Denver (Starzl et al. 1963c) | 68 | Duct cell carcinoma | 7-1/2 | Sepsis, pulmonary emboli, gastrointestinal bleeding
4 | Denver (Starzl et al. 1964) | 52 | Hepatocellular cancer, cirrhosis | 6-1/2 | Pulmonary emboli, hepatic failure, pulmonary edema
5 | Boston (Moore et al. 1964) | 58 | Metastatic colon carcinoma | 11 | Pneumonitis, liver abscesses, hepatic failure
6 | Denver (Starzl et al. 1964) | 29 | Hepatocellular cancer, cirrhosis | 23 | Sepsis, bile peritonitis, hepatic failure
7 | Paris (Demirleau et al. 1964) | 75 | Metastatic colon carcinoma | 0 | Hemorrhage


Pessimism prevailed worldwide. The operation of liver replacement seemed too difficult to allow its practical application. In addition, the methods of preservation were assumed to be inadequate for an organ so seemingly sensitive to ischemic damage. Researchers began to ask whether the available immunosuppression was too primitive to permit success. This possibility was reinforced by the fact that truly long-term survival after liver replacement (i.e., measured in years) had not yet been achieved in experimental animals.

Phase 2: Feasible but Impractical – By the summer of 1967, these deficiencies had been at least partially rectified by 3 more years of laboratory effort. Many long-term canine survivors had been obtained (Starzl et al. 1965), and some dogs had passed the 3-year postoperative mark (Fig. 8). Better immunosuppression with so-called triple-drug therapy had become available with the development and first-ever clinical trials of antilymphocyte globulin (ALG). The ALG was prepared from the serum of sensitized horses (Starzl et al. 1967) and used to supplement azathioprine and prednisone in renal recipients. Finally, techniques of organ preservation for as long as a day had been developed (Starzl 1992c; Brettschneider et al. 1968).



Fig. 8
Photograph (1968) of a dog whose orthotopic liver transplantation had been carried out in the spring of 1964. The animal, who was treated with azathioprine for only 100 days, died of old age after 11-2/3 postoperative years. This was the first example of “hepatic tolerogenicity”

On July 23, 1967, a 1-1/2-year-old child with a huge hepatoma was restored almost immediately from a moribund state to seemingly good health after liver replacement. More cases followed. Most of the attempts made in 1967 and 1968 were initially successful, but all of the patients eventually died. The first long-term survivor succumbed to recurrent cancer after 400 days. The maximum survival of the other six long-surviving liver recipients treated between July 1967 and March 1968 was 2-1/2 years (Starzl et al. 1968, 1982; Starzl 1992d).

For the next 12 years, the 1-year mortality rate after liver transplantation never fell below 50 % in cases that were accrued at the University of Colorado at the rate of about 1 per month. The losses were concentrated in the first postoperative months; after this initial period, the survival curve flattened, leaving a residual group of stable and remarkably healthy survivors (Fig. 9). Of the first 170 patients in the consecutive series that started March 1, 1963, and ended in December 1979, 30 (18 %) lived more than 10 years; 23 remained alive after 13–23 years. All were treated with azathioprine (or the anticancer agent cyclophosphamide), prednisone, and polyclonal antilymphocyte globulin (Starzl et al. 1982).



Fig. 9
Patient survival during the successive eras in which the baseline immunosuppressant was azathioprine (bottom curve), cyclosporine (middle), and tacrolimus (upper)

In the meantime, Roy Calne of Cambridge University in England began clinical trials of liver transplantation on May 23, 1967. As had been experienced earlier, his first patient exsanguinated (Calne and Williams 1968). A few months later, Calne formed a collaboration that endured for more than two decades with the hepatologist Roger Williams at King’s College Hospital in London. The extended survival of patients in both the Colorado and Cambridge-London series was a testimonial for liver transplantation. It was asked increasingly on both sides of the Atlantic, however, if such a small dividend could justify the prodigious effort that had brought liver transplantation this far (Starzl 1992e).

Other teams organized in Hannover (Rudolf Pichlmayr, 1972), Paris (Henri Bismuth, 1974), and Groningen (Rudi Krom) also reported the nearly miraculous benefits of liver transplantation when this treatment was successful but always with the notation that the mortality rate was too high to allow its practical use. Liver transplantation remained a feasible but impractical operation.

Phase 3: The Cyclosporine and FK506 Eras – The frustration ended when cyclosporine became available for clinical use in 1979 (Calne et al. 1979) and was combined with prednisone or lymphoid depletion in the first of the cyclosporine-based cocktails (Starzl et al. 1980) (Fig. 9). Of the first 12 liver recipients treated with cyclosporine and prednisone in the first 8 months of 1980, 11 lived for more than a year (Starzl et al. 1981), and 7 were still alive more than a dozen years later. As the news was confirmed that a 1-year patient survival rate of at least 70 % was readily achievable, new liver programs proliferated worldwide.

When FK506 was substituted for cyclosporine in 1989 (Starzl et al. 1989b), the 1-year patient and liver graft survival rate rose again in the Pittsburgh experience (Todo et al. 1990) (Fig. 9), an improvement similar to that in a multicenter European trial. By this time, liver transplantation had become the accepted court of last appeal for almost all non-neoplastic liver disease and even for selected patients with otherwise nonresectable hepatic malignancies. The principal limitation of the technology quickly became the small supply of organs to meet the burgeoning need.
