5. Prognostic Implications of Physical Frailty and Sarcopenia Pre and Post Transplantation
Keywords
Liver transplantation · Prognosis · Frailty · Sarcopenia · Cirrhosis
Introduction
In the past 60 years, liver transplantation has become routine for a growing number of indications [1]. The main indication in the non-acute setting remains cirrhosis caused by alcoholic liver disease [1]. However, every year a larger proportion of patients develop cirrhosis due to metabolic factors other than alcohol consumption, as a consequence of nonalcoholic steatohepatitis (NASH) [2, 3]. The median age at which this condition develops is usually higher than for classical alcohol- or viral hepatitis-induced cirrhosis [4]. Risk factors for NASH include type II diabetes mellitus, metabolic syndrome, cardiovascular disease, and renal function impairment [3]. Many of these risk factors are also involved in the causal chain leading to frailty and sarcopenia. Concepts such as "inflammaging" have recently been introduced to describe the conditions in which these diseases of the elderly occur, mostly referring to higher levels of pro-inflammatory markers in blood and other tissues [5].
The development of frailty and sarcopenia shares many risk factors with the occurrence of cirrhosis. Diabetes mellitus and metabolic syndrome are long-recognized causes of sarcopenia and frailty. Sequelae of diabetes, combined with other factors such as cardiovascular disease, have been shown to explain up to 46% of the variance in muscle quality [6]. Diabetes mellitus and metabolic syndrome are also the main risk factors for developing nonalcoholic fatty liver disease and correlate with the risk and degree of progression of liver disease [7].
Different definitions of frailty and sarcopenia exist and are in use in the current literature [8, 9]. Whereas frailty is, by definition, observed in elderly patients, secondary sarcopenia may also be observed in younger patients with chronic diseases such as cirrhosis [10]. Moreover, many of the criteria used to define frailty and sarcopenia are inherent features of cirrhosis itself.
The leading model for defining frailty is the phenotype model, revolving around five factors: weight loss, exhaustion, low activity, slowness, and weakness [11]. The liver has a central position in many metabolic functions [12]. Loss of liver function therefore directly results in weight loss, which may be only partially masked by ascites [12, 13]. The loss of nutrient uptake, combined with chronic inflammatory processes, takes a large toll on the patient's condition and usually results in exhaustion and dependency in activities of daily living [14, 15].
As previously mentioned, sarcopenia is also observed in younger patients, and its definitions focus on skeletal muscle mass and function [10]. A causal link between chronic liver disease and sarcopenia has been suggested in murine models [16]. The pathways involved in the development of sarcopenia have been partially elucidated and include upregulation of the ubiquitin-proteasome system and oxidative stress [17, 18]. In clinical studies, NASH and the severity of cirrhosis, as quantified by the MELD (Model for End-Stage Liver Disease) score, were correlated with sarcopenic obesity, while alcoholic liver disease was associated with sarcopenia [19]. There are indications that sarcopenia, in turn, aggravates insulin resistance and dysglycemia, thus increasing the chance of developing NASH and NASH-induced cirrhosis [20].
While frailty and sarcopenia are linked syndromes, their prevalence and effect on disease course may differ across different etiologies of cirrhosis [21]. In an analysis comparing alcoholic liver disease with NASH, frailty was more prevalent in NASH patients, while sarcopenia was observed more frequently in alcoholic liver disease [21]. The results also suggested that frailty had a larger impact in NASH patients, while sarcopenia was more prognostic in alcoholic liver disease patients [21].
Frailty, Sarcopenia, and Prognosis in Cirrhosis
During the last decade, numerous studies have reported on the impact of sarcopenia and frailty in liver transplant candidates [8]. However, there is great variability in the metrics used to define the two syndromes. For frailty, several definitions have been assessed in patients with cirrhosis in recent studies [22]. The self-reported Fried Frailty Criteria, consisting of gait speed, exhaustion, physical activity, unintentional weight loss, and weakness, were prospectively validated for patients with cirrhosis [23]. In the clinical frailty score, patients are categorized by their physician [24]. Finally, the short physical performance battery consists of a series of short physical exercises, while the 6-minute walk test measures the maximum walking distance covered within 6 minutes [25, 26]. While there seems to be consensus that frailty is an important prognostic factor in cirrhosis patients, a definitive method for diagnosing frailty is still unavailable [22].
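To make the scoring logic of phenotype-based instruments concrete, the sketch below tallies the five Fried criteria and applies the commonly used cutoff of three or more positive criteria for frailty. It is an illustrative sketch only: the thresholds that define each individual criterion are sex- and population-specific and are assumed to have been scored beforehand.

```python
from dataclasses import dataclass

@dataclass
class FriedCriteria:
    # Each field records whether the corresponding criterion scored positive;
    # the underlying thresholds (e.g., for gait speed) are assumed to have
    # been applied already and are not reproduced here.
    unintentional_weight_loss: bool
    exhaustion: bool
    low_physical_activity: bool
    slow_gait_speed: bool
    weak_grip_strength: bool

def fried_score(c: FriedCriteria) -> int:
    """Number of positive criteria (0-5)."""
    return sum([c.unintentional_weight_loss, c.exhaustion, c.low_physical_activity,
                c.slow_gait_speed, c.weak_grip_strength])

def classify(score: int) -> str:
    """Commonly used categories: 0 robust, 1-2 pre-frail, >=3 frail."""
    if score >= 3:
        return "frail"
    return "pre-frail" if score >= 1 else "robust"

# Example: a candidate scoring positive on exhaustion, slowness, and weakness
patient = FriedCriteria(False, True, False, True, True)
print(fried_score(patient), classify(fried_score(patient)))  # 3 frail
```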
As described in a recent review, the exact definitions of sarcopenia used also differ greatly across studies [9]. Determining muscle wasting solely on the basis of imaging has been popular because it allows researchers to include patients retrospectively. Metrics such as walking distance and grip strength, though part of most sarcopenia definitions, have generally been neglected. This simpler definition of sarcopenia is often qualified as low skeletal muscle mass [9]. Several methods for determining sarcopenia on imaging enjoy popularity. A general distinction is made between measurement of a single muscle, as an indicator of general muscle status, and cross-sectional measurement of the total muscle area at a given anatomical level [27]. Once the choice of which muscles to measure has been made, different cutoffs are commonly used to define low skeletal muscle mass, varying from the lower percentiles of the included cohort to pre-defined criteria. Nevertheless, it remains to be seen which parameter of sarcopenia, skeletal muscle mass or function, has the best predictive value [28].
Many research groups focus on different CT-based measurements. Psoas muscle measurement strategies have consistently shown correlations with outcomes in cirrhosis patients, with proponents mostly advocating the simplicity of measuring a single muscle [9, 16, 27]. Nonetheless, in recent years there has been a shift toward using the cross-sectional muscle area at the level of the third lumbar (L3) vertebra, because of its demonstrated correlation with whole-body muscle mass as measured by the gold standard, dual-energy X-ray absorptiometry (DXA) [29, 30]. The cross-sectional muscle area at the level of L3 is often normalized for the patient's squared height and termed the skeletal muscle index (SMI) [9, 29, 31]. The correlation between SMI and single muscle measurements is generally disappointing. Moreover, expert groups consider no single muscle representative of whole-body muscle mass [27, 32, 33]. For example, data from a recent Korean study demonstrate a poor correlation between psoas muscle thickness/height and SMI, with a Pearson's r of 0.5 [16]. The discrepancy between single muscle measurements and SMI is further illustrated by a study specifically focusing on the value of the psoas muscle index (PMI) and SMI for predicting mortality in 353 cirrhosis patients [34]. Up to 66% of patients with low skeletal muscle mass might be misclassified when using the psoas muscle only [34]. A recent study further examined the components of the SMI and concluded that the paraspinal muscles, rather than the abdominal wall muscles, may be the ones correlated with complications and death in cirrhosis patients [35]. Although these results will have to be validated in other cohorts, they show that a definitive method to estimate muscle wasting, short of automatically measuring whole-body muscle mass, is yet to be established.
Although SMI is the most accepted measurement method among experts, no such consensus exists on the exact definition of low skeletal muscle mass [36]. One of the defining symptoms of cirrhosis is ascites, making weight- or BMI-based cutoffs, as sometimes used in cancer patients, a less obvious solution [37, 38]. Sex-specific cutoffs for use specifically in patients on the transplant waitlist for end-stage liver disease have been proposed by Carey and colleagues [36]. These are <50 cm2/m2 for men and <39 cm2/m2 for women, irrespective of weight or body mass, and have been validated with mixed results [38, 39]. Interestingly, in an analysis of Dutch waitlist patients, we found that the BMI-based sex-specific cutoffs as defined by Martin were more predictive of waitlist mortality than the Carey cutoffs [38]. Finding an ideal definition of low skeletal muscle mass and sarcopenia remains challenging, due to large differences between populations with regard to both body composition and disease characteristics, in particular between Western and Eastern patients [39].
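As a minimal sketch of how these definitions are applied in practice, the snippet below computes the SMI from the L3 cross-sectional muscle area and height, and then applies the Carey cutoffs quoted above; the function and variable names are illustrative and do not come from any published software.

```python
# Minimal sketch: SMI = L3 cross-sectional muscle area normalized for squared height.
def skeletal_muscle_index(l3_muscle_area_cm2: float, height_m: float) -> float:
    return l3_muscle_area_cm2 / (height_m ** 2)  # cm2/m2

# Sex-specific cutoffs proposed by Carey and colleagues (cm2/m2)
CAREY_CUTOFFS = {"male": 50.0, "female": 39.0}

def low_skeletal_muscle_mass(smi: float, sex: str) -> bool:
    return smi < CAREY_CUTOFFS[sex]

# Example: a 1.75 m tall man with 140 cm2 of muscle measured at L3
smi = skeletal_muscle_index(140.0, 1.75)                     # ~45.7 cm2/m2
print(round(smi, 1), low_skeletal_muscle_mass(smi, "male"))  # 45.7 True
```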
Factors in the causal pathway leading to sarcopenia and frailty have also been associated with poor waitlist outcomes. For example, a recent study described the association between low testosterone levels and sarcopenia in cirrhotic patients [40]. After correction for a number of confounders, including sex, BMI, MELD score, and etiology of cirrhosis, testosterone level proved the most important predictor of sarcopenia [40]. Apart from its predictive value, associations such as these might inform preventive treatment with testosterone to improve outcomes in patients with cirrhosis [40].
The Impact of Frailty and Sarcopenia on Waitlist Outcomes for Liver Transplantation
Drawbacks of the MELD Score
Currently, in the Netherlands and other Eurotransplant countries, the MELD score is employed to prioritize patients for liver transplantation [41]. Although the MELD score strongly predicts waitlist mortality, it inaccurately predicts survival in 15 to 20% of patients due to underestimation of disease severity [42]. One of the frequently mentioned drawbacks of the MELD score is the lack of objective parameters reflecting physical and nutritional status of patients. This led to the development of, for example, the MELDNa and five-variable MELD scores, incorporating sodium and albumin levels, respectively [43–45].
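For reference, the sketch below shows how the laboratory-only character of the MELD and MELDNa scores translates into a calculation: only bilirubin, INR, creatinine, and sodium enter the formulas, and no physical or nutritional parameter does. The constants and bounds follow the commonly cited (pre-MELD 3.0) formulas and should be treated as an approximation rather than an authoritative implementation of any allocation system.

```python
import math

def meld(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> int:
    # Laboratory values below 1.0 are raised to 1.0; creatinine is capped at 4.0
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(crea) + 6.43
    return max(6, min(40, round(score)))

def meld_na(meld_score: int, sodium_mmol_l: float) -> int:
    # MELDNa adds serum sodium, bounded to 125-137 mmol/L
    na = min(max(sodium_mmol_l, 125.0), 137.0)
    score = meld_score + 1.32 * (137 - na) - 0.033 * meld_score * (137 - na)
    return max(6, min(40, round(score)))

# Example: no term reflects muscle mass, gait speed, or nutritional status
base = meld(bilirubin_mg_dl=3.0, inr=1.8, creatinine_mg_dl=1.2)
print(base, meld_na(base, sodium_mmol_l=130))  # 19 24
```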
Frailty as Prognostic Factor Independent of the MELD Score
Since robust measures of frailty and sarcopenia were developed over the last decade, interest in these measures as prognostic markers in liver transplant candidates has increased concomitantly. Consequently, a wide range of frailty measures has been investigated in liver transplant candidates. The earlier mentioned Fried Frailty Instrument, ranging from 0 to 5 points, was developed to identify vulnerable elders at risk for death, long-term institutionalization, and postsurgical complications [46]. Using a cutoff of 3 for the Fried Frailty Instrument, 17% of outpatients listed for liver transplantation (MELD score ≥12) were considered frail. Frail patients had a higher MELD score and significantly higher rates of mild/moderate ascites and moderate hepatic encephalopathy. Waitlist mortality was significantly higher in frail patients (22% versus 10%, p = 0.03), and a 1-unit increase on the Fried Frailty scale was associated with a 50% increase in waitlist mortality risk, independent of MELD score [23]. Furthermore, hospitalization rates during the waitlist period were significantly higher in these patients [47]. Within the same study, other measures of frailty were also assessed. Thirty-one percent of the patients scored low on the short physical performance battery (SPPB), a combination of repeated chair stands, balance testing, and a 13-foot walk [48]; 24% had difficulty with at least one activity of daily living (ADL; self-reported daily self-care activities [49]); and 43% scored positive on the instrumental activities of daily living (IADL; self-reported activities that allow an individual to live independently [50]). Patients who died on the waitlist or were delisted had higher rates of frailty, inactivity, and functional impairment as assessed with the SPPB. The SPPB was independently associated with waitlist mortality with an HR of 1.20, whereas ADL and IADL were not [23].
Although clinical assessment (the "eyeball" test) is subjective and differs between physicians, it can identify patients at increased risk of waitlist mortality independent of the MELD score [51]. However, a recent study showed that the addition of the Liver Frailty Index (LFI) significantly improved the ability to predict waitlist mortality, with a reclassification of 34% [52]. Nevertheless, it should be noted that a gap exists between clinically assessed physical performance and objectively measured physical activity in liver transplant candidates, a population known for low activity levels [53].
Sarcopenia as Prognostic Factor Independent of the MELD Score
Hitherto, only one meta-analysis pooling data on the association between sarcopenia and waitlist mortality has been performed. It showed a pooled HR of 1.72 (95% CI 0.99–3.00, p = 0.050) with low heterogeneity (I2 = 33%). However, the evidence is limited, as only a few studies could be pooled due to the great variety in methodology used to measure skeletal muscle mass. Furthermore, data from three of the four poolable studies originated from one center, and only one of these three studies was included in the meta-analysis [9]. A correlation between sarcopenia and hepatic encephalopathy in particular was also described in a meta-analysis of 1795 patients, which observed an odds ratio of 2.38 [54].
In a recent study by Idriss and colleagues, the effect of previous bariatric surgery in cirrhotic patients listed for liver transplantation was investigated [55]. Seventy-eight patients who had previously undergone bariatric surgery were compared with a cohort of 156 patients matched by age, MELD score, and underlying liver disease. Almost 1 in every 2 patients (47.4%) had NASH, which had been associated with a sixfold increased risk of sarcopenic obesity in the earlier mentioned cohort of 207 American patients listed for liver transplantation [19]. Notably, BMI was comparable between the bariatric surgery and non-bariatric surgery groups. The rate of delisting or death was significantly higher among patients who had undergone bariatric surgery than among those who had not (33.3% versus 10.1%, p = 0.002), and the transplantation rate was significantly lower (48.9% versus 65.2%, p = 0.03). Previous bariatric surgery was independently associated with an increased risk of waitlist death (HR 5.7), which was, however, attenuated by malnutrition. Interestingly, skeletal muscle index (measured on CT) was associated with malnutrition. Furthermore, the skeletal muscle area was significantly lower in the bariatric surgery group, and the prevalence of sarcopenia was significantly higher among delisted patients. Consequently, strict selection of liver transplant candidates who have previously undergone bariatric surgery is warranted.
Incorporating Sarcopenia and Frailty Measures into the MELD Score
Taking into account the results described in the previous paragraph, frail and sarcopenic patients are exposed to an increased risk of waitlist mortality. Indeed, patients with muscle atrophy, which is highly correlated with frailty, malnutrition, and physical impairment, may be underprioritized using the current allocation system [56]. This led to the development of different scores that incorporated skeletal muscle mass into the MELD score. As this subject will be elaborated further in Chap. 13, here we will only mention it briefly.
First, Durand and colleagues developed the MELD-psoas score, using axial and transverse psoas thickness measurements on CT. The discrimination of the MELD-psoas score was superior to that of the MELD score alone, particularly in patients with a low MELD score (i.e., ≤25) [56]. In another study, cross-sectional skeletal muscle mass measurements on CT were used to develop the MELD-sarcopenia score, with sarcopenia as a dichotomous variable. Prediction of waitlist mortality significantly improved after this modification of the MELD score [57]. This score was externally validated by our group in a cohort of 585 patients listed for transplantation. The results showed that the discriminative performance of the MELD-sarcopenia score for 3-month mortality (c-index 0.82) was lower than that of the MELD score alone (c-index 0.84). However, inclusion of sarcopenia in a model together with MELD score, age, and presence of hepatic encephalopathy improved the discriminative performance to a c-index of 0.85. Again, the independent additive predictive effect of sarcopenia was particularly present in patients with low MELD scores (i.e., ≤15) [38]. In conclusion, sarcopenia may be a valuable addition to the MELD score to identify patients with an increased mortality risk who would otherwise be missed.
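A minimal sketch of this kind of modification is shown below: sarcopenia enters the MELD score as a dichotomous indicator with a fixed weight. The weight of 10.35 points is the one commonly quoted for the original MELD-sarcopenia derivation, but it is included here as an assumption and should be verified against the source publication [57].

```python
SARCOPENIA_WEIGHT = 10.35  # assumed weight; verify against the original derivation

def meld_sarcopenia(meld_score: float, sarcopenic: bool) -> float:
    # Sarcopenia as a dichotomous variable added on top of the MELD score
    return meld_score + SARCOPENIA_WEIGHT * int(sarcopenic)

# Example: two waitlist patients with identical MELD but different muscle status
print(meld_sarcopenia(14, sarcopenic=True))   # 24.35
print(meld_sarcopenia(14, sarcopenic=False))  # 14
```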
As mentioned above, factors in the causal pathway of sarcopenia development may also be used to enhance prognostic models. A recent study showed that both sarcopenia and low plasma testosterone levels are associated with waitlist mortality in male cirrhotic patients [58] and testosterone supplementation in these patients showed promising results [59]. The predictive value of low testosterone levels may be stronger than sarcopenia and may be a proper alternative to add to the MELD score [58]. Future studies should determine which factors could be added to improve the predictive value of the MELD score.
Following the study of Lai and colleagues showing the association between frailty and mortality in cirrhotic transplant candidates [23], a frailty index for patients with cirrhosis in particular was developed, since measures such as the Fried Frailty Instrument and SPPB were originally developed in community-dwelling elderly without (liver) disease. In 536 patients with cirrhosis with a median MELDNa score of 18 listed for transplantation, performance-based (gait speed, handgrip strength, chair stands, balance) and self-reported measures (unintentional weight loss, exhaustion, physical activity, activities of daily living, instrumental activities of daily living) were performed. Using a subset Cox regression analysis to select measures with the best predictive value, grip strength, chair stands, and balance were selected to form the Liver Frailty Index (LFI). The c-index to predict 3-month waitlist mortality was 0.80 for the MELDNa score and increased to 0.82 when MELDNa was combined with the newly developed LFI. Using the combination of MELDNa and the LFI, 16% of deaths/delistings were correctly reclassified, resulting in a net reclassification index of 19% [60]. Adjusted for disease severity and baseline physical status, physical function significantly declines during the waitlist period in cirrhotic patients. This functional decline, which has also been correlated with a decrease in skeletal muscle mass [61], is an independent predictor for waitlist mortality [62].
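For illustration, the sketch below mirrors the structure of the LFI as a weighted sum of grip strength, chair-stand rate, and balance time. The coefficients are those reported for the published LFI calculator as best as recalled here; treat them, along with the robust/frail cutoffs in the comment, as assumptions to be checked against the original publication [60] before any use.

```python
def liver_frailty_index(grip_strength_adj: float,
                        chair_stands_per_sec: float,
                        balance_time_sec: float) -> float:
    # grip_strength_adj: sex-adjusted grip strength as defined by the developers.
    # Coefficients below are assumed, not verified against the source publication.
    return (-0.330 * grip_strength_adj
            - 2.529 * chair_stands_per_sec
            - 0.040 * balance_time_sec
            + 6.0)

# Commonly quoted categories (assumed): robust < 3.2, frail >= 4.5
lfi = liver_frailty_index(grip_strength_adj=2.5,
                          chair_stands_per_sec=0.3,
                          balance_time_sec=30.0)
print(round(lfi, 2))  # 3.22
```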
The Economic Burden of Frailty and Sarcopenia in Liver Transplant Candidates
The association of frailty and sarcopenia with waitlist mortality is reflected by a significantly higher number of complications requiring hospitalization. In an American study by Dunn and colleagues, frailty, as measured by gait speed, was an independent risk factor for hospitalization, and a 0.1 m/s decrease in gait speed was associated with 22% more hospital days. Similar, but non-significant, results were found for handgrip strength. Moreover, an incremental increase in gait speed was associated with fewer hospital days and lower hospital costs during the listing period: 6.2 days ($24,800/year) in patients with a gait speed of 1 m/s, 21.2 days ($84,800/year) in patients with a gait speed of 0.5 m/s, and 40.2 days ($160,800/year) in patients with a gait speed of 0.25 m/s [25]. In a European study investigating the association between low skeletal muscle mass (CT-assessed skeletal muscle index) and hospital costs, similar results were found: median total hospital costs in patients with sarcopenia were €11,294 (IQR 3,570–46,469) compared with €6,878 (IQR 1,305–20,683) in patients without sarcopenia (p < 0.001). An incremental increase in skeletal muscle index was independently associated with a decrease in total hospital expenditure (€455 per incremental SMI, 95% CI 11–900, p = 0.045) [63]. Interestingly, both studies were performed in tertiary centers and might therefore underestimate the real costs, as patients may have been admitted to referring hospitals without notice.
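A back-of-the-envelope sketch of the figures reported by Dunn and colleagues is given below; the reported day counts imply a cost of roughly $4,000 per hospital day, and the linear interpolation between the three reported gait speeds is an assumption added purely for illustration.

```python
# Reported annual hospital days by gait speed (m/s), from the study cited above
REPORTED_DAYS = {0.25: 40.2, 0.5: 21.2, 1.0: 6.2}
COST_PER_DAY = 24_800 / 6.2  # ~$4,000/day, implied by the reported cost figures

def estimated_annual_cost(gait_speed_m_s: float) -> float:
    # Piecewise-linear interpolation between reported points (illustrative assumption)
    xs = sorted(REPORTED_DAYS)
    ys = [REPORTED_DAYS[x] for x in xs]
    v = min(max(gait_speed_m_s, xs[0]), xs[-1])
    for i in range(len(xs) - 1):
        if xs[i] <= v <= xs[i + 1]:
            frac = (v - xs[i]) / (xs[i + 1] - xs[i])
            return (ys[i] + frac * (ys[i + 1] - ys[i])) * COST_PER_DAY
    return ys[-1] * COST_PER_DAY

print(round(estimated_annual_cost(0.75)))  # ~54800 dollars/year at 0.75 m/s
```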
The Impact of Frailty and Sarcopenia on Outcomes After Liver Transplantation
Efforts to define the impact of sarcopenia and frailty on the post-transplantation disease course have led to differing conclusions. In the postoperative course, there are indications that low SMI and low psoas muscle area/index are associated with postoperative complications, most commonly bacterial infection, and with longer hospital stay [31, 64–67]. One larger American study examining 1-year postoperative complications found that patients with a lower psoas muscle area were 1.4 times more likely to experience a complication and 2.8 times more likely to experience failure to rescue (mortality due to a severe complication) [66]. A study specifically examining this subject found a fourfold increased risk of severe infection [65]. Frailty as defined by the Karnofsky score was highly associated with postoperative mortality in an American nationwide survey [68]. Interestingly, infection (15.7%) was among the most frequently reported causes of death, together with cardiovascular factors (25.4%), technical factors (17.9%), and graft failure (16.7%) [68]. Patients with either a frailty or a sarcopenia syndrome might have a higher risk of bacterial infections, sepsis, and multiple organ failure because of links between these syndromes and impaired immunity [69, 70]. The effect of frailty and sarcopenia on graft failure has, to our knowledge, not been reported in the current literature.
The MELD-psoas and MELD-sarcopenia scores that were developed to predict waitlist mortality have not been validated for predicting post-transplant survival [56, 57]. As the MELD score was developed to predict waitlist mortality rather than post-transplant mortality [42], different predictive tools for post-transplant mortality need to be developed. The role of sarcopenia as a postoperative risk factor has primarily been examined in a number of retrospective studies. Sarcopenia was not associated with long-term postoperative survival in two studies measuring SMI and one study measuring psoas muscle area [19, 31, 67]. In contrast, psoas muscle area as a continuous variable proved most predictive of postoperative survival in a model correcting for other known risk factors (HR per mm2: 0.27; p < 0.001) [71]. This correlation between psoas muscle area and survival was also observed in later studies [64, 72].
Most studies examining the postoperative disease course used preoperative measurements as prognosticators. As a consequence, little is known about the impact of post-transplant changes in sarcopenia and frailty status. A longitudinal study indicated that frailty initially worsens post-transplant and that pretransplant frailty, as defined by the Liver Frailty Index, predicts postoperative frailty [73]. The same was true for sarcopenia, defined by SMI [19, 31]. There are reports of patients who developed postoperative sarcopenia, which was correlated with impaired long-term survival in retrospective studies [74–76].
A drawback of the retrospective nature of most studies examining post-transplantation survival is that only patients deemed fit for transplantation were included. Because of this selection bias, the role of frailty and sarcopenia in postoperative outcomes may have been underestimated [77]. Although it seems unlikely that many patients have been excluded on the basis of muscle mass alone, the features associated with the frailty phenotype may have influenced treatment choices [78].
The Economic Burden of Frailty and Sarcopenia in Transplanted Patients
Although costs have been shown to be higher in patients with sarcopenia and frailty awaiting transplantation [25, 63], no studies have been performed in patients actually undergoing transplantation. An exploratory analysis was performed in our Dutch study, but no significant differences could be demonstrated, probably because the study was not designed for this purpose [63]. Nevertheless, sarcopenia and frailty are strongly associated with post-transplant complications, and results may therefore be extrapolated to the post-transplant situation. After all, increased hospital expenditure in patients with sarcopenia has previously been described in various surgical cohorts [79, 80].
Frailty, Sarcopenia, Depression, and Quality of Life
Besides physical complications and mortality, frailty is also associated with mental impairment. The reported prevalence of depression in 500 end-stage liver disease patients screened for transplantation was 39.4%. Frail patients, as assessed with the Fried Frailty Index, were more likely to experience depression (54% versus 29%, p < 0.001). Depression symptoms increased proportionally with the severity of frailty; the most frail patients were 3.6 times more likely to experience depression than the least frail patients. In this study, frailty and depression were strongly correlated, whereas disease severity was not [81]. Another study among 213 patients listed for liver transplantation showed a significant association between the 6-minute walk distance and health-related quality of life, assessed by Short Form 36 (SF-36) questionnaires, whereas this association was not found for sarcopenia [28]. These results warrant awareness of depression and quality of life in the consulting room of the transplant physician. Lastly, in a study including 305 cirrhotic outpatients from Canada, a depression prevalence of 18% was found using the Mini-International Neuropsychiatric Interview (MINI). In these patients, lower baseline health-related quality of life and higher frailty scores were observed [82].
Living-Donor Liver Transplantation
Sarcopenia and Frailty in Living-Donor Liver Transplantation
Living-donor liver transplantation (LDLT) was first described in 1987 by Raia and colleagues [83]. Although the procedure itself was successful (a mother donated part of her liver to her son), the recipient died shortly after the transplant [83]. The first successful mother-to-son transplantation was performed by the Australian surgeon Strong [84]. After refinement of the technique, waitlist-related mortality in children decreased greatly [85, 86]. Nevertheless, when possible, deceased-donor transplantation remains the option of first choice [85], as potential complications in healthy donors are thereby avoided [87, 88]. The first adult-to-adult LDLT was performed in Hong Kong in 1997 [89]. Since then, LDLT has led to a strong increase in available donor organs. Nowadays, LDLT is predominantly performed in Asia, where a great shortage of organ donors remains.
During the last few years, multiple studies, indeed all performed in Asia, have investigated the impact of sarcopenia in patients undergoing LDLT [90–98]. Although methodology differs greatly between studies, outcomes were highly comparable, showing associations between sarcopenia and post-transplant complications and mortality. The only published systematic review and meta-analysis on the effect of low skeletal muscle mass in liver transplant patients [9] reported an HR of 2.78 (95% CI 1.59–4.85, p = 0.0003; Z = 3.60, I2 = 24%) after pooling data from two studies among LDLT patients that used the psoas muscle index to quantify skeletal muscle mass [90, 91].
Modification of the MELD Score for LDLT
Comparable with the MELD-sarcopenia score for patients awaiting orthotopic liver transplantation [38, 57], Hamaguchi and colleagues developed the Muscle-MELD score to predict overall survival after LDLT [92]. Using Cox regression analysis, the Muscle-MELD score was constructed from the MELD score, myosteatosis [low intramuscular adipose tissue content (IMAC)], and low skeletal muscle mass [low psoas muscle index (PMI)], all measured on CT images. The cutoffs for low IMAC and low PMI were determined using receiver operating characteristic (ROC) curves, and the Muscle-MELD score was calculated as follows: MELD score + 27.0 × low IMAC + 25.2 × low PMI. The cutoff distinguishing low from high Muscle-MELD scores was likewise defined using ROC curves. Overall survival was significantly higher in patients with a low Muscle-MELD score than in those with a high score. The Muscle-MELD score predicted post-transplant survival at 3, 6, and 12 months after transplantation more accurately than the MELD score alone. The independent odds ratio for mortality at 6 months after transplantation was 6.7 (95% CI 3.3–14.7, p < 0.001) for the Muscle-MELD score.
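A minimal sketch of the published formula, as quoted above, is shown below; the ROC-derived cutoffs that decide whether IMAC and PMI count as "low" are not reproduced here and would have to be taken from the original study [92].

```python
def muscle_meld(meld_score: float, low_imac: bool, low_pmi: bool) -> float:
    # Muscle-MELD = MELD + 27.0 x low IMAC + 25.2 x low PMI (dichotomous indicators)
    return meld_score + 27.0 * int(low_imac) + 25.2 * int(low_pmi)

# Example: identical MELD scores, very different Muscle-MELD scores
print(muscle_meld(18, low_imac=False, low_pmi=False))  # 18.0
print(muscle_meld(18, low_imac=True, low_pmi=True))    # 70.2
```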
Sarcopenia and Nutrition in LDLT
In a study using bioelectrical impedance analysis (BIA) to measure muscle mass, 21 of 47 patients with sarcopenia and 42 of 77 patients without sarcopenia received perioperative nutritional therapy [98]. Nutritional therapy was started 2 weeks before LDLT, after BIA. It consisted of a nutrient mixture enriched with branched-chain amino acids (BCAAs) or BCAA nutrients as a late evening snack, glutamine-enriched supplementation products, dietary fiber and oligosaccharides three times daily, a lactic fermented beverage containing a Lactobacillus casei strain once a day, and zinc supplementation in patients with low serum zinc levels. Postoperatively, enteral nutrition was started within the first 24 hours via tube jejunostomy, with caloric intake gradually increased until postoperative day 3 and maximized by day 5. Oral nutrition was started after recovery of swallowing ability, usually around day 5. Within the group of patients with low skeletal muscle mass, those who received perioperative nutritional therapy showed significantly better overall survival than those who did not. Both groups had comparable preoperative Child-Pugh and MELD scores. Notably, patients with normal or high skeletal muscle mass did not benefit from perioperative nutritional therapy in terms of overall survival. Both preoperative low skeletal muscle mass and perioperative nutritional therapy were independently associated with mortality after transplantation. The main cause of death was sepsis: 9 of 20 deaths in patients with sarcopenia and 4 of 11 deaths in patients without sarcopenia.
A pilot study investigated the relationship between plasma amino acid levels, post transplant sepsis, and sarcopenia (based on psoas muscle measurements) in patients undergoing LDLT [97]. Indeed, leucine, isoleucine, and glutamine were significantly lower in patients with sarcopenia compared with patients without sarcopenia. Lower plasma glutamine levels were an independent risk factor for post transplant sepsis (OR 5.4, p = 0.002). Plasma glutamine levels were significantly lower after LDLT compared with before LDLT in patients with sepsis, whereas levels were comparable in patients without sepsis. In both groups without early nutrition, plasma glutamine levels were significantly decreased after LDLT compared with before LDLT. However, when stratified for sarcopenia, plasma glutamine levels were significantly decreased after LDLT independently of early nutrition in the sarcopenia group, whereas no significant decrease was observed in non-sarcopenic patients with early nutrition. Hence, early postoperative nutrition and supplementation of glutamine may prevent post transplant sepsis and mortality. Further research is therefore warranted.
Sarcopenia and Frailty in Pediatric Liver Transplantation
Just as the MELD score lacks objective parameters reflecting patients' physical and nutritional status (i.e., comorbidity and frailty) in adults, so does the Pediatric End-Stage Liver Disease (PELD) score in children. Recently, a study applied the five Fried Frailty Criteria (weakness, slowness, shrinkage, exhaustion, and diminished physical activity) [46] to children with compensated chronic liver disease (CCLD) and children with end-stage liver disease (ESLD) listed for liver transplantation [99]. The test scores were adjusted for age and sex, with a maximum score of 10. The median frailty score was significantly higher in children with ESLD than in those with CCLD (median 5 [IQR 4–7] versus 3 [IQR 2–4], p < 0.001). The area under the curve for the frailty score to differentiate between ESLD and CCLD was 0.83 (95% CI 0.73–0.93), similar to that of the PELD and MELDNa scores. In total, 46% of children with ESLD were considered frail using a cutoff of 5. The frailty scores did not correlate with physicians' subjective clinical assessments. Although frailty was not shown to be predictive of worse outcomes, this tool may help identify the most vulnerable children.
Only one study investigating the impact of sarcopenia has been performed in pediatric liver transplant recipients [100]. In this Canadian study, DXA was used to measure fat mass, fat-free mass, and skeletal muscle mass at various time points in 58 children aged 0.5–17 years. In total, 41% of the children developed sarcopenia after transplantation, with a mean age at sarcopenia detection of 7.6 (SD 3.1) years and a mean time from transplantation of 1.2 (SD 1.9) years. Persistence of sarcopenia after transplantation was associated with poorer growth, recurrent hospitalization (total, intensive care unit, emergency, and readmission), and ventilator dependency, but not with graft rejection or corticosteroid therapy.
Liver Transplantation Beyond the Milan Criteria: Is There a Role for Frailty and Sarcopenia Assessment?
The Milan criteria were introduced in 1996 to select patients with liver cirrhosis and hepatocellular carcinoma (HCC) eligible for liver transplantation [101]. These criteria have since been universally accepted. The Milan criteria state that patients are selected for liver transplantation when there is a single lesion smaller than 5 centimeters, or up to three lesions each smaller than 3 centimeters, without vascular invasion. Patients meeting the Milan criteria have a 5-year survival rate of at least 70% and a recurrence incidence of only 10% [102, 103]. However, in recent years, it has been argued that the Milan criteria are too restrictive. Although most patients with HCC beyond the Milan criteria may experience disease recurrence after liver transplantation, leading to decreased 5-year survival rates (53.6% versus 73.3%) [103], a number of patients may still benefit and reach 5-year survival rates comparable with those of recipients fulfilling the Milan criteria. The latest proposal to select eligible patients is known as the up-to-seven rule: the sum of the number of nodules and the maximum diameter (in centimeters) of the largest nodule must not exceed seven, which resulted in a 5-year overall survival of 70% in patients transplanted beyond the Milan criteria [103]. Skeletal muscle mass and frailty may be biomarkers to identify patients with HCC who may or may not benefit from liver transplantation beyond the Milan criteria. Indeed, a recent study among 92 patients undergoing LDLT identified sarcopenia (i.e., height-normalized psoas muscle thickness <15.5 mm/m at the level of L3) as a risk factor for tumor recurrence after transplantation using a competing risk analysis (HR 9.5, 95% CI 1.2–76.3, p = 0.034) [104]. Since this was the only study of its kind, its results should be validated.
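To make the two selection rules concrete, the sketch below checks a list of lesion diameters against the Milan criteria and the up-to-seven rule as described above; it is illustrative only and ignores the imaging and staging nuances that apply in practice.

```python
from typing import List

def within_milan(lesion_diameters_cm: List[float], vascular_invasion: bool) -> bool:
    """Single lesion < 5 cm, or up to three lesions each < 3 cm, without vascular invasion."""
    if vascular_invasion or not lesion_diameters_cm:
        return False
    if len(lesion_diameters_cm) == 1:
        return lesion_diameters_cm[0] < 5.0
    return len(lesion_diameters_cm) <= 3 and all(d < 3.0 for d in lesion_diameters_cm)

def within_up_to_seven(lesion_diameters_cm: List[float]) -> bool:
    """Number of nodules plus the largest diameter (cm) must not exceed seven."""
    return len(lesion_diameters_cm) + max(lesion_diameters_cm) <= 7.0

# Example: four 1.5 cm nodules fall outside Milan but satisfy the up-to-seven rule
lesions = [1.5, 1.5, 1.5, 1.5]
print(within_milan(lesions, vascular_invasion=False))  # False
print(within_up_to_seven(lesions))                     # True (4 + 1.5 = 5.5)
```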
Interventions
Pretransplant
Prehabilitation in surgical populations has gained increasing interest and priority in recent years [105]. The waitlist period offers a window of opportunity to improve functional status and skeletal muscle mass. The differences in costs between patients with and without frailty and/or sarcopenia justify efforts and resources to seek therapies and strategies that halt or reverse the processes that send these patients spiraling down a vicious circle [25, 63]. Besides infection control, ascites control, and protection of renal function in liver transplant candidates [106], strategies aimed at reducing sarcopenia and frailty should be multidimensional and, at a minimum, combine nutrition, exercise, and ammonia-lowering therapies with or without novel pharmacological therapy [107]. These will be briefly mentioned here and further elaborated in Chaps. 7 and 8.
Nutrition
Impaired protein synthesis due to hyperammonemia, increased cytokine production and a hyperinflammatory state, hormonal abnormalities, direct effects of ethanol, impaired skeletal muscle signaling pathways, and possibly splanchnic vasodilatation leading to a hyperdynamic circulation all contribute to the catabolic state of cirrhotic patients [108, 109]. Hence, nutritional supplementation is recommended and ideally guided by indirect calorimetric measurement of energy expenditure [110]. However, enteral intake may be hindered by early satiety due to ascites, taste distortions due to zinc deficiency, and encephalopathy; nasogastric feeding may therefore be needed to achieve caloric goals [110]. Although evidence is scarce (well-designed clinical trials are lacking) and understanding of the mechanisms involved is poor, some evidence supports the effectiveness of late evening snacks, BCAA supplementation, and high-protein/high-calorie diets [111, 112]. Future studies regarding the effect of nutrition on body composition, physical status, and frailty in cirrhotic patients are highly warranted, because current evidence is predominantly preclinical and experimental [109].
Exercise
Because frailty and sarcopenia result from, among other factors, inactivity, and because physical strength, endurance, and balance are significantly associated with frailty [113], exercise interventions may improve physical status, skeletal muscle mass, and frailty. It is recommended to combine exercise with tailored nutritional interventions [114]. Over the last decade, multiple trials investigating exercise interventions in patients with cirrhosis have shown positive results. However, as described by Dunn, "the most formidable challenge to arrest frailty and sarcopenia may be the reluctance of transplant candidates and their caregivers to add another demand, especially exercise, to an already-difficult care regimen" [107].
Pharmacological
Hyperammonemia, leading to impaired skeletal muscle protein synthesis and increased protein breakdown, plays a key role in skeletal muscle wasting [115]. In experimental animal studies, lowering ammonia with oral L-ornithine L-aspartate and rifaximin for 4 weeks significantly improved lean body mass and grip strength [116]. However, the effect on survival is not yet known, and human studies are lacking. Another promising pharmacological intervention is inhibition of myostatin by blocking the ActRIIB pathway, developed to halt cancer cachexia-associated skeletal muscle wasting, which overlaps considerably with skeletal muscle wasting in cirrhosis [117]. However, trials in humans, particularly those with cirrhosis, are highly warranted. In males with low testosterone levels, testosterone supplementation significantly increases skeletal muscle mass [59]. The effect on survival and complications remains to be investigated.
Cognitive
Since frailty and sarcopenia are associated with depression and decreased quality of life in cirrhotic patients, it seems obvious that these patients may benefit from cognitive and psychological support. Currently, no trials have been performed in liver transplant candidates.
Post transplant
The influence of post-transplant sarcopenia on disease outcome remains unclear, since studies are heterogeneous and report conflicting results [75]. Although some studies showed that skeletal muscle wasting is arrested and frequently improves after liver transplantation, particularly in those with the lowest skeletal muscle mass [118], other studies showed that sarcopenia did not resolve after transplantation [19]. In a prospective post-transplant follow-up of 53 patients (median follow-up time 19.3 months), new-onset post-transplant sarcopenia developed in 14 patients (26%). Post-transplant skeletal muscle loss was a risk factor for new-onset diabetes mellitus, and a trend towards higher mortality was observed. No correlation with pretransplant characteristics was found [76]. Not only liver transplantation but also transjugular intrahepatic portosystemic shunt (TIPS) creation may lead to skeletal muscle gain with a consequent decrease in mortality [119].
The previously mentioned study by Lai and colleagues found that the LFI was worse 3 months after transplantation, comparable at 6 months, and improved 12 months after transplantation compared with pretransplant levels. Pretransplant frailty (i.e., LFI ≥4.5) was an independent predictor of post-transplant robustness (i.e., LFI ≤3). Less than 40% of patients, however, reached robustness post-transplant [73].
In a study among LDLT patients, skeletal muscle mass worsened after transplantation and did not return to pretransplant levels until 1 year after transplantation, whereas grip strength returned to pretransplant levels 6 months after transplantation [95]. Another study showed that trunk muscle mass was successfully restored after LDLT, particularly in patients with the lowest skeletal muscle mass [96]. These findings leave room for rehabilitation programs in patients undergoing liver transplantation, and future studies should elucidate their effectiveness.
Ethical Considerations
Due to the general scarcity of liver donors and an increasing number of cirrhosis patients, the ethical aspects of transplantation allocation need to be considered [120]. This is illustrated by the Eurotransplant 2017 report, detailing 2,548 patients on the waitlist in 2017 and 1,674 patients transplanted [121]. Currently, there is no consistent agreement in Western Europe on whether the sickest patients or the patients with the greatest possible health benefit should be prioritized [120]. Dutch law prescribes a combination of prospect of success and urgency and, only if these factors do not sufficiently differentiate, waiting time [122]. The principles of urgency and prospect of success are often contradictory, as discussed earlier in this chapter, and this prioritization leaves room for interpretation. In practice, a combination of waiting time and MELD score is utilized [41].
The discussion also focuses on the rather high standards patients have to meet in order to be eligible for transplantation [120]. In the Netherlands, patients have to be younger than 70 years and must not have extrahepatic malignancies or multi-organ failure. In addition, the Milan criteria are often applied for transplantation in patients with HCC [101]. The ethical implications of these rather simplistic rules are heavily debated [120]. The case can be made that by focusing on long-term prognosis for inclusion of individual patients, the law, which prescribes prioritization by prospect of success and urgency relative to others, is disregarded [120].
The pertinent question in this chapter is whether sarcopenia and frailty metrics need to be entered into the decisions pertaining to organ allocation and eligibility for waitlist placement. In order for the inclusion to be ethical, a number of requirements will have to be met, the most important being discrimination and calibration of the predictions based on sarcopenia and frailty and their applicability to the setting and population in which they are utilized [122]. While there are indications that sarcopenia and frailty impact both waitlist and posttransplantation outcomes, their exact implications are not yet adequately quantified. We therefore believe more, and in particular prospective, research will need to be conducted before these syndromes can be fully considered.
Future Perspectives
Patients with cirrhosis are a vulnerable population, with many risk factors barring successful treatment of their disease. Sarcopenia and physical frailty are two separate yet related syndromes that commonly occur in this patient population and are adversely associated with outcomes. While efforts have been made to quantify the impact of both syndromes, mostly with concordant results, a number of methodological issues and gaps in knowledge remain to be resolved.
Future trials on prognostication in transplantation patients should be aimed at finding answers to important open questions. One of the objections to the current, primarily retrospectively gathered evidence is that it is unknown whether the impact of frailty and sarcopenia has already been included in the clinical decision-making [77]. Another caveat is that most effects of sarcopenia and frailty described in this chapter are based on association rather than causation. It remains unclear whether sarcopenia and frailty are epiphenomena present in patients in worse clinical condition or whether these syndromes are the cause of the inferior outcomes. This is important for choosing the nature of the intervention: to alleviate the muscle wasting and frailty symptoms or to treat underlying factors.
While mortality in cirrhosis patients appears to be highly correlated with sarcopenia and frailty, a standardized metric of frailty and sarcopenia is required in order to enhance selection of patients and enact interventions [123]. This can be achieved by critically and quantitatively reviewing the current body of evidence, followed by prospective validation efforts. Only then can frailty evaluation (or assessment) be employed to its full potential to improve care in transplantation patients [123, 124].