Introduction
A successful renal transplant is the most effective way of reducing the incidence of cardiovascular disease (CVD) and cardiovascular (CV) mortality in patients with end-stage renal disease (ESRD). Although the risk of CVD in renal transplant recipients (RTR) is approximately three times that of the general population, this is much less than the 10- to 20-fold increase in CV risk in patients receiving maintenance hemodialysis. Premature CVD is also a substantial cause of graft failure, due to “death with a functioning graft.” These observations are well established and illustrated in Fig. 30.1.
There are specific problems in managing CVD in RTRs compared with the general population. First, RTRs are not a homogeneous group; variability in the duration and underlying cause of chronic kidney disease (CKD) and in dialysis history determines each individual's burden of accumulated CV risk. Second, the transplant population carries specific risks associated with transplantation surgery and the need for immunosuppressive therapy with agents that have direct and indirect effects on CV risk factors. Third, unlike in the general population, coronary heart disease (CHD) is not the dominant pathophysiologic process in patients with ESRD, including RTRs. Structural and functional abnormalities in the heart—uremic cardiomyopathy—contribute to an increased risk of heart failure and sudden death, and vascular changes including calcification, together with conventional risk factors, contribute to an increased risk of stroke and peripheral vascular disease. Thus traditional risk factor relationships and therapeutic strategies derived from the general population may not be directly applicable to RTRs. Finally, our ability to provide guidelines for the management of CVD in this population is limited by the lack of epidemiologic and clinical trial data.
Background: CVD in CKD
The recognition that CKD is associated with increased CV risk has resulted in the adoption of estimated glomerular filtration rate (eGFR) as a CV risk equivalent. As GFR declines in patients with CKD, the risk of CVD increases progressively and is highest in ESRD. In early CKD, the pattern of CVD is probably similar to that in the general population, with an increased risk of lipid-dependent coronary artery disease (CAD). In more advanced CKD, there is a disproportionate increase in deaths due to heart failure and sudden, presumed arrhythmic deaths. The latter pattern is similar to that seen in advanced heart failure, where CHD is uncommon, regardless of the primary etiology of heart failure. In both ESRD and advanced heart failure, total serum cholesterol levels are low, markers of inflammation are elevated, and the conventional relationship between lipids and total CV events (CVEs) is lost (or even reversed). Moreover, treatment strategies proven in the general population, specifically statins, have little or no effect in these patient groups, presumably reflecting the low proportion of the overall CV burden that is due to cholesterol-dependent coronary disease.
The determinants of excess CVD in advanced CKD include vascular calcification, elevated phosphate (and fibroblast growth factor-23 [FGF23]), hypertension, inflammation, malnutrition, mineral and bone disease, and intravascular volume overload. Vascular stiffness and hypertension, together with elevated levels of the phosphaturic hormone FGF23, contribute to the development of uremic cardiomyopathy, the most common form of which is extreme left ventricular hypertrophy (LVH) with fibrosis, which in turn promotes the development of systolic dysfunction and sudden arrhythmic death. Overall, the pattern of CVD in ESRD, its pathogenesis, and the implications for treatment differ markedly from the general population; although the risk of CHD is increased, the disproportionate increase is in sudden death and death due to heart failure.
Potential RTRs carry this burden of accumulated CV risk—conventional and nonconventional—from time spent with CKD and on maintenance hemodialysis into the posttransplant period. After transplantation surgery, there is an abrupt increase in overall (and CV) events and mortality, particularly in the early postoperative period (Fig. 30.2). This falls progressively thereafter, and survivors beyond the first few months have a mortality rate approximately half that of patients receiving maintenance dialysis, with a comparable reduction in nonfatal CV events. After transplantation, several specific CV risk factors are more prevalent. Dyslipidemia and hypertension are both very common, affecting nearly all patients. Lipid levels rise in the weeks after transplantation, reflecting improved well-being, diet, and immunosuppressive agents. Immunosuppressive agents also contribute to hypertension and to the development of diabetes (new-onset diabetes after transplantation [NODAT]). These risk factors, together with the level of posttransplant renal function (the relationship between eGFR and CVD being similar to that seen in primary CKD), preexisting CVD, and the increasing age of RTRs, contribute to the overall level of CV risk after transplantation. In spite of this, CVD and mortality rates appear to be stable or even falling in RTRs, reflecting improved management of transplant recipients and improved graft outcomes.
Epidemiology of Posttransplant CVD
Registry and cohort studies, and a small number of clinical trials, have examined the natural history and determinants of CVD in RTRs. In interpreting these data, it is important to consider background improvements in immunosuppressive therapy, the increasing age of transplant recipients, and the use of kidneys from extended criteria donors (where expectations for graft function are lower). Each of these factors is likely to influence the pattern of CVD in RTRs in the future. It is also important to remember that endpoints recorded in registries may be inaccurate, and investigators in clinical trials often pool CV endpoints that are believed to share common pathophysiologic mechanisms, which may not be true in atypical populations, including RTRs. A final, often ignored issue is that of “competing risk,” where an individual risk factor may contribute to more than one adverse outcome. In transplant recipients, for example, smoking may increase the risk of infection, malignancy, and CVD, thus diminishing its apparent effect on any single outcome.
The early studies of Kasiske and colleagues, who followed more than 1000 RTRs in a single US center, demonstrated a high incidence of CVEs and CV mortality, and a high prevalence of preexisting CVD, in RTRs. They confirmed that conventional CV risk factors, including age, sex, smoking status, and the presence of diabetes mellitus (either preexisting or developing after transplantation), were associated with the development of CVEs (which they termed “coronary heart disease”). For each year of life, the risk of a CVE was increased by 3% to 5%; males and patients with diabetes had a twofold increase in the risk of a CVE. However, the strongest risk factors were preexisting CHD, peripheral vascular disease, or cerebrovascular disease, reflecting the importance of the burden of CV disease that individual patients carry at the time of transplantation. Most of the risk factors they identified, such as preexisting disease, age, and sex, are irremediable, and it has proved more difficult to identify any relationship between modifiable risk factors and CVEs.
Kasiske’s initial analysis revealed no association between posttransplant levels of triglyceride, total or low-density lipoprotein (LDL) cholesterol, and CVEs in RTRs. However, a subsequent larger analysis did show an association between hyperlipidemia and risk, albeit only between very high levels of total cholesterol and an increased risk of long-term CVEs. Unlike in the general population, there is no clear, progressive relationship between lipid levels and CVEs. This observation, in keeping with the pattern seen in patients with ESRD receiving maintenance dialysis, supports the notion that CVD in RTRs differs from the traditional atherosclerotic model. Similar observations have been reported in single-center studies from Europe. Long-term follow-up of clinical trials provides additional data, with the benefit that endpoints are externally validated and more accurate than registry data. The ALERT (Assessment of LEscol in Renal Transplantation) and, more recently, the FAVORIT (Folic Acid for Vascular Outcome Reduction in Transplantation) studies have been used to provide data on CVEs collected during follow-up of potential interventions in large populations of RTRs. In the ALERT study, 2100 stable RTRs were randomized to receive placebo or fluvastatin (40–80 mg/day) and followed for up to 8 years. Compared with studies of statin therapy in nontransplant populations at comparably high risk of CVEs, there are clear differences. Nontransplant patients with dyslipidemia and a history of CAD are likely to have further coronary events, and their risk of cardiac death is approximately one-third that of a nonfatal event. In contrast, patients with ESRD receiving maintenance dialysis are much more likely to suffer cardiac death than a nonfatal coronary event. RTRs occupy an intermediate position between these populations, with approximately equal risks of cardiac death and nonfatal coronary events. This shift in proportions likely reflects an increase in the risk of death due to primary arrhythmia or heart failure, a pattern similar to that seen in patients with congestive heart failure. Of note, around 10% of otherwise stable RTRs experienced a cardiac event during the first 5 years of follow-up—an event rate (2% per annum) comparable to the annual mortality rate of stable RTRs and the annual graft failure rate after the first year.
In the FAVORIT study, 4110 stable RTRs were randomized to high-dose folic acid, the primary endpoint being a vascular composite of myocardial infarction, CV death, resuscitated sudden death, revascularization procedures (coronary and noncoronary), and stroke. Given the potentially disparate nature of these pooled endpoints, it is perhaps not surprising that the intervention showed no benefit, that LDL cholesterol had no relationship with the composite outcome, and that the main determinants were age, preexisting CVD, diabetes, systolic blood pressure, and low eGFR. Table 30.1 shows the relationship between risk factors and CVEs in this population.
Risk Factor | RR | Confidence Interval | P |
---|---|---|---|
Age | 1.13 | (1.08, 1.19) | <0.0001 |
Diabetes | 2.30 | (1.90, 2.80) | <0.0001 |
Smoking (current) | 1.38 | (1.05, 1.82) | 0.07 |
Cardiovascular disease | 2.06 | (1.71, 2.48) | <0.0001 |
Low-density lipoprotein | 1.01 | (0.98, 1.04) | 0.41 |
Systolic blood pressure | 1.17 | (1.11, 1.23) | <0.0001 |
Diastolic blood pressure | 0.89 | (0.81, 0.98) | 0.02 |
Body mass index | 0.91 | (0.84, 0.98) | 0.02 |
Lymphoproliferative disease | 0.84 | (0.70, 1.01) | 0.07 |
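To see how the adjusted estimates in Table 30.1 might translate into combined risk for an individual, the short sketch below multiplies selected RRs for a hypothetical patient profile. This assumes, purely for illustration, that the adjusted risks act multiplicatively on a common baseline hazard; neither the profile nor the multiplicative model comes from the FAVORIT analysis itself.

```python
# Illustrative only: combine adjusted relative risks from Table 30.1
# under the assumption that they act multiplicatively on a common
# baseline hazard. The risk-factor profile below is hypothetical.
FAVORIT_RR = {
    "diabetes": 2.30,
    "smoking_current": 1.38,
    "cardiovascular_disease": 2.06,
}

def combined_relative_risk(factors):
    """Multiply the adjusted RRs for each risk factor present."""
    rr = 1.0
    for factor in factors:
        rr *= FAVORIT_RR[factor]
    return rr

# A hypothetical RTR with diabetes and preexisting CVD:
print(combined_relative_risk(["diabetes", "cardiovascular_disease"]))  # ~4.7
```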
A prospective multinational study—the PORT (Patient Outcomes in Renal Transplantation) study—followed 23,575 adult RTRs for a median of 4.5 years. CVD was defined as a composite of proven myocardial infarction, coronary intervention, and cardiac death. The overall cumulative incidence was 3.1%, 5.2%, and 7.6% at 1, 3, and 5 years after transplantation, respectively. In the first year the distribution of events was nonfatal myocardial infarction (49%), coronary intervention (38%), and cardiac death (13%); beyond 1 year the corresponding values were 39%, 38%, and 23%. Conventional modifiable CV risk factors were very poor predictors of cardiac events, and the predictors varied with time after transplantation. Early events were predicted by age, male sex, history of cancer or diabetes, obesity, preexisting CV disease (CHD, peripheral vascular disease, or cerebrovascular disease), deceased donor transplant, and time on dialysis before transplantation. Conventional risk factors such as smoking, hypercholesterolemia, and hypertension were not significant, although they did correlate with a past history of CVD. Later events were dependent on poor graft function (low eGFR, and factors that adversely influence graft function such as acute rejection, delayed graft function, and posttransplant lymphoproliferative disease), the development of NODAT, and race.
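Because the PORT incidences are cumulative, comparing the early and late periods is easier after conversion to average annual event rates. A minimal sketch follows, assuming a constant hazard over each interval (an approximation the PORT data themselves contradict, since events are front-loaded):

```python
import math

# Average annual event rate from a cumulative incidence, assuming a
# constant hazard: rate = -ln(1 - cumulative_incidence) / years.
def annualized_rate(cum_incidence, years):
    return -math.log(1.0 - cum_incidence) / years

# PORT cumulative incidences at 1, 3, and 5 years after transplantation:
for ci, yrs in [(0.031, 1), (0.052, 3), (0.076, 5)]:
    print(f"{yrs} y: {annualized_rate(ci, yrs):.2%} per year")
# ~3.1%/yr over year 1, falling to ~1.6%/yr averaged over 5 years,
# consistent with the front-loading of events described above.
```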
Some of the differences in the analyses just discussed may be explained by the pooling of endpoints. The ALERT study also allows us to examine the relationship between risk factors and individual CVEs (e.g., acute myocardial infarction [aMI] or cardiac death; Fig. 30.3), and to uncover relationships masked by pooling of CVEs with different determinants. In a multivariate analysis, the leading potentially remediable determinants of aMI (in addition to age, gender, and preexisting diabetes) were lipid levels. As in the general population, all major serum lipid subfractions were associated with aMI: total and LDL cholesterol and triglyceride with an increased risk; HDL cholesterol with a reduced risk. In contrast, no lipid subfraction was significantly associated with cardiac death, the main determinants of which were low eGFR, LVH (particularly when associated with subendocardial ischemia, LVH with “strain”), and pulse pressure. These observations strongly support the established literature on “uremic cardiomyopathy” and suggest that severe LVH, driven by renal dysfunction, hypertension, and the presence of LVH at the time of transplantation, may lead to an increased risk of death due to heart failure or arrhythmia—with or without coexistent coronary disease.
The key message from these observations is that RTRs do suffer from CAD (fatal and nonfatal MI), the determinants of which are similar to those in the general population. However, cardiac death is perhaps a greater problem, and its determinants are LVH, vascular stiffness, and hypertension. Studies by other investigators, including Abbott and Rigatto, support these findings and underscore the observation that noncoronary events such as heart failure are common. In addition, they demonstrated that specific transplant risk factors, including graft dysfunction (specifically graft failure), were associated with an approximately threefold increase in CVEs, including heart failure. Anemia proved to be a risk factor for the development of heart failure, although with improved anemia management a relationship with hemoglobin is now difficult to confirm.
Novel and Transplant-Specific Risk Factors
In the general population, the limited predictive ability of conventional CV risk factors has led to the search for novel risk factors and potential therapeutic targets. Inflammation has emerged as a central mechanism, with the recognition that inflammatory cells are involved in atherosclerosis and that circulating markers of inflammation, such as C-reactive protein, can identify patients at increased risk of atherosclerotic vascular disease who may benefit from established treatments. There are similar initiatives in RTRs. Markers of inflammation and circulating inhibitors of endothelial function have been studied in transplantation and are associated with an increased risk of CVD. Patients with simple features of inflammation, such as low albumin, are at higher risk. There are also transplant-specific risk factors, including those that contribute to poor graft function: the occurrence and severity of acute rejection episodes, delayed graft function, chronic rejection, cytomegalovirus infection, and others.
Specific Risk Factors and Management
In this section we will cover individual CV risk factors, their roles, and their management. As noted previously, it is important to realize that transplantation is one phase in the course of progressive renal disease. Patients bring with them to transplantation accumulated risk, much of which is irremediable. For example, vascular stiffness and calcification, which develop in advanced CKD, contribute to hypertension after transplantation. Moreover, nearly all of the immunosuppressive drugs that have revolutionized the management of transplant recipients have effects on CV risk factors—some good, such as higher GFR; others potentially bad, such as hypertension and dyslipidemia. The pattern of effects of immunosuppressive agents is shown in Table 30.2 and discussed in more detail later.
Cardiovascular Risk Factors | Steroids | Azathioprine/MMF | Belatacept | Cyclosporine | Tacrolimus | mTOR Inhibitors |
---|---|---|---|---|---|---|
Hypertension | ↑ | ↔ | ↔ | ↑ | ↑ | ↔ |
Left ventricular hypertrophy | ↑ | ↔ | ↔ | ↑ | ↑ | ↔ |
Total cholesterol | ↑ | ↔ | ↔ | ↑ | ↑ | ↑ |
Low-density lipoprotein | ↑ | ↔ | ↔ | ↑ | ↑ | ↑ |
Triglycerides | ↑ | ↔ | ↔ | ↑ | ↑ | ↑ |
Diabetes mellitus | ↑ | ↔ | ↔ | ↑ | ↑ | ↑ |
Renal function | ↔ | ↔ | ↔ | ↓ | ↓ | ↔ |
Hypertension and Uremic Cardiomyopathy
Hypertension is an almost invariable accompaniment of renal transplantation—a consequence of preexisting hypertension at the time of transplantation and of the effects of immunosuppressive agents. Both increased vascular resistance and increased intravascular volume contribute to the development of hypertension. As CKD progresses and GFR declines, salt and water excretion are impaired and volume-dependent mechanisms assume greater importance; after transplantation, the contribution of volume-dependent mechanisms will similarly depend on the level of graft function. Details on the role of vasoconstrictor mechanisms and their relevance for treatment are given later.
Hypertension in RTRs is known to be associated with poorer patient and graft outcomes. There is no trial evidence for specific blood pressure targets in the RTR population, but data from the European Registry support the need to manage hypertension and inform treatment targets. In a series of RTRs with a functioning graft 1 year after transplantation, Opelz and colleagues demonstrated that blood pressure—recorded at outpatient clinics—is a major determinant of long-term patient and graft survival, albeit not independently of graft function. Current guidelines recommend a blood pressure target <130/80 mmHg, irrespective of the level of proteinuria, but the data suggest that graft outcomes start to deteriorate when systolic blood pressure exceeds 120 mmHg. Similarly, there appears to be an important relationship between blood pressure, across the range from “normal” to hypertensive, and the development of posttransplant CV disease (Fig. 30.4). Epidemiologic studies, which include the placebo arms of interventional trials in transplant recipients, have confirmed that hypertension is associated with CVEs, specifically stroke, cardiac death, and heart failure, rather than nonfatal coronary events. Hypertension was the strongest determinant of cardiac death in the ALERT study, the most significant blood pressure parameters being systolic blood pressure and pulse pressure, both markers associated with vascular stiffness, rather than diastolic blood pressure.
Most clinics use “office-based” blood pressure measurements obtained with a standard sphygmomanometer. Blood pressure should preferably be assessed using repeated measurements with the patient seated after a period of rest, or by ambulatory or home blood pressure monitoring, which is more informative. In patients with essential hypertension, ambulatory and home measurements are recommended for those with resistant hypertension or suspected “white coat” syndrome. In transplant recipients, ambulatory recordings are associated with prognosis, and loss of the diurnal profile, or loss of the “nocturnal dip,” confers additional prognostic information. These methods should be used in patients with poor blood pressure control and may provide additional information in clinical trials.
Hypertension also exerts unfavorable effects indirectly, through the development of end-organ damage in RTRs—specifically proteinuria and LVH. Hypertension is the major contributor to LVH in patients with ESRD, including RTRs, and LVH is strongly associated with poor outcome in RTRs. The pathophysiology of LVH in CKD (uremic cardiomyopathy) is marked by the presence of subendocardial ischemia and myocardial fibrosis. Fibrosis is believed to promote aberrant conduction and is associated with markers of arrhythmogenicity, such as prolonged QT interval and abnormal T-wave alternans, which provide the likely link to fatal arrhythmias and sudden cardiac death. Arrhythmias may be spontaneous or may complicate otherwise minor ischemic episodes. These observations identify hypertension, LVH, and electrocardiographic abnormalities as markers of adverse outcome in RTRs, and as potential targets for intervention. The less common manifestation of uremic dilated cardiomyopathy (with systolic dysfunction) may be a sequel of LVH or may be associated with (often silent) CHD. Uremic cardiomyopathy develops primarily during the time patients spend with advanced CKD and on maintenance dialysis, and is therefore common in new transplant recipients. Although some studies suggest that the manifestations of uremic cardiomyopathy may improve after transplantation, with apparent regression of LVH and improved systolic function, these studies may not reflect the true situation. Echocardiographic analyses are highly dependent on chamber diameters (e.g., in the estimation of left ventricular [LV] mass), which are, in turn, dependent on hydration status. Patients with advanced CKD, including those treated by dialysis, have a tendency to volume overload; this improves after successful transplantation, correcting artifactual overestimation of LV mass and systolic dysfunction. Hence, studies that use volume-independent technology (specifically cardiac magnetic resonance imaging) have not shown similar improvement. The long-term risks associated with the various manifestations of uremic cardiomyopathy in patients receiving maintenance dialysis are carried forward in patients who undergo transplantation. After transplantation, the limited data on uremic cardiomyopathy, restricted to LVH, suggest that effective blood pressure control and the avoidance of calcineurin inhibitors (CNIs) may reduce LVH. A series of small, short-term studies suggests that the use of dihydropyridine calcium antagonists, the use of mammalian target of rapamycin (mTOR) inhibitors (sirolimus or everolimus) in place of a CNI, or CNI withdrawal is associated with improvement in blood pressure and regression—or lack of progression—of LVH in RTRs.
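The volume dependence of echocardiographic LV mass follows from the cube-law geometry of the standard Devereux (ASE-corrected) formula, sketched below; the wall thicknesses and diameters are hypothetical values chosen only to show how a hydration-related change in cavity diameter inflates the apparent LV mass.

```python
# Devereux (ASE cube) formula for echocardiographic LV mass (grams),
# with septal thickness (IVSd), LV internal diameter (LVIDd), and
# posterior wall thickness (PWd) in cm. The cubic terms make the
# estimate highly sensitive to the volume-dependent diameter LVIDd.
def lv_mass_devereux(ivsd_cm, lvidd_cm, pwd_cm):
    return 0.8 * (1.04 * ((ivsd_cm + lvidd_cm + pwd_cm) ** 3
                          - lvidd_cm ** 3)) + 0.6

# Identical walls; LVIDd 5.6 cm when volume overloaded versus 5.0 cm
# after correction of overload (hypothetical values):
print(lv_mass_devereux(1.1, 5.6, 1.1))  # ~249 g
print(lv_mass_devereux(1.1, 5.0, 1.1))  # ~207 g: an apparent "regression"
# of ~17% from a change in hydration alone, with no change in the walls.
```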
There is evidence of increased or inappropriate activation of vasoconstrictor mechanisms in RTRs, including the sympathetic nervous system, the renin–angiotensin system, and endothelin, in both humans and experimental animals. These mechanisms, coupled with evidence of impaired endothelium-dependent (nitric oxide-mediated) vascular relaxation, shift the balance toward vasoconstriction. The mechanisms underlying these phenomena are less well understood. Corticosteroids are associated with hypertension in other clinical conditions and have two principal pressor actions: promotion of renal salt and water retention, and enhancement of sympathetic activity leading to increased vascular tone. CNIs cause hypertension directly, through renal sodium retention and increased vasoconstrictor tone, and indirectly, via renal impairment caused by their nephrotoxic effects.
Several short-term studies have demonstrated that the most commonly used antihypertensive drugs, such as angiotensin receptor blockers, angiotensin-converting enzyme (ACE) inhibitors, and calcium channel blockers, have effects on blood pressure comparable to those seen in other populations. Dihydropyridine calcium channel antagonists, such as nifedipine and amlodipine, may attenuate the nephrotoxic effects of CNIs and have been favored in the early phases after transplantation. Blockers of the renin–angiotensin system have been favored in patients with proteinuria and LVH, although uptake has been slow because of concerns about possible adverse effects in patients with undiagnosed, functional stenosis of the single transplant renal artery. Caution is also required with regard to hyperkalemia when ACE inhibition is combined with a CNI. More radical approaches to the treatment of hypertension, such as embolization or laparoscopic removal of the native kidneys, have been employed and may be effective. Whereas patients who undergo bilateral native nephrectomy before transplantation (including pediatric patients) may have good blood pressure control, the benefits are less clear in patients with established hypertension after transplantation.
Patient and graft survival are associated with prescription of an ACE inhibitor or angiotensin receptor blocker (ACEI/ARB) in retrospective analyses (Fig. 30.5). There are few prospective trials of antihypertensive treatment in RTRs. In recent years, two trials have assessed the effect of ACE inhibitors on “hard” CV endpoints, with inconclusive results. Paoletti et al. reported that treatment with lisinopril (5 mg once daily, then titrated to response) was associated with a reduction in major CVEs and in a composite endpoint of death, major CVE, renal graft loss, or creatinine doubling, compared with placebo. Knoll et al. reported no effect of ramipril 5 mg once daily versus placebo on doubling of creatinine, graft loss, or all-cause mortality. Most RTRs with hypertension require treatment with more than one antihypertensive agent. In the absence of specific evidence of CV risk reduction with any particular class of drug, the choice of antihypertensive agent, as in other populations, should consider the need to treat other comorbidities, for example, beta-blockers for patients with symptomatic angina, and blockers of the renin–angiotensin system for patients with proteinuria.
Phosphate and the FGF23-Klotho Axis
FGF23, a phosphaturic hormone, works in conjunction with vitamin D and parathyroid hormone (PTH) to maintain normal serum phosphate levels. Phosphate handling and metabolism are impaired early in the course of CKD, and FGF23 has been established as a more sensitive biomarker of disordered phosphate metabolism in CKD than serum phosphate or PTH. FGF23 becomes elevated early in the disease course: whereas phosphate and PTH may not become abnormal until GFR falls below 30 mL/min/1.73 m² (CKD stage 4), FGF23 levels are elevated even in CKD stage 2, with around one-third of patients with GFR 60 to 69 mL/min/1.73 m² having elevated FGF23 levels.
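For orientation, the GFR thresholds quoted above correspond to the standard KDIGO GFR categories, as in this short sketch (eGFR in mL/min/1.73 m²):

```python
# KDIGO GFR categories, to place the FGF23 thresholds above in context.
def ckd_g_stage(egfr):
    """Map eGFR (mL/min/1.73 m^2) to a KDIGO GFR category."""
    if egfr >= 90:
        return "G1"
    if egfr >= 60:
        return "G2"
    if egfr >= 45:
        return "G3a"
    if egfr >= 30:
        return "G3b"
    if egfr >= 15:
        return "G4"
    return "G5"

print(ckd_g_stage(65))  # G2: FGF23 is often already elevated here
print(ckd_g_stage(25))  # G4: phosphate and PTH typically become abnormal
```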
FGF23 appears to be responsible for the persistent hypophosphatemia seen in the early posttransplant period, but FGF23 levels usually fall by around 3 months after transplantation and reach normal or near-normal values from 1 year after transplant. Higher levels of FGF23 are again seen as transplant function declines. In a cross-sectional, observational study of 279 stable, prevalent RTRs, fewer than 50% of those with CKD stage 1 or 2 had normal FGF23 and PTH, and this proportion fell to 26.3% in those with more advanced transplant CKD. FGF23 is associated with LVH and induces hypertrophy of isolated cardiac myocytes and of the myocardium in in vivo mouse models; this effect can be ameliorated by administration of FGF23 antagonists, independent of blood pressure. FGF23 is associated with cardiovascular and all-cause mortality, and with allograft loss after transplantation, after adjustment for known cardiovascular risk factors and markers of mineral handling. It is likely that LVH induced by elevated FGF23 explains some of the excess CV morbidity and mortality in RTRs.
Independent of FGF23, phosphate is a risk factor for vascular disease in patients with CKD and renal transplants. In contrast to FGF23, this effect is likely mediated through direct effects on blood vessels and the promotion of vascular calcification (discussed later). In 1501 patients from the Chronic Renal Insufficiency Cohort (CRIC), coronary and thoracic aorta calcification were measured by computed tomography. Serum phosphate, but not FGF23, was associated with the presence and severity of arterial calcification, consistent with a direct effect of phosphate in inducing calcification of vascular smooth muscle cells. Furthermore, in vitro studies support a direct effect of high phosphate on endothelial function (via interference with the nitric oxide pathway), and a high dietary phosphate load impairs vessel relaxation independent of serum phosphate. Thus although FGF23 and phosphate are intrinsically linked in maintaining phosphate homeostasis, they appear to contribute to cardiovascular risk in RTRs independently and via distinct mechanisms.
Vascular Stiffness and Calcification
Vascular calcification (particularly of the medial arterial/arteriolar layer) and stiffness (secondary to calcification or vascular hypertrophy) are common in progressive CKD, in ESRD, and in RTRs. Calcification persists and appears to continue to progress after renal transplantation, although vascular stiffness may be partially reversed after kidney transplantation. Stiff, calcified vessels increase systolic blood pressure, increase afterload, and contribute to the development of LVH. Moreover, they are linked to adverse CV outcomes in RTRs, with coronary artery calcification at the time of transplantation being predictive of cardiac events, even in asymptomatic patients with no prior history of CVD. These measures provide potential short-term surrogate endpoints for trials of CV interventions in this patient group.
There are no treatments proven to improve vascular calcification or stiffness in RTRs. Vitamin K deficiency is a potentially remediable risk factor for vascular stiffness and calcification. Vitamin K is essential for the carboxylation (activation) of various Gla proteins, including matrix Gla protein, which opposes vascular calcification in its active form. Vitamin K deficiency is common in RTRs and is associated with mortality. Vitamin K supplementation appears to improve vascular/valvular calcification and vascular stiffness in some populations. Studies of vitamin K supplementation in CKD and dialysis are underway.
Guidelines and Observed Patterns of Usage
There are numerous guidelines on the management of hypertension after transplantation. Although these are based on the evidence discussed in this chapter, they also draw on the practical expertise of the guideline committees. In the absence of studies that have addressed appropriate targets for blood pressure control in RTRs, they have endorsed targets derived from observational studies, such as that of Opelz and colleagues, and targets taken from other populations, specifically those with CKD. Thus the targets are arbitrary and are not aimed, for example, at regression of LVH or specific reversal of proteinuria. The recently published Kidney Disease: Improving Global Outcomes (KDIGO) guidelines suggest a target of 130/80 mmHg in patients with diabetes or proteinuria, the use of lifestyle modifications including salt restriction, and the use of blockers of the renin–angiotensin system as first-line therapy. Moreover, they endorse the use of specific targets and agents determined by comorbidity. The European and American Transplant Society guidelines are broadly concordant.
The implementation of guidelines and the adoption of new practices depend on the behavior of clinicians as much as on the available evidence. Two large-scale studies have examined the use of CV drugs and risk management in RTRs in North America and in Australasia. The results are encouraging in that they show a progressive increase in usage in recent years. The observation that 50% of patients receive a blocker of the renin–angiotensin system suggests that reluctance to use these agents is less than it was (Fig. 30.6). Data on achieved blood pressure and the use of individual agents are difficult to obtain. In our own center, judged against a historical target of 140/90 mmHg, a proportion of patients remained uncontrolled despite therapy, and the majority of those controlled required multiple agents. Whether blood pressure targets are appropriate, and whether one agent has benefits over another, remain to be assessed in prospective clinical trials.
Dyslipidemia
Dyslipidemia is an almost invariable accompaniment of renal transplantation. The pattern typically comprises elevated total and LDL cholesterol, triglycerides, and HDL cholesterol. There are also increased concentrations of intermediate—highly atherogenic—lipoproteins, including small, dense LDL. The mechanisms behind this dyslipidemia include impaired renal function and the influence of immunosuppressive agents. The mechanisms that lead to dyslipidemia in CKD contribute to a varying degree in RTRs, depending on the achieved level of renal function, and explain the varying patterns of dyslipidemia observed. In CKD the typical abnormalities are high triglycerides, low HDL cholesterol, elevated intermediate-density lipoprotein (IDL) cholesterol, and a neutral effect on LDL and total plasma cholesterol, the main mechanisms being reduced activity of lipoprotein lipase and hepatic lipase, and lipoprotein receptor dysfunction. To the effects of impaired renal function are added those of individual antirejection agents, which have specific, often synergistic, effects on serum lipid levels. Corticosteroids cause an increase in total and LDL cholesterol, triglycerides, and HDL cholesterol; CNIs—cyclosporine and, to a lesser extent, tacrolimus—at commonly used doses increase total and LDL cholesterol; and mTOR inhibitors—sirolimus and everolimus—increase total cholesterol, LDL cholesterol, HDL cholesterol, and triglyceride in a dose-dependent manner. Typically, in the first 6 weeks after transplantation, immunosuppression, normalization of renal function, and increased appetite are associated with an increase in total cholesterol of around 1.0 to 1.5 mmol/L, an increase in LDL cholesterol of around 1 mmol/L, and increases in triglyceride and HDL cholesterol.
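Because the lipid shifts above are quoted in mmol/L, it is worth recalling how LDL cholesterol is commonly estimated when it is not measured directly. A minimal sketch of the well-known Friedewald formula follows; note that the formula is invalid at the high triglyceride levels often seen after transplantation, one reason a direct LDL measurement may be preferred in RTRs.

```python
# Friedewald estimate of LDL cholesterol, all values in mmol/L:
# LDL = total cholesterol - HDL - triglycerides / 2.2.
# Invalid when triglycerides exceed ~4.5 mmol/L.
def ldl_friedewald(total_chol, hdl, triglycerides):
    if triglycerides > 4.5:
        raise ValueError("Friedewald formula invalid for TG > 4.5 mmol/L")
    return total_chol - hdl - triglycerides / 2.2

# Hypothetical posttransplant lipid profile:
print(ldl_friedewald(6.2, 1.3, 2.4))  # ~3.8 mmol/L
```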
Statin therapy is one of the few interventions to have been tested in a large CV outcome study in RTRs. The main ALERT trial studied 2100 stable, cyclosporine-treated RTRs, followed for up to 6 years and randomized initially to fluvastatin 40 to 80 mg daily or placebo. The primary endpoint was a composite of myocardial infarction, cardiac death, stroke, and coronary intervention. A 2-year extension, in which all patients were offered fluvastatin 80 mg/day, prolonged follow-up to 8 years. With hindsight, the core study was underpowered for the chosen composite primary endpoint, although statin therapy was associated with a 35% reduction in MI. With prolonged follow-up, there was a significant reduction in the primary endpoint, and in a variety of individual cardiac endpoints, in patients randomized to statin therapy. The main effect was on lipid-dependent endpoints such as MI (see Fig. 30.3). Fluvastatin reduced LDL by 1 mmol/L for the duration of the study and was well tolerated, with placebo-like side effects. Post hoc analyses of this study revealed that early introduction after transplantation was associated with additional benefit. Overall, the ALERT study can be summarized as showing that fluvastatin has beneficial effects on the secondary dyslipidemia associated with renal transplantation, that this translates into a reduced incidence of MI (with a lesser reduction in other CV events), and that early initiation of therapy is important to maximize benefit.
Fluvastatin is not metabolized by CYP3A4, an enzyme inhibited by CNIs. In patients receiving a CNI, use of statins metabolized by CYP3A4 (specifically simvastatin, lovastatin, and, to a lesser extent, atorvastatin) is associated with increased statin bioavailability and hence with increased efficacy and side effects. An important, underappreciated message is that, with the exception of fluvastatin and pravastatin, statins should be started at very low dose and monitored cautiously in CNI-treated RTRs. A study of fluvastatin initiated at the time of transplantation (SOLAR), which showed that the pleiotropic effects of statins on lymphocyte function do not reduce the risk of acute rejection in RTRs, demonstrated no increase in adverse effects. Thus, of the available statins, there is most evidence for the safety of fluvastatin, particularly in the perioperative period.
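The prescribing caution described above can be summarized as a simple lookup, sketched below. The groupings are drawn from the interactions named in the text and are a simplification for illustration, not a prescribing tool.

```python
# Simplified statin/CNI interaction check based on the CYP3A4
# interactions discussed in the text. Illustrative only.
CYP3A4_METABOLIZED = {"simvastatin", "lovastatin", "atorvastatin"}
LOWER_INTERACTION = {"fluvastatin", "pravastatin"}

def statin_cni_caution(statin, on_cni):
    statin = statin.lower()
    if on_cni and statin in CYP3A4_METABOLIZED:
        return "Caution: CYP3A4-metabolized; start at very low dose and monitor"
    if on_cni and statin in LOWER_INTERACTION:
        return "Lower interaction potential; monitor as usual"
    return "No CNI interaction flagged by this simplified check"

print(statin_cni_caution("simvastatin", on_cni=True))
```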
More recently, the SHARP study examined the effect of simvastatin plus ezetimibe on dyslipidemia and outcomes in 9000 patients with CKD, including 3000 receiving maintenance dialysis. Although the study medication was not continued in patients who received transplants, there was an overall reduction in the incidence of atherosclerotic CVEs, which lends support to the use of statin-based therapy in patients with progressive CKD, including those who will ultimately receive transplants. This study adds to the ALERT data set and is consistent with registry data and retrospective uncontrolled studies.
The use of statins has been adopted by guidelines for minimizing CV risk in RTRs. These guidelines have tended to adopt lipid targets from the general population (an LDL cholesterol target of 2.6 mmol/L for adult patients), as there are inadequate data on targets specific to the transplant population. Of interest, however, the most recent KDIGO guidelines (www.kdigo.org) on lipid lowering in CKD propose that it is more important to establish patients on statin therapy than to achieve a target, and they have not recommended a target-driven approach in this patient population (including RTRs). Although there is a strong rationale for statin use, one report suggests much lower uptake of therapy in RTRs compared with other high-risk populations. However, there is a pattern of increasing use, suggesting that transplant clinicians may simply be cautiously following the trend in other patient groups (see Fig. 30.6).
One reason for the slow adoption of statin therapy and CV risk management in RTRs is the inherent relationship between immunosuppression and dyslipidemia; the expectation is that posttransplant reduction of immunosuppression will correct dyslipidemia. Many studies have investigated the short-term effect of modification of immunosuppressive therapy alone on hyperlipidemia in transplant recipients, including steroid withdrawal or avoidance, or CNI withdrawal. The only study to directly compare modification of immunosuppression with the initiation of lipid-lowering therapy is that of Wissing and colleagues. In this study, switching patients from cyclosporine to tacrolimus-based therapy was compared with the addition of atorvastatin. Although tacrolimus-based therapy was associated with a reduction in total and LDL cholesterol and triglycerides, patients on cyclosporine and atorvastatin had lipid levels comparable to those on tacrolimus and atorvastatin combined. Thus modification of the CNI provided no additional benefit over statin therapy. Additionally, the facts that some components of the dyslipidemia (e.g., hypertriglyceridemia) are insensitive to statin therapy and that both atherogenic and potentially protective lipid subfractions (HDL cholesterol) are increased by immunosuppression have heightened reluctance to use statins. Most clinicians and patients are reluctant to change immunosuppression primarily on the basis of dyslipidemia, without data supporting long-term outcomes with this strategy.
Finally, there is little trial evidence, and considerable negative information, on the use of alternative lipid-lowering agents, specifically fibrates and nicotinic acid derivatives, in transplantation. At present, their use is not encouraged by guidelines, and any use, particularly as add-on therapy, should be carefully monitored.
Renal Function
In the general population, renal dysfunction (CKD) is an established risk factor for CV disease. Although statistically an independent risk factor, CKD is associated with dyslipidemia and hypertension. The “independent” effect of renal dysfunction may reflect the presence of as-yet poorly defined “uremic toxins,” or of factors known to be associated with renal impairment, such as hyperphosphatemia or elevated FGF23. These same factors are likely to play a role in the pathophysiology of CVD in RTRs with suboptimal renal function. Until recently, renal function was not reported routinely as an outcome of trials of immunosuppression in renal transplantation, where the primary outcomes have been limited to the incidence of acute rejection and graft and patient survival. Recent major trials have reported mean (or other summary measures of) serum creatinine levels or eGFR. This neglects the fact that mean levels are relevant to populations rather than individuals, and it omits proteinuria and other renal factors associated with poor graft outcomes. Improved reporting of renal outcomes in trials of transplantation is important, specifically to permit assessment of long-term graft and patient outcomes (see later).
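For context, the eGFR summary measures reported by trials are derived from creatinine-based equations; the sketch below implements the CKD-EPI 2009 creatinine equation as one widely used example (the studies cited in this chapter may have used other formulas, such as MDRD).

```python
# CKD-EPI 2009 creatinine equation: eGFR in mL/min/1.73 m^2,
# serum creatinine (scr) in mg/dL.
def egfr_ckd_epi_2009(scr, age, female, black=False):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr / kappa, 1.0) ** alpha
            * max(scr / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(round(egfr_ckd_epi_2009(1.4, age=50, female=False)))  # ~58
```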
Nonetheless, graft function has emerged as a strong determinant of graft and patient survival, and of CV risk, in transplant recipients. Post hoc analyses of the two largest CV outcome trials in RTRs, FAVORIT and ALERT, have shown renal function to predict the risk of graft loss and of patient outcomes. Fig. 30.7 shows the relationship between GFR and outcomes in the FAVORIT study. These data show the importance of achieving good graft function in RTRs. The available data support the use of low-dose tacrolimus in combination with mycophenolic acid/mycophenolate and corticosteroids (with anti-interleukin-2 receptor antibodies) as the most effective primary immunosuppressive strategy to produce optimal graft function. The SYMPHONY study compared cyclosporine- and tacrolimus-based immunosuppressive regimens with a sirolimus-based regimen. Low-dose tacrolimus (with a target blood level of 4–7 ng/mL) in combination with mycophenolate mofetil and corticosteroids was the best tolerated and most effective regimen, lending strong support to the established pattern of use in the clinical community. Moreover, the achieved level of renal function was best in this group, with a mean eGFR of 65.4 mL/min compared with 56.7 to 59.4 mL/min in the other groups. These outcomes persisted 3 years after transplantation, albeit with smaller differences. CNI minimization, for example by switching to an mTOR inhibitor (sirolimus or everolimus), has been examined in a variety of trials incorporating a switch from CNI-based to mTOR inhibitor-based therapy at time points ranging from 7 weeks to years after transplantation. Although this strategy has been associated with an increased risk of acute rejection, and its side effects limit its general applicability, renal function is generally improved after early protocol-driven switching.