October 2019: Abstract Round Up

ACUTE KIDNEY INJURY

Effects of Acute Kidney Injury on Urinary Protein Excretion

Journal of the American Society of Nephrology. 2019;30(7):1271-1281

There are few available data on the potential effects of acute kidney injury (AKI) on proteinuria, despite the fact that proteinuria is a strong risk factor for future loss of renal function. Researchers, led by Chi-yuan Hsu, MD, MS, conducted an analysis of data from the ASSESS-AKI (Assessment, Serial Evaluation, and Subsequent Sequelae of AKI) study, and from a subset of participants in the CRIC (Chronic Renal Insufficiency Cohort) study recruited from Kaiser Permanente Northern California. Both prospective cohort studies included annual assessment of urine protein-to-creatinine ratio, estimated glomerular filtration rate (eGFR), blood pressure, and medication use.

For hospitalized patients, an episode of AKI was defined using inpatient serum creatinine measurements taken as part of clinical care (peak/nadir inpatient serum creatinine ratio ≥1.5). The analysis included mixed effects regression to examine change in log-transformed urine protein-to-creatinine ratio after AKI, controlling for time-updated covariates.
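
As an illustration of this kind of analysis, the sketch below fits a random-intercept mixed effects regression to log-transformed urine protein-to-creatinine ratio using Python's statsmodels library. The data file, column names, and covariates are assumptions for illustration, not the study's actual code or variables.

```python
# Hypothetical sketch of a mixed effects model for log-transformed
# urine protein-to-creatinine ratio (UPCR), loosely following the
# analysis described above; file and column names are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per participant per annual visit, with columns
#   id, upcr, post_aki (1 after a hospitalized AKI episode, else 0),
#   plus time-updated covariates such as egfr and sbp.
df = pd.read_csv("visits.csv")
df["log_upcr"] = np.log(df["upcr"])

# Random intercept per participant; fixed effects for AKI and covariates.
model = smf.mixedlm("log_upcr ~ post_aki + egfr + sbp", df, groups=df["id"])
result = model.fit()

# Because the outcome is log-transformed, exp(coefficient) - 1 is the
# proportional change in UPCR associated with AKI (a coefficient of
# about 0.086 would correspond to the reported ~9% increase).
print(np.exp(result.params["post_aki"]) - 1)
```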

Among the 2048 eligible participants, baseline median eGFR was 62.9 mL/min/1.73 m2 and median urine protein-to-creatinine ratio was 0.12 g/g. During 9271 person-years of follow-up, 324 participants experienced at least one episode of hospitalized AKI. Of first AKI episodes, 50.3% were Kidney Disease: Improving Global Outcomes (KDIGO) stage 1, 23.8% were stage 2, and 25.9% were stage 3. In multivariable analysis, an episode of hospitalized AKI was independently associated with a 9% increase in the urine protein-to-creatinine ratio.

In conclusion, the researchers said, “Our analysis of data from two prospective cohort studies found that hospitalization for an AKI episode was independently associated with subsequent worsening of proteinuria.”

CHRONIC KIDNEY DISEASE

Magnesium Oxide Slows Progression of Coronary Artery Calcification

Journal of the American Society of Nephrology. 2019;30(7):1073-1085

Development of strategies for the management of coronary artery calcification (CAC) in patients with chronic kidney disease (CKD) remains clinically challenging. Results of experimental studies have suggested that magnesium inhibits vascular calcification and that the uremic toxin indoxyl sulfate aggravates it.

Yusuke Sakaguchi, MD, PhD, and colleagues recently conducted a 2-year, open-label, randomized controlled trial designed to examine the efficacy of magnesium oxide and/or the oral carbon adsorbent AST-120 for slowing progression of CAC in patients with CKD. Eligible patients had stage 3-4 CKD with risk factors for CAC (diabetes mellitus, history of cardiovascular disease, high low-density lipoprotein cholesterol, or smoking). Patients were randomly assigned in a two-by-two factorial design to either a magnesium oxide group or a control group, and to an AST-120 group or a control group. The primary outcome of interest was percentage change in CAC score.
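
To illustrate the two-by-two factorial design, the toy sketch below randomizes each participant independently on both axes, yielding four possible arm combinations. It is a conceptual illustration only, not the trial's actual allocation procedure.

```python
# Toy illustration of a two-by-two factorial randomization: each
# participant is assigned independently on two axes, producing four
# possible arm combinations (not the trial's actual allocation code).
import random

def assign_arms(participant_ids, seed=2019):
    rng = random.Random(seed)
    return {
        pid: (rng.choice(["magnesium oxide", "control"]),
              rng.choice(["AST-120", "control"]))
        for pid in participant_ids
    }

for pid, (mgo_arm, ast_arm) in assign_arms(range(8)).items():
    print(pid, mgo_arm, ast_arm)
```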

The study was terminated prematurely following results of an interim analysis of the first 125 enrolled participants; at the time of termination, 96 of the 125 enrolled patients had completed the study. The interim analysis demonstrated that the median change in CAC score was significantly smaller for magnesium oxide versus control (11.3% vs 39.5%). In addition, the proportion of patients with an annualized percentage change in CAC score of ≥15% was significantly smaller in the magnesium oxide group than in the control group (23.9% vs 62.0%). However, magnesium oxide did not suppress the progression of thoracic aorta calcification. The dropout rate was higher in the magnesium oxide group than in the control group (27% vs 17%); participants dropped out primarily due to diarrhea.

There was no significant difference in percentage change in CAC score between the AST-120 group and the control group.

“Magnesium oxide, but not AST-120, appears to be effective in slowing CAC progression. Larger-scale trials are warranted to confirm these findings,” the researchers said.

DAPT Duration in Patients with CKD and Drug-Eluting Stents

Clinical Journal of the American Society of Nephrology. 2019;14(6):810-822

It is not known whether prolonged dual antiplatelet therapy (DAPT) is more protective than shorter-duration DAPT in patients with chronic kidney disease (CKD) and drug-eluting stents. Thomas A. Mavrakanas, MD, MSc, FRCPC, FASN, and colleagues conducted a literature search and meta-analysis to examine whether shorter-duration DAPT in patients with drug-eluting stents and CKD is associated with lower mortality or fewer major adverse cardiovascular events compared with longer-duration DAPT.

Randomized trials comparing varying DAPT duration strategies and including patients with CKD were identified through a Medline literature search. The primary outcome of included trials was a composite of all-cause mortality, myocardial infarction, stroke, or stent thrombosis (definite or probable). The secondary outcome was major bleeding. A random-effects model was used to estimate the risk ratio (RR).
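
One standard way to implement such a random-effects model is DerSimonian-Laird pooling of per-trial log risk ratios, sketched below with NumPy. The abstract does not specify the authors' software, and the per-trial estimates in the example are made up for illustration, not the five actual trials.

```python
# Illustrative DerSimonian-Laird random-effects pooling of risk ratios,
# one common implementation of the random-effects model described above.
import numpy as np

def pool_random_effects(log_rr, se):
    """Pool per-trial log risk ratios with DerSimonian-Laird weights."""
    log_rr, se = np.asarray(log_rr), np.asarray(se)
    w = 1.0 / se**2                      # inverse-variance (fixed-effect) weights
    mu_fe = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - mu_fe)**2)  # Cochran's Q heterogeneity statistic
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)        # between-trial variance estimate
    w_re = 1.0 / (se**2 + tau2)          # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    rr = np.exp(mu)
    ci = np.exp([mu - 1.96 * se_mu, mu + 1.96 * se_mu])
    return rr, ci

# Example with made-up per-trial estimates (not the trials in this meta-analysis):
rr, ci = pool_random_effects(log_rr=np.log([0.9, 1.1, 0.8]), se=[0.3, 0.4, 0.5])
print(rr, ci)
```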

The meta-analysis included five randomized trials representing 1902 patients with CKD. Short DAPT, defined as ≤6 months, was associated with a similar incidence of the primary outcome compared with 12-month DAPT in patients with CKD (48 vs 50 events; RR, 0.93; 95% confidence interval [CI], 0.64-1.36; P=.72). Similarly, 12-month DAPT was associated with a similar incidence of the primary outcome compared with extended DAPT, defined as ≥30 months, in the CKD subgroup (35 vs 35 events; RR, 1.04; 95% CI, 0.67-1.62; P=.87).

Numerically lower rates of major bleeding events were detected with shorter versus 12-month DAPT (9 vs 13 events; RR, 0.69; 95% CI, 0.30-1.60; P=.39) and 12-month versus extended DAPT (9 vs 12 events; RR, 0.83; 95% CI, 0.35-1.93; P=.66) in the patients with CKD.

In conclusion, the researchers said, “Short DAPT does not appear to be inferior to longer DAPT in patients with CKD and drug-eluting stents. Because of imprecision in estimates (few events and wide confidence intervals), no definite conclusions can be drawn with respect to stent thrombosis.”

DIABETES

Hyperfiltration and Long-term Outcomes in Diabetic Kidney Disease

Clinical Journal of the American Society of Nephrology. 2019;14(6):854-861

Researchers have hypothesized that glomerular hyperfiltration is a contributing factor to the development of diabetic kidney disease (DKD). Mark E. Molitch, MD, and colleagues conducted an analysis of glomerular filtration rate (GFR) follow-up data on patients with type 1 diabetes who underwent 125I-iothalamate clearance on entry into the Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications Study.

The study cohort included patients with type 1 diabetes who underwent an 125I-iothalamate clearance (iGFR) measurement at DCCT baseline. The association between baseline hyperfiltration status and subsequent risk of reaching an estimated GFR (eGFR) <60 mL/min/1.73 m2 was assessed using Cox proportional hazards models.
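
A minimal sketch of this type of survival model, using the lifelines library with assumed file and column names (not the investigators' actual code or data), is shown below.

```python
# Hypothetical Cox proportional hazards sketch for the association
# between baseline hyperfiltration and reaching eGFR <60 mL/min/1.73 m2.
import pandas as pd
from lifelines import CoxPHFitter

# df: one row per participant, with follow-up time in years, an event
# indicator for reaching eGFR <60 mL/min/1.73 m2, and a 0/1 flag for
# baseline hyperfiltration (iGFR >=140 mL/min/1.73 m2); all assumed.
df = pd.read_csv("dcct_edic.csv")  # columns: years, egfr_lt60, hyperfiltration

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="egfr_lt60")
cph.print_summary()  # exp(coef) for hyperfiltration is the hazard ratio
```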

Hyperfiltration was defined as iGFR levels ≥140 mL/min/1.73 m2; secondary thresholds were 130 and 150 mL/min/1.73 m2. Of the 446 eligible participants, 24% (n=106) had hyperfiltration at baseline.

Median follow-up was 28 years. During that time, 53 participants developed an eGFR <60 mL/min/1.73 m2. Among the participants with hyperfiltration at baseline, the cumulative incidence of eGFR <60 mL/min/1.73 m2 at 28 years of follow-up was 11.0%, compared with 12.8% among participants without hyperfiltration (baseline iGFR <140 mL/min/1.73 m2).

In an unadjusted Cox proportional model, there was no significant association between hyperfiltration and subsequent risk of developing eGFR <60 mL/min/1.73 m2 (hazard ratio [HR], 0.83; 95% confidence interval [CI], 0.43-1.62); the association did not reach statistical significance in the adjusted model (HR, 0.77; 95% CI, 0.38-1.54). Findings were similar upon application of alternate thresholds to define hyperfiltration (130 or 150 mL/min/1.73 m2).

In summary, the researchers said, “Early hyperfiltration in patients with type 1 diabetes was not associated with a higher long-term risk of decreased GFR. Although glomerular hypertension may be a mechanism of kidney injury in DKD, higher total GFR does not appear to be a risk factor for advanced CKD.”

DIALYSIS

High-Risk CKD-MBD Phenotypes Identified

Nephrology Dialysis Transplantation. 2019;34(4):682-691

Due primarily to difficulties in defining high-risk phenotypes based on serum biomarkers, clinical management of chronic kidney disease-mineral bone disorder (CKD-MBD) remains challenging. Luca Neri, MD, PhD, and colleagues recently conducted a 5-year follow-up study in a large, multinational cohort of chronic dialysis patients to examine the prevalence and outcomes of 27 mutually exclusive CKD-MBD phenotypes.

In this historical cohort study, all hemodialysis patients registered in EuCLiD® (European Clinical Database) on July 1, 2011, across 28 countries in Europe, the Middle East, and Africa (EMEA) and South America were enrolled. The researchers created the 27 phenotypes based on combinations of 6-month averages of serum parathyroid hormone (PTH), phosphorus, and calcium, each classified as low (L), target (T), or high (H). Outcome risk score-adjusted proportional hazard regression was used to test the association between CKD-MBD phenotypes and the 5-year risks of mortality and hospitalization.
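
The 27 phenotypes follow from assigning each of the three markers one of three levels (3 × 3 × 3 = 27). The snippet below illustrates that labeling scheme; the ordering convention (PTH, then calcium, then phosphorus) matches the abstract, but the code itself is illustrative, not the authors'.

```python
# Enumerate all 27 CKD-MBD phenotypes as three-letter labels, where each
# position is one of L (low), T (target), or H (high).
from itertools import product

phenotypes = ["".join(levels) for levels in product("LTH", repeat=3)]
print(len(phenotypes))  # 27
print(phenotypes[:5])   # 'LLL', 'LLT', 'LLH', 'LTL', 'LTT'
# First letter = PTH, second = calcium, third = phosphorus,
# so 'LTL' = low PTH, target calcium, low phosphorus.
```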

A total of 35,721 patients were eligible for the analysis. CKD-MBD control was generally poorer in Eastern European and South American countries when compared with Western European countries (prevalence ratio, 0.79; P<.001). Overall, there were 15,795 deaths (126.7 deaths per 1000 person-years; 95% confidence interval [CI], 124.7-128.7); 18,014 patients had at least one hospitalization (203.9 hospitalization events per 1000 person-years; 95% CI, 201.0-206.9); and the incidence of the composite end point was 280.0 events per 1000 person-years (95% CI, 276.6-283.5).
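
For reference, rates such as "126.7 deaths per 1000 person-years" are simply events divided by total person-years of follow-up, scaled by 1000. The sketch below reproduces the reported mortality rate and confidence interval using a normal-approximation Poisson interval (an assumed method), with person-years back-calculated from the published figures.

```python
# Incidence rate per 1000 person-years with a normal-approximation
# Poisson confidence interval (assumed method, for illustration).
import math

def rate_per_1000_py(events, person_years):
    rate = events / person_years * 1000
    se = math.sqrt(events) / person_years * 1000  # Poisson SE of the rate
    return rate, (rate - 1.96 * se, rate + 1.96 * se)

# 15,795 deaths at 126.7 per 1000 person-years implies roughly
# 124,700 person-years of follow-up.
print(rate_per_1000_py(15795, 124665))  # ~ (126.7, (124.7, 128.7))
```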

In the fully adjusted model, the relative mortality risk ranged from a hazard ratio (HR) of 1.07 (PTH/calcium/phosphorus: TLT) to an HR of 1.59 (PTH/calcium/phosphorus: LTL); the relative composite end point risk ranged from an HR of 1.07 (PTH/calcium/phosphorus: TTH) to an HR of 1.36 (PTH/calcium/phosphorus: LTL).

In summary, the researchers said, “We identified several CKD-MBD phenotypes associated with reduced hospitalization-free survival and increased mortality. Ranking of relative risk estimates or excess events concurs in informing healthcare priority setting.”

Dialysate Cooling Ameliorates Decline in Residual Renal Function

Journal of the American Society of Nephrology. 2019;30(6):1086-1095

Survival in patients with end-stage renal disease is associated with residual renal function (RRF); however, RRF declines following initiation of dialysis. Prior studies demonstrated that dialysate cooling reduced hemodialysis-induced circulatory stress and protected the brain and heart from ischemic injury. It is not known whether renal perfusion is affected by hemodialysis-induced circulatory stress, or whether dialysate cooling can ameliorate that effect and thereby reduce RRF loss.

Raanan Marants, MSc, and colleagues utilized renal computed tomography perfusion imaging to scan 29 patients undergoing hemodialysis under standard conditions (36.5°C dialysate temperature); an additional 15 patients were scanned under both standard and cooled (35.0°C) conditions. Imaging was performed immediately before, 3 hours into, and 15 minutes after hemodialysis sessions.

Renal perfusion decreased 18.4% during standard hemodialysis (P<.005), and the decrease correlated with myocardial injury (r=–0.33; P<.05). In sessions with dialysate cooling, patients experienced a 10.6% decline in perfusion; the difference from the decline during standard hemodialysis was not statistically significant. In 10 of the 15 patients, cooling improved or had no effect on myocardial stunning.

“This study shows an acute decrease in renal perfusion during hemodialysis, a first step toward pathophysiologic characterization of hemodialysis-mediated RRF decline. Dialysate cooling ameliorated this decline but this effect did not reach statistical significance. Further study is needed to explore the potential of dialysate cooling as a therapeutic approach to slow RRF decline,” the researchers said.

TRANSPLANTATION

Pre-transplant anti-HLA Antibodies and Graft Survival

Nephrology Dialysis Transplantation. 2019;34(6):1056-1063

There is an association between pre-transplant donor-specific anti-human leucocyte antigen (HLA) antibodies and impaired kidney graft survival. However, the clinical significance of non-donor-specific anti-HLA antibodies (nDSAs) is uncertain. Laura A. Michielsen, PhD, and colleagues conducted a paired kidney graft study to compare the clinical relevance of DSAs and nDSAs.

The post hoc paired kidney graft analysis was conducted as part of a Dutch multicenter study evaluating all transplantations performed between 1995 and 2005 with available pre-transplant serum samples. A Luminex single-antigen bead assay was used to detect anti-HLA antibodies.

There were 3237 eligible deceased donor transplantations; among those, the researchers identified 115 recipient pairs in which both recipients received a kidney from the same donor, one recipient was DSA positive, and the other had no anti-HLA antibodies. Ten-year death-censored graft survival was significantly lower in patients with pre-transplant DSAs (55% vs 82%; P=.0001).

Among 192 pairs in which one recipient was nDSA positive (against Class I and/or Class II) and the other had no anti-HLA antibodies, there was no significant difference in graft survival between the two groups (74% vs 77%; P=.79). In patients with nDSAs against both Class I and Class II, there was a trend toward lower graft survival (58%; P=.06).

In a small group of 42 recipient pairs, 10-year graft survival was 49% in recipients with DSAs versus 68% in recipients with nDSAs (P=.11).

In conclusion, the authors said, “This paired kidney analysis confirms that the presence of pre-transplant DSAs in deceased donor transplantations is a risk marker for graft loss, whereas nDSAs in general are not associated with a lower graft survival. Subgroup analysis indicated that only in broadly sensitized patients with nDSAs against Class I and II, nDSAs may be a risk marker for graft loss in the long-term.”