
Last updated: 28-05-2017

The latest abstracts from the journal Transplantation - Current Issue:

    The Editors
    Highlights in Clinical Science
    No abstract available

    Doshi, Mona D.; Reese, Peter P.; Hall, Isaac E.; Schröppel, Bernd; Ficek, Joseph; Formica, Richard N.; Weng, Francis L.; Hasz, Rick D.; Thiessen-Philbrook, Heather; Parikh, Chirag R.
    Utility of Applying Quality Assessment Tools for Kidneys With KDPI ≥80
    Background: Kidneys with “high” Kidney Donor Profile Index (KDPI) are often biopsied and pumped, yet frequently discarded. Methods: In this multicenter study, we describe the characteristics and outcomes of kidneys with KDPI of 80 or greater that were procured from 338 deceased donors. We excluded donors with anatomical kidney abnormalities. Results: Donors were categorized by the number of kidneys discarded: (1) none (n = 154, 46%), (2) 1 discarded and 1 transplanted (n = 48, 14%), (3) both discarded (n = 136, 40%). Donors in group 3 were older, more often white, and had higher terminal creatinine and KDPI than group 1 (all P < 0.05). Biopsy was performed in 92% of all kidneys, and 47% were pumped. Discard was associated with biopsy findings and first-hour renal resistance. Kidney injury biomarker levels (neutrophil gelatinase-associated lipocalin, IL-18, and kidney injury molecule-1, measured from donor urine at procurement and from perfusate soon after pump perfusion) were not different between groups. There was no significant difference in 1-year estimated glomerular filtration rate or graft failure between groups 1 and 2 (41.5 ± 18 vs 41.4 ± 22 mL/min per 1.73 m2; P = 0.97 and 9% vs 10%; P = 0.76). Conclusions: Kidneys with KDPI of 80 or greater comprise the most resource-consuming fraction of our donor kidney pool and have the highest rates of discard. Our data suggest that some discarded kidneys with KDPI of 80 or greater are viable; however, current tools and urine and perfusate biomarkers to identify these viable kidneys are not satisfactory. We need better methods to assess viability of kidneys with high KDPI.

    Soares, Kevin C.; Arend, Lois J.; Lonze, Bonnie E.; Desai, Niraj M.; Alachkar, Nada; Naqvi, Fizza; Montgomery, Robert A.
    Successful Renal Transplantation of Deceased Donor Kidneys With 100% Glomerular Fibrin Thrombi and Acute Renal Failure Due to Disseminated Intravascular Coagulation
    Background: Disseminated intravascular coagulation (DIC)-positive kidneys have historically been turned down for fear of poor outcomes. Higher-severity injuries, which are prone to DIC, are typically seen in younger, otherwise healthy potential donors. The continued kidney allograft shortage has generated interest in the use of these DIC-positive grafts. There have been some reports of acceptable outcomes of renal transplantation using kidneys from donors with DIC. Multiple clinical series demonstrate good outcomes from DIC-positive kidneys when the extent of glomeruli containing fibrin thrombi is less than 50% and donor renal function is preserved. These grafts are frequently associated with a period of delayed graft function. Methods: We report 2 transplants with kidneys from brain-dead donors with known DIC. Results: Both donors had renal failure and pretransplant renal biopsies showing fibrin thrombi in 100% of glomeruli. The recipients experienced delayed graft function requiring hemodialysis, which was discontinued on postoperative days 18 and 39 for cases 1 and 2, respectively. Both patients are now over 14 months posttransplant with stable allograft function. Conclusions: Until clearer organ selection criteria are established, caution should be exercised when considering the use of kidneys with a similar phenotype, and allocation decisions should be made by a multidisciplinary transplant team on a case-by-case basis.

    Lim, Wai H.; McDonald, Stephen P.; Russ, Graeme R.; Chapman, Jeremy R.; Ma, Maggie KM.; Pleass, Henry; Jaques, Bryon; Wong, Germaine
    Association Between Delayed Graft Function and Graft Loss in Donation After Cardiac Death Kidney Transplants—A Paired Kidney Registry Analysis
    Background: Delayed graft function (DGF) is an established complication after donation after cardiac death (DCD) kidney transplantation, but the impact of DGF on graft outcomes is uncertain. To minimize donor variability and bias, a paired donor kidney analysis was undertaken in which 1 kidney developed DGF and the other did not. Methods: Using paired DCD kidney data from the Australia and New Zealand Dialysis and Transplant Registry, we examined the association between DGF and graft and patient outcomes between 1994 and 2012 using adjusted Cox regression models. Results: Of the 74 pairs of DCD kidneys followed for a median of 1.9 years (408 person-years), a greater proportion of recipients with DGF experienced overall graft loss and death-censored graft loss at 3 years compared with those without DGF (14% vs 4%, P = 0.04 and 11% vs 0%, P < 0.01, respectively). Compared with recipients without DGF, the adjusted hazard ratio for overall graft loss at 3 years for recipients with DGF was 4.31 (95% confidence interval [95% CI], 1.13-16.44). The adjusted hazard ratios for acute rejection and all-cause mortality at 3 years in recipients who experienced DGF were 0.98 (95% CI, 0.96-1.01) and 1.70 (95% CI, 0.36-7.93), respectively, compared with recipients without DGF. Conclusions: Recipients of DCD kidneys with DGF experienced a higher incidence of overall and death-censored graft loss compared with those without DGF. Strategies aimed at reducing the risk of DGF could potentially improve graft survival in DCD kidney transplants.
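The graft-loss comparisons in registry studies like this one rest on censored survival estimates. As a minimal sketch (not the registry's actual analysis), a Kaplan-Meier estimator over a toy paired cohort could look like the following; every number below is invented for illustration:

```python
# Minimal Kaplan-Meier estimator: the kind of censored survival
# comparison (graft loss with vs without DGF) used in such analyses.
# All follow-up times and event flags below are hypothetical.

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = graft loss, 0 = censored.
    Returns the step points (time, survival probability) of the curve."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, event in sorted(zip(times, events)):
        if event == 1:                       # a graft loss at time t
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1                         # losses and censored both leave the risk set
    return curve

# Hypothetical arms of a paired cohort
dgf    = kaplan_meier([0.5, 1.0, 2.0, 3.0], [1, 1, 0, 0])
no_dgf = kaplan_meier([1.0, 2.0, 3.0, 3.0], [0, 1, 0, 0])
print(dgf)
print(no_dgf)
```

A real analysis would additionally adjust for covariates with a Cox model, as the study describes.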

    Peters-Sengers, Hessel; Homan van der Heide, Jaap J.; Heemskerk, Martin B. A.; ten Berge, Ineke J. M.; Ultee, Fred C. W.; Idu, Mirza M.; Betjes, Michiel G. H.; van Zuilen, Arjan D.; Christiaans, Maarten H. L.; Hilbrands, Luuk H.; de Vries, Aiko P. J.; Nurmohamed, Azam S.; Berger, Stefan P.; Bemelman, Frederike J.
    Similar 5-Year Estimated Glomerular Filtration Rate Between Kidney Transplants From Uncontrolled and Controlled Donors After Circulatory Death—A Dutch Cohort Study
    Background: Organ shortage persists despite a high rate of donation after circulatory death (DCD) in the Netherlands. The median waiting time for a deceased donor kidney in 2013 was 3.5 years. Most DCD kidneys are from controlled DCD donors (cDCD; Maastricht category III). Experience with uncontrolled DCD donors (uDCD), that is, donors with an unexpected and irreversible cardiac arrest (Maastricht categories I and II), is increasing, and its effect on transplant outcomes needs evaluation. Methods: We used the Dutch Organ Transplantation Registry to include recipients (≥18 years old) from all Dutch centers who received a first DCD kidney transplant from 2002 to 2012. We compared transplant outcomes in uDCD (n = 97) and cDCD (n = 1441). Results: Primary nonfunction in uDCD was higher than in cDCD (19.6% vs 9.6%, P < 0.001). Delayed graft function was also higher in uDCD than in cDCD, but not significantly (73.7% vs 63.3%, P = 0.074). If censored for primary nonfunction, estimated glomerular filtration rates after 1 year and 5 years were comparable between uDCD and cDCD (1 year: uDCD, 44.3 (23.4) mL/min/m2 and cDCD, 45.8 (24.1) mL/min/m2; P = 0.621; 5 years: uDCD, 49.1 (25.6) mL/min/m2 and cDCD, 47.7 (21.7) mL/min/m2; P = 0.686). The differences in primary nonfunction between kidneys from uDCD and cDCD were explained by differences in the first warm ischemic period, cold ischemic time, and donor age. Conclusions: We conclude that uDCD kidneys have potential for excellent function and can constitute a valuable extension of the donor pool. However, further efforts are necessary to address the high rate of primary nonfunction.

    Wong, Germaine; Teixeira-Pinto, Armando; Chapman, Jeremy R.; Craig, Jonathan C.; Pleass, Henry; McDonald, Stephen; Lim, Wai H.
    The Impact of Total Ischemic Time, Donor Age and the Pathway of Donor Death on Graft Outcomes After Deceased Donor Kidney Transplantation
    Background: Prolonged ischemia is a known risk factor for delayed graft function (DGF), and its interaction with donor characteristics, the pathways of donor death, and graft outcomes may have important implications for allocation policies. Methods: Using data from the Australia and New Zealand Dialysis and Transplant Registry (1994-2013), we examined the relationship between total ischemic time and graft outcomes among recipients of first deceased donor kidney transplants. Total ischemic time (in hours) was defined as the time from donor renal artery interruption or aortic clamping until release of the clamp on the renal artery in the recipient. Results: A total of 7542 recipients were followed over a median of 5.3 years (interquartile range, 8.2 years). Of these, 1823 (24.6%) experienced DGF and 2553 (33.9%) experienced allograft loss. Recipients with total ischemic time of 14 hours or longer experienced increased odds of DGF compared with those with total ischemic time less than 14 hours. This effect was most marked among those with older donors (P value for interaction = 0.01). There was a significant interaction between total ischemic time, donor age, and graft loss (P value for interaction = 0.03). There was, on average, a 9% increase in the overall risk of graft loss per hour of total ischemic time (adjusted hazard ratio, 1.09; 95% confidence interval, 1.01-1.18; P = 0.02) in recipients of older donation after circulatory death grafts. Conclusions: There is a clinically important interaction between donor age, the pathway of donor death, and total ischemic time on graft outcomes, such that the duration of ischemic time has the greatest impact on graft survival in recipients of older donation after circulatory death kidneys.

    Alhamad, Tarek; Malone, Andrew F.; Lentine, Krista L.; Brennan, Daniel C.; Wellen, Jason; Chang, Su-Hsin; Chakkera, Harini A.
    Selected Mildly Obese Donors Can Be Used Safely in Simultaneous Pancreas and Kidney Transplantation
    Background: Donor obesity, defined as a donor body mass index (D-BMI) of 30 kg/m2 or greater, has been associated with increased risk of technical failure and poor pancreas allograft outcomes. Many transplant centers set a threshold of D-BMI of 30 kg/m2 to decline donor offers for pancreas transplantation. However, no previous studies differentiate the impact of mild (D-BMI, 30-35 kg/m2) versus severe obesity (D-BMI ≥ 35 kg/m2) on pancreas allograft outcomes. Methods: We examined Organ Procurement and Transplantation Network database records for 9916 simultaneous pancreas-kidney transplants (SPKT) performed between 2000 and 2013. We categorized D-BMI into 4 groups: 20 to 25 (n = 5724), 25 to 30 (n = 3303), 30 to 35 (n = 751), and 35 to 50 kg/m2 (n = 138). Associations of D-BMI with pancreas and kidney allograft failure were assessed by multivariate Cox regression adjusted for recipient, donor, and transplant factors. Results: Compared with D-BMI 20 to 25 kg/m2, only D-BMI 35 to 50 kg/m2 was associated with significantly higher pancreas allograft (adjusted hazard ratio [aHR], 1.37; 95% confidence interval [CI], 1.04-1.79) and kidney allograft (aHR, 1.36; CI, 1.02-1.82) failure over the study period (13 years). Donor BMI 30 to 35 kg/m2 did not impact pancreas allograft (aHR, 0.99; CI, 0.86-1.37) or kidney allograft (aHR, 0.98; CI, 0.84-1.15) failure. Similar patterns were noted at 3 months and at 1, 5, and 10 years posttransplant. Conclusions: These data support that pancreata from mildly obese donors (BMI, 30-35 kg/m2) can be safely used for transplantation, with short-term and long-term outcomes comparable to organs from lean donors. Consideration of pancreata from obese donors may decrease the pancreas discard rate.

    Gordon, Elisa J.; Sohn, Min-Woong; Chang, Chih-Hung; McNatt, Gwen; Vera, Karina; Beauvais, Nicole; Warren, Emily; Mannon, Roslyn B.; Ison, Michael G.
    Effect of a Mobile Web App on Kidney Transplant Candidates' Knowledge About Increased Risk Donor Kidneys: A Randomized Controlled Trial
    Background: Kidney transplant candidates (KTCs) must provide informed consent to accept kidneys from increased risk donors (IRDs), but often poorly understand them. We conducted a multisite, randomized controlled trial to evaluate the efficacy of a mobile Web application, Inform Me, for increasing knowledge about IRDs. Methods: Kidney transplant candidates undergoing transplant evaluation at 2 transplant centers were randomized to use Inform Me after routine transplant education (intervention) or to routine transplant education alone (control). A computer adaptive learning method reinforced learning by embedding educational material and initial (test 1) and additional (test 2) test questions into each chapter. Knowledge (the primary outcome) was assessed in person after education (tests 1 and 2) and 1 week later by telephone (test 3). Controls did not receive test 2. Willingness to accept an IRD kidney (the secondary outcome) was assessed after tests 1 and 3. Linear regression of test 1 knowledge scores was used to test the significance of Inform Me exposure after controlling for covariates. Multiple imputation was used for the intention-to-treat analysis. Results: Two hundred eighty-eight KTCs participated. Intervention participants had higher test 1 knowledge scores than control participants (mean difference, 6.61; 95% confidence interval [95% CI], 5.37-7.86), representing a 44% higher score. Intervention participants' knowledge scores increased further with educational reinforcement (test 2) compared with control arm test 1 scores (mean difference, 9.50; 95% CI, 8.27-10.73). After 1 week, intervention participants' knowledge remained greater than controls' (mean difference, 3.63; 95% CI, 2.49-4.78) (test 3). Willingness to accept an IRD kidney did not differ between study arms at tests 1 and 3. Conclusions: Inform Me use was associated with greater KTC knowledge about IRD kidneys beyond routine transplant education alone.

    Marlais, Matko; Pankhurst, Laura; Hudson, Alex; Sharif, Khalid; Marks, Stephen D.
    UK National Registry Study of Kidney Donation After Circulatory Death for Pediatric Recipients
    Background: Donation after circulatory death (DCD) kidney transplantation has acceptable renal allograft survival in adults, but there are few data in pediatric recipients. The aim of this study was to determine renal allograft outcomes for pediatric recipients of a DCD kidney. Methods: Data were collected from the UK Transplant Registry held by National Health Service Blood and Transplant. Kidney transplants performed for pediatric recipients (age < 18 years) in the United Kingdom from 2000 to 2014 were separated into DCD, donation after brain death (DBD), and living donor (LD) transplants, analyzing 3-year patient and renal allograft survival. Results: One thousand seven hundred seventy-two kidney-only transplants were analyzed. Twenty-one (1.2%) of these were from DCD donors, 955 (53.9%) from DBD donors, and 796 (44.9%) from LDs. Patient survival was 100% in the DCD group, 98.7% in the DBD group, and 98.9% in the LD group. Three-year renal allograft survival was 95.2% in the DCD group, 87.1% in the DBD group, and 92.9% in the LD group. There was no significant difference in 3-year renal allograft survival between the DCD and DBD groups (P = 0.42) or the DCD and LD groups (P = 0.84). For DCD, the primary nonfunction rate was 5% and the delayed graft function rate was 25%. Conclusions: Children receiving a DCD kidney transplant have good renal allograft survival at 3-year follow-up, comparable to those receiving a kidney from a DBD donor or an LD. This limited evidence encourages the use of selected DCD kidneys in pediatric transplantation, and DCD allocation algorithms may need to be reviewed in view of this.

    Mucsi, Istvan; Bansal, Aarushi; Jeannette, Michael; Famure, Olusegun; Li, Yanhong; Novak, Marta; Kim, S. Joseph
    Mental Health and Behavioral Barriers in Access to Kidney Transplantation: A Canadian Cohort Study
    Background: A history of mental health (MH) disorders or nonadherence (NA) may be a barrier to completing the work-up (WU) and/or undergoing kidney transplantation (KT), but this has not been well documented. In this work, we analyzed the relationship between a history of MH disorders or NA and the likelihood of completing the WU or undergoing KT. Methods: Patients referred for KT to the Toronto General Hospital from January 1, 2003, to December 31, 2012, who completed a social work assessment were included (n = 1769). The associations between a history of MH disorders or NA and the time from referral to WU completion or KT were examined using Cox proportional hazards models. Results: A history of MH disorders or NA was present in 24% and 18%, respectively. Patients with MH disorders had a 17% lower adjusted hazard of completing the WU within 2 years of referral (hazard ratio [HR], 0.83; 95% confidence interval [95% CI], 0.71-0.97). Similarly, patients with a history of NA had a 21% lower hazard of completing the WU (HR, 0.79; 95% CI, 0.66-0.94). The adjusted HR for KT was 0.88 (95% CI, 0.74-1.05) and 0.79 (95% CI, 0.64-0.97) for MH disorders and NA, respectively. Conclusions: These findings suggest that a history of MH disorders or NA is a potential barrier to KT. Whether targeted psychosocial support can improve access to KT for these patients requires further study.

    Freeman, Michael A.; Pleis, John R.; Bornemann, Kellee R.; Croswell, Emilee; Dew, Mary Amanda; Chang, Chung-Chou H.; Switzer, Galen E.; Langone, Anthony; Mittal-Henkle, Anuja; Saha, Somnath; Ramkumar, Mohan; Adams Flohr, Jareen; Thomas, Christie P.; Myaskovsky, Larissa
    Has the Department of Veterans Affairs Found a Way to Avoid Racial Disparities in the Evaluation Process for Kidney Transplantation?
    Background: Minority groups are affected by significant disparities in kidney transplantation (KT) in Veterans Affairs (VA) and non-VA transplant centers. However, prior VA studies have been limited to retrospective, secondary database analyses that focused on multiple stages of the KT process simultaneously. Our goal was to determine whether disparities during the evaluation period for KT exist in the VA, as has been found in non-VA settings. Methods: We conducted a multicenter longitudinal cohort study of 602 patients undergoing initial evaluation for KT at 4 National VA KT Centers. Participants completed a telephone interview to determine whether, after controlling for medical factors, differences in time to acceptance for transplant were explained by patients' demographic, cultural, psychosocial, or transplant knowledge factors. Results: There were no significant racial disparities in the time to acceptance for KT (log-rank χ2 = 1.04; P = 0.594). Younger age (hazard ratio [HR], 0.98; 95% confidence interval [CI], 0.97-0.99), fewer comorbidities (HR, 0.89; 95% CI, 0.84-0.95), being married (HR, 0.81; 95% CI, 0.66-0.99), having private and public insurance (HR, 1.29; 95% CI, 1.03-1.51), and moderate or greater levels of depression (HR, 1.87; 95% CI, 1.03-3.29) predicted a shorter time to acceptance. The influence of preference for type of KT (deceased or living donor) and transplant center location on days to acceptance varied over time. Conclusions: Our results indicate that the VA National Transplant System did not exhibit the racial disparities in evaluation for KT that have been found in non-VA transplant centers.

    Caplan, Arthur L.; Kimberly, Laura L.; Parent, Brendan; Sosin, Michael; Rodriguez, Eduardo D.
    The Ethics of Penile Transplantation: Preliminary Recommendations
    Background: For men with significant genitourinary injury, penile transplantation is being considered as an option when reconstruction is not feasible or proves unacceptable to the injured patient. Methods: A review of the literature was conducted to assess the current state of penile reconstruction and transplantation options, as well as to evaluate scholarly research addressing the ethical dimensions of penile transplantation. Results: The state of penile transplantation is elementary. If reconstruction is not a possibility, proceeding ethically with research on penile vascularized composite allotransplantation will require the articulation of guidelines. To date, very little has been published in the scholarly literature assessing the ethics of penile transplantation. Conclusions: Guidelines should be developed to address penile transplantation and must cover the donation of tissue, consent, subject selection, qualifications of the surgical team, and management of both failure and patient dissatisfaction. Unless guidelines are established and disseminated, penile transplants should not be undertaken. The preliminary recommendations suggested in this article may help to inform development of guidelines.

    Taylor, Craig J.; Kosmoliaptsis, Vasilis; Martin, Jessie; Knighton, Graham; Mallon, Dermot; Bradley, J. Andrew; Peacock, Sarah
    Technical Limitations of the C1q Single-Antigen Bead Assay to Detect Complement Binding HLA-Specific Antibodies
    Background: Solid-phase assays to distinguish complement-binding from noncomplement-binding HLA-specific antibodies have been introduced, but technical limitations may compromise their interpretation. We examined the extent to which C1q binding to HLA class I single-antigen beads (SAB) is influenced by denatured HLA on SAB, antibody titre, and complement interference that causes a misleadingly low assessment of HLA-specific antibody levels. Methods: Sera from 25 highly sensitized patients were tested using Luminex IgG-SAB and C1q-SAB assays. Sera were tested undiluted, at 1:20 dilution to detect high-level IgG, and after ethylenediaminetetraacetic acid (EDTA) treatment to obviate complement interference. Conformational HLA and denatured HLA protein levels on SAB were determined using W6/32 and HC-10 monoclonal antibodies, respectively. Denatured HLA was expressed as HC-10 binding to untreated SAB as a percentage of maximal binding to acid-treated SAB. Results: For undiluted sera, Luminex mean fluorescence intensity (MFI) values for IgG-SAB and C1q-SAB correlated poorly (r2 = 0.42). EDTA treatment and serum dilution improved the correlation (r2 = 0.57 and 0.77, respectively). Increasing levels of denatured HLA interfered with the detection of C1q binding. Consequently, the correlation between IgG-SAB MFI and C1q-SAB MFI was lowest using undiluted sera and SAB with greater than 30% denatured HLA (r2 = 0.40) and highest using diluted sera and SAB with 30% or less denatured HLA (r2 = 0.86). Conclusions: Antibody level, complement interference, and denatured HLA class I on SAB may all affect the clinical interpretation of the C1q-SAB assay. The C1q-SAB assay represents a substantial additional cost for routine clinical use, and we question its justification given the potential uncertainty about its interpretation.

    Reinsmoen, Nancy L.; Mirocha, James; Ensor, Christopher R.; Marrari, Marilyn; Chaux, George; Levine, Deborah J.; Zhang, Xiaohai; Zeevi, Adriana
    A 3-Center Study Reveals New Insights Into the Impact of Non-HLA Antibodies on Lung Transplantation Outcome
    Background: The presence of antibodies to angiotensin type 1 receptor (AT1R) and endothelin type A receptor (ETAR) is associated with allograft rejection in kidney and heart transplantation. The aim of our study was to determine the impact of AT1R and ETAR antibodies on graft outcome in lung transplantation. Methods: Pretransplant and posttransplant sera from 162 lung recipients transplanted at 3 centers between 2011 and 2013 were tested for antibodies to AT1R and ETAR by enzyme-linked immunosorbent assay (ELISA). Clinical parameters analyzed were HLA antibodies at transplant, de novo donor-specific antibodies (DSA), antibody-mediated rejection (AMR), acute cellular rejection, and graft status. Results: Late AMR (median posttransplant day 323) was diagnosed in 5 of 36 recipients with de novo DSA. Freedom from AMR was significantly decreased for recipients with strong/intermediate-binding antibodies to AT1R (P = 0.014) and ETAR (P = 0.005). Trends toward lower freedom from acute cellular rejection were observed for recipients with pretransplant antibodies to AT1R (P = 0.19) and ETAR (P = 0.32) but did not reach statistical significance. Lower freedom from the development of de novo DSA was observed for recipients with pretransplant antibodies to AT1R (P = 0.054), ETAR (P = 0.012), and HLA (P = 0.063). When pretransplant HLA-specific antibody status (hazard ratio [HR], 1.69) was considered together with strong binding to either AT1R or ETAR, an increased negative impact on freedom from the development of de novo DSA was observed (HR, 2.26 for HLA antibodies and AT1R; HR, 2.38 for HLA antibodies and ETAR). Conclusions: These results illustrate the increased negative impact when antibodies to both HLA and non-HLA antigens are present pretransplant.

    Moreno Gonzales, Manuel A.; Gandhi, Manish J.; Schinstock, Carrie A.; Moore, Natalie A.; Smith, Byron H.; Braaten, Nong Y.; Stegall, Mark D.
    32 Doses of Bortezomib for Desensitization Is Not Well Tolerated and Is Associated With Only Modest Reductions in Anti-HLA Antibody
    Background: We previously showed that bortezomib (BTZ) partially depletes plasma cells yet has limited efficacy for desensitization in kidney transplant candidates when up to 16 doses are given. Methods: This study aimed to determine the safety and efficacy of 32 doses of BTZ (1.3 mg/m2 of body surface area) in 10 highly sensitized kidney transplant candidates with alloantibodies against their intended living donor. Results: Dose reduction was needed in 2 patients, and 2 others discontinued therapy completely for adverse events. Anti-HLA antibody mean fluorescence intensity (MFI) values were stable before BTZ (P = 0.96) but decreased after therapy (mean decrease of 1916 [SE, 425] MFI; P < 0.01). No patient developed a negative crossmatch against their original intended donor, and the calculated panel-reactive antibodies based on MFI thresholds of 2000, 4000, and 8000 were unchanged in all patients. Conclusions: These data suggest that a 32-dose course of BTZ monotherapy was not well tolerated and resulted in only a modest reduction in anti-HLA antibodies.

    Butts, Ryan; Davis, Melanie; Savage, Andrew; Burnette, Ali; Kavarana, Minoo; Bradley, Scott; Atz, Andrew; Nietert, Paul J.
    Effect of Induction Therapy on Graft Survival in Primary Pediatric Heart Transplantation: A Propensity Score Analysis of the UNOS Database
    Background: The use of induction therapy in pediatric heart transplantation has increased. The aim of this study was to investigate the effect of induction therapy on graft survival. Methods: The United Network for Organ Sharing database was queried for isolated pediatric heart transplants from January 1, 1994, to December 31, 2013. Propensity scores for induction treatment were calculated by estimating the probability of induction using a logistic regression model. Transplants were then matched between induction treatment groups based on the propensity score, reducing potential biases. Using only propensity score-matched transplants, the effect of induction therapy on graft survival was investigated using Cox proportional hazards models. Subgroup analyses were performed based on age, race, recipient cardiac diagnosis, HLA, and recipient panel-reactive antibody (PRA). Results: Of 4565 pediatric primary heart transplants from 1994 to 2013, 3741 had complete data for the propensity score calculation. There were 2792 transplants successfully matched (induction, n = 1396; no induction, n = 1396). There were no significant differences in transplant and pretransplant covariates between the induction and no-induction groups. In the Cox proportional hazards model, the use of induction was not associated with graft loss (hazard ratio [HR], 0.88; 95% confidence interval [95% CI], 0.75-1.01; P = 0.07). In subgroup analyses, induction therapy may be associated with improved survival in patients with PRA greater than 50% (HR, 0.57; 95% CI, 0.34-0.97) and congenital heart disease (HR, 0.78; 95% CI, 0.64-0.96). Conclusions: Induction therapy is not associated with improved graft survival in primary pediatric heart transplantation overall. However, in pediatric heart transplant recipients with PRA greater than 50% or congenital heart disease, induction therapy is associated with improved survival.
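The matching step this study describes (pairing each induction transplant with a no-induction transplant of similar propensity score) can be sketched as a greedy 1:1 nearest-neighbor match within a caliper. This is a simplified illustration, not the study's algorithm; the IDs, scores, and caliper width below are all hypothetical:

```python
# Sketch of 1:1 nearest-neighbor propensity-score matching within a
# caliper. Propensity scores would come from a logistic regression of
# treatment on covariates, as in the study; here they are invented.

def match_pairs(treated, control, caliper=0.05):
    """treated/control: dicts mapping recipient id -> propensity score.
    Greedily pairs each treated unit with the closest unused control
    whose score differs by no more than `caliper`."""
    pairs = []
    unused = dict(control)
    for tid, score in sorted(treated.items(), key=lambda kv: kv[1]):
        best = min(unused, key=lambda cid: abs(unused[cid] - score),
                   default=None)
        if best is not None and abs(unused[best] - score) <= caliper:
            pairs.append((tid, best))
            del unused[best]          # each control is used at most once
    return pairs

induction    = {"A": 0.31, "B": 0.62, "C": 0.90}   # treated arm
no_induction = {"X": 0.30, "Y": 0.60, "Z": 0.55}   # control pool
print(match_pairs(induction, no_induction))        # "C" finds no match within the caliper
```

Unmatched transplants (like "C" here) are dropped, which is why only 2792 of 3741 transplants entered the study's matched analysis.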

    Gharibi, Zahra; Ayvaci, Mehmet U.S.; Hahsler, Michael; Giacoma, Tracy; Gaston, Robert S.; Tanriover, Bekir
    Cost-Effectiveness of Antibody-Based Induction Therapy in Deceased Donor Kidney Transplantation in the United States
    imageBackground: Induction therapy in deceased donor kidney transplantation is costly, with wide discrepancy in utilization and a limited evidence base, particularly regarding cost-effectiveness. Methods: We linked the United States Renal Data System data set to Medicare claims to estimate cumulative costs, graft survival, and incremental cost-effectiveness ratio (ICER – cost per additional year of graft survival) within 3 years of transplantation in 19 450 deceased donor kidney transplantation recipients with Medicare as primary payer from 2000 to 2008. We divided the study cohort into high-risk (age > 60 years, panel-reactive antibody > 20%, African American race, Kidney Donor Profile Index > 50%, cold ischemia time > 24 hours) and low-risk (not having any risk factors, comprising approximately 15% of the cohort). After the elimination of dominated options, we estimated expected ICER among induction categories: no-induction, alemtuzumab, rabbit antithymocyte globulin (r-ATG), and interleukin-2 receptor-antagonist. Results: No-induction was the least effective and most costly option in both risk groups. Depletional antibodies (r-ATG and alemtuzumab) were more cost-effective across all willingness-to-pay thresholds in the low-risk group. For the high-risk group and its subcategories, the ICER was very sensitive to the graft survival; overall both depletional antibodies were more cost-effective, mainly for higher willingness to pay threshold (US $100 000 and US $150 000). Rabbit ATG appears to achieve excellent cost-effectiveness acceptability curves (80% of the recipients) in both risk groups at US $50 000 threshold (except age > 60 years). In addition, only r-ATG was associated with graft survival benefit over no-induction category (hazard ratio, 0.91; 95% confidence interval, 0.84-0.99) in a multivariable Cox regression analysis. Conclusions: Antibody-based induction appears to offer substantial advantages in both cost and outcome compared with no-induction. 
Overall, depletional induction (preferably r-ATG) appears to offer the greatest benefits.
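The core cost-effectiveness logic described in this abstract can be sketched in a few lines: strictly dominated strategies (costlier and less effective than another) are removed, then each remaining strategy's ICER is computed against the next cheaper one. This is a minimal illustration only; the costs and graft-survival figures below are invented, not the study's data, and the study's actual model is far richer.

```python
def remove_dominated(options):
    """Keep only strategies not strictly dominated by another.
    options: list of (name, cost, effectiveness) tuples."""
    kept = []
    for name, cost, eff in options:
        dominated = any(
            (c < cost and e >= eff) or (c <= cost and e > eff)
            for n, c, e in options if n != name
        )
        if not dominated:
            kept.append((name, cost, eff))
    return kept

def icers(options):
    """ICER of each strategy vs the next cheaper non-dominated one,
    i.e., incremental cost per additional year of graft survival."""
    frontier = sorted(remove_dominated(options), key=lambda o: o[1])
    return {
        n1: (c1 - c0) / (e1 - e0)
        for (_n0, c0, e0), (n1, c1, e1) in zip(frontier, frontier[1:])
    }

strategies = [            # (name, 3-year cost in US$, graft-survival years) -- made up
    ("no-induction", 100_000, 2.40),
    ("IL2RA",        105_000, 2.60),
    ("alemtuzumab",  104_000, 2.65),
    ("r-ATG",        110_000, 2.70),
]
print(icers(strategies))  # IL2RA is dominated by alemtuzumab and drops out
```

With these invented numbers, alemtuzumab's ICER vs no induction is $16 000 per graft-year and r-ATG's ICER vs alemtuzumab is $120 000 per graft-year, illustrating how the preferred strategy shifts with the willingness-to-pay threshold.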

    Stojanovic, Jelena; Adamusiak, Anna; Kessaris, Nicos; Chandak, Pankaj; Ahmed, Zubir; Sebire, Neil J.; Walsh, Grainne; Jones, Helen E.; Marks, Stephen D.; Mamode, Nizam
    Immune Desensitization Allows Pediatric Blood Group Incompatible Kidney Transplantation
    Background: Blood group incompatible (ABOi) kidney transplantation in children is rare, as pretransplant conditioning remains challenging and concerns persist about a potentially increased risk of rejection. Methods: We describe the results of 11 ABOi pediatric renal transplant recipients in the 2 largest centers in the United Kingdom, sharing the same tailored desensitization protocol. Patients with pretransplant titers of 1:8 or greater received rituximab 1 month before transplant; tacrolimus and mycophenolate mofetil were started 1 week before surgery. Antibody removal was performed to reduce titers to 1:8 or less on the day of the operation. No routine postoperative antibody removal was performed. Results: Death-censored graft survival at last follow-up was 100% in the ABOi group and 98% in 50 compatible pediatric transplants. One patient developed grade 2A rejection, successfully treated with antithymocyte globulin. Another patient had a titer rise of 2 dilutions, treated with 1 immunoadsorption session. There was no histological evidence of rejection in the other 9 patients. One patient developed cytomegalovirus and BK viremia, and 2 others EBV and BK viremia. Conclusions: Tailored desensitization in pediatric blood group incompatible kidney transplantation results in excellent outcomes, with graft survival and rejection rates comparable with compatible transplants.

    Kopp, Wouter; van Meel, Marieke; Putter, Hein; Samuel, Undine; Arbogast, Helmut; Schareck, Wolfgang; Ringers, Jan; Braat, Andries
    Center Volume Is Associated With Outcome After Pancreas Transplantation Within the Eurotransplant Region
    Background: Outcome after surgery depends on several factors, among them annual procedural volume. This may also hold in a field as highly complex as pancreas transplantation, yet no study has investigated this relationship in a European setting. Methods: All consecutive pancreas transplantations from January 2008 until December 2013 were included. Donor-, recipient-, and transplant-related factors were analyzed for their association with patient and graft survival. Centers were classified in equally sized groups as low volume (<5 transplantations on average per year in the 5 preceding years), medium volume (5-13/year), or high volume (≥13/year). Results: In the study period, 1276 pancreas transplantations were included. Unadjusted 1-year patient survival was associated with center volume and was best in high volume centers compared with medium and low volume centers: 96.5%, 94%, and 92.3%, respectively (P = 0.017). Pancreas donor risk index (PDRI) was highest in high volume centers: 1.38, versus 1.21 in medium and 1.25 in low volume centers (P < 0.001). Pancreas graft survival at 1 year did not differ significantly between volume categories: 86%, 83.2%, and 81.6%, respectively (P = 0.114). In multivariate Cox regression analysis, higher PDRI (hazard ratio [HR], 1.60; P < 0.001), retransplantation (HR, 1.91; P = 0.002), and higher recipient body mass index (HR, 1.04; P = 0.024) were risk factors for pancreas graft failure. High center volume was protective against graft failure (HR, 0.70; P = 0.037) compared with low center volume. Conclusion: Patient and graft survival after pancreas transplantation are superior in higher volume centers. High volume centers achieve good results even though they transplant organs with the highest PDRI.

    Shin, Sung; Jung, Chang Hee; Choi, Ji Yoon; Kwon, Hyun Wook; Jung, Joo Hee; Kim, Young Hoon; Han, Duck Jong
    Long-term Metabolic Outcomes of Functioning Pancreas Transplants in Type 2 Diabetic Recipients
    Background: Limited data are available regarding the long-term metabolic outcomes of functioning pancreas transplants in patients with type 2 diabetes mellitus (T2DM). Methods: To compare the long-term effects of pancreas transplantation on insulin resistance and β-cell function, metabolic variables were compared between type 1 diabetes mellitus (T1DM) and T2DM patients from 1 month to 5 years posttransplant using generalized linear mixed models for repeated measures. Results: Among 217 consecutive patients who underwent pancreas transplantation at our center between August 2004 and January 2015, 193 patients (151 T1DM and 42 T2DM) were included in this study. Throughout the follow-up period, postoperative hemoglobin A1c did not differ significantly between T1DM and T2DM patients, and levels remained below 6% (42 mmol/mol) until 5 years posttransplant, whereas C-peptide was significantly higher in T2DM (P = 0.014). There was no difference in fasting insulin, homeostasis model assessment (HOMA) of insulin resistance, HOMA of β-cell function, or the insulinogenic index between the groups. Furthermore, fasting insulin and HOMA insulin resistance steadily decreased in both groups during the follow-up period. Conclusions: There was no significant difference in insulin resistance or β-cell function after pancreas transplantation between T1DM and T2DM patients. We demonstrated that pancreas transplantation can sustain favorable endocrine function for more than 5 years in T2DM recipients.
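The HOMA indices compared in this study are simple arithmetic on fasting insulin and glucose. A minimal sketch of the standard Matthews formulation follows; the study's exact assay units and software may differ, and the example values are illustrative, not study data.

```python
def homa_ir(insulin_uU_ml, glucose_mmol_l):
    """HOMA of insulin resistance from fasting insulin (uU/mL)
    and fasting glucose (mmol/L)."""
    return insulin_uU_ml * glucose_mmol_l / 22.5

def homa_beta(insulin_uU_ml, glucose_mmol_l):
    """HOMA estimate of beta-cell function, in percent
    (defined for fasting glucose > 3.5 mmol/L)."""
    return 20.0 * insulin_uU_ml / (glucose_mmol_l - 3.5)

# Fasting insulin 10 uU/mL and glucose 5.0 mmol/L (illustrative values):
print(round(homa_ir(10, 5.0), 2))    # ~2.22
print(round(homa_beta(10, 5.0), 1))  # ~133.3
```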

    Lindahl, Jørn Petter; Massey, Richard John; Hartmann, Anders; Aakhus, Svend; Endresen, Knut; Günther, Anne; Midtvedt, Karsten; Holdaas, Hallvard; Leivestad, Torbjørn; Horneland, Rune; Øyen, Ole; Jenssen, Trond
    Cardiac Assessment of Patients With Type 1 Diabetes Median 10 Years After Successful Simultaneous Pancreas and Kidney Transplantation Compared With Living Donor Kidney Transplantation
    Background: In recipients with type 1 diabetes, we aimed to determine whether long-term normoglycemia achieved by successful simultaneous pancreas and kidney (SPK) transplantation could beneficially affect progression of coronary artery disease (CAD) compared with transplantation of a kidney alone from a living donor (LDK). Methods: In 42 kidney transplant recipients with functioning grafts who had received either SPK (n = 25) or LDK (n = 17), we studied angiographic progression of CAD between baseline (pretransplant) and follow-up at 7 years or later. In addition, computed tomography for measurement of coronary artery calcification and echocardiographic assessment of left ventricular systolic function were performed at follow-up. Results: During a median follow-up time of 10.1 years (interquartile range [IQR], 9.1-11.5), progression of CAD occurred at similar rates (10 of 21 cases in the SPK and 5 of 14 cases in the LDK group; P = 0.49). Median coronary artery calcification scores were high in both groups (1767 [IQR, 321-4035] for SPK and 1045 [IQR, 807-2643] for LDK patients; P = 0.59). Left ventricular systolic function did not differ between the 2 groups. The SPK and LDK recipients were similar in age (41.2 ± 6.9 years vs 40.5 ± 10.3 years; P = 0.80) and diabetes duration at engraftment, but mean HbA1c during follow-up differed significantly: 5.5 ± 0.4% for SPK versus 8.3 ± 1.5% for LDK patients (P < 0.001). Conclusions: In patients with both type 1 diabetes and end-stage renal disease, SPK recipients had similar long-term progression of CAD compared with LDK recipients. Calcification of coronary arteries is a prominent feature in both groups long-term posttransplant.

    Holmes-Walker, Deborah Jane; Gunton, Jenny E; Hawthorne, Wayne; Payk, Marlene; Anderson, Patricia; Donath, Susan; Loudovaris, Tom; Ward, Glenn M; Kay, Thomas WH; O'Connell, Philip J
    Islet Transplantation Provides Superior Glycemic Control With Less Hypoglycemia Compared With Continuous Subcutaneous Insulin Infusion or Multiple Daily Insulin Injections
    Background: The aim was to compare the efficacy of multiple daily injections (MDI), continuous subcutaneous insulin infusion (CSII), and islet transplantation in reducing hypoglycemia and glycemic variability in type 1 diabetes subjects with severe hypoglycemia. Methods: This was a within-subject, paired comparison of MDI with CSII, and of CSII with 12 months postislet transplantation, in 10 type 1 diabetes subjects referred with severe hypoglycemia and suitable for islet transplantation. Individuals were assessed with HbA1c, Edmonton Hypoglycemia Score (HYPOscore), continuous glucose monitoring (CGM), and, in 8 subjects, measurements of glucose variability using the standard deviation of glucose (SD glucose) from CGM and continuous overlapping net glycemic action over a 4-hour interval (CONGA4). Results: After changing from MDI to CSII before transplantation, the 10 subjects reduced their median HYPOscore from 2028 to 1085 (P < 0.05) and hypoglycemia events from 24 to 8 per patient-year (P < 0.05). Whereas HbA1c, mean glucose, and median percent time hypoglycemic on CGM were unchanged with CSII, SD glucose and CONGA4 decreased significantly (P < 0.05). At 12 months posttransplant, 9 of 10 were C-peptide positive (5 insulin independent). Twelve months postislet transplantation, there were significant reductions in all baseline parameters versus CSII: HbA1c (6.4% vs 8.2%), median HYPOscore (0 vs 1085), mean glucose (7.1 vs 8.6 mmol/L), SD glucose (1.7 vs 3.2 mmol/L), and CONGA4 (1.6 vs 3.0). Conclusions: In subjects with severe hypoglycemia suitable for islet transplantation, CSII decreased hypoglycemia frequency and glycemic variability compared with MDI, whereas islet transplantation resolved hypoglycemia and further improved glycemic variability regardless of insulin independence.
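The two variability metrics used here are computable directly from a CGM trace: SD glucose is the ordinary standard deviation of all readings, while CONGA_n (per the McDonnell et al. definition) is the standard deviation of the differences between each reading and the reading n hours earlier. A minimal sketch, assuming evenly spaced readings with no gaps (the trace below is invented):

```python
from statistics import stdev

def conga(glucose, interval_min=5, hours=4):
    """CONGA_n: SD of differences between each CGM reading and the
    reading n hours earlier. Sketch only; assumes a gap-free,
    evenly sampled trace."""
    lag = int(hours * 60 / interval_min)
    diffs = [later - earlier for earlier, later in zip(glucose, glucose[lag:])]
    return stdev(diffs)

# A perfectly repeating 5-minute trace: overall SD glucose is nonzero,
# but CONGA4 is zero because every reading equals the one 4 hours earlier,
# showing that the two metrics capture different aspects of variability.
trace = [5.0, 7.0] * 60            # 10 hours of alternating readings (mmol/L)
print(stdev(trace) > 0, conga(trace) == 0.0)
```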

    Lam, Ngan N.; Schnitzler, Mark A.; Segev, Dorry L.; Hess, Gregory P.; Kasiske, Bertram L.; Randall, Henry B.; Axelrod, David; Xiao, Huiling; Garg, Amit X.; Brennan, Daniel C.; Lentine, Krista L.
    Diabetes Mellitus in Living Pancreas Donors: Use of Integrated National Registry and Pharmacy Claims Data to Characterize Donation-Related Health Outcomes
    Background: Living donor pancreas transplant is a potential treatment for diabetic patients with end-organ complications. Although early surgical risks of donation have been reported, long-term medical outcomes in living pancreas donors are not known. Methods: We integrated national Scientific Registry of Transplant Recipients data (1987-2015) with records from a nationwide pharmacy claims warehouse (2005-2015) to examine prescriptions for diabetes medications and supplies as a measure of postdonation diabetes mellitus. To compare outcomes in controls with baseline good health, we matched living pancreas donors to living kidney donors (1:3) by demographic traits and year of donation. Results: Among 73 pancreas donors in the study period, 45 were identified in the pharmacy database: 62% women, 84% white, and 80% relatives of the recipient. Over a mean postdonation follow-up period of 16.3 years, 26.7% of pancreas donors filled prescriptions for diabetes treatments, compared with 5.9% of kidney donors (odds ratio, 4.13; 95% confidence interval, 1.91-8.93; P = 0.0003). Use of insulin (11.1% vs 0%) and oral agents (20.0% vs 5.9%; odds ratio, 4.50; 95% confidence interval, 2.09-9.68; P = 0.0001) was also higher in pancreas donors. Conclusions: Diabetes is more common after living pancreas donation than after living kidney donation, supporting clinical consequences from reduced endocrine reserve.
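The effect measures in this abstract are odds ratios with 95% confidence intervals. A minimal sketch of the Wald calculation from a 2×2 table follows; the counts are invented for illustration, and the actual registry analysis also involves matching, which this sketch ignores.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a events / b non-events in the exposed group,
    c events / d non-events in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# 10/90 with the outcome among exposed vs 5/95 among unexposed (made-up):
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```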

    Eide, Ivar Anders; Halden, Thea Anine Strøm; Hartmann, Anders; Dahle, Dag Olav; Åsberg, Anders; Jenssen, Trond
    Associations Between Posttransplantation Diabetes Mellitus and Renal Graft Survival
    Background: Previous reports indicate that posttransplantation diabetes mellitus (PTDM) is associated with overall renal graft loss, but not death-censored graft loss. Methods: In this single-center retrospective cohort study of 2749 adult Norwegian renal transplant recipients, transplanted between 1999 and 2011, we estimated overall and death-censored renal graft loss hazard ratios in patients diagnosed with PTDM, impaired glucose tolerance and diabetes before transplantation, using multivariable Cox proportional hazard regression analysis. Results: A total of 893 renal grafts were lost during the study period, either due to recipient death (n = 540) or death-censored graft loss (n = 353). When the observational time started at time of transplantation, diabetes before transplantation was associated with both overall and death-censored graft loss. Pretransplantation diabetes was also associated with a steeper decline in renal graft function, a higher risk of acute rejections and more renal grafts lost due to acute rejection. In patients with a functional renal graft 1 year after transplantation, PTDM was associated with overall graft loss (hazard ratio, 1.46; 95% confidence interval, 1.13-1.88; P < 0.001), but not death-censored graft loss (hazard ratio, 1.25; 95% confidence interval, 0.80-1.96; P = 0.33). We found no significant associations between PTDM and change in renal function during the first 5 years or acute rejection risk during the first year after renal transplantation. Impaired glucose tolerance was not associated with either overall or death-censored graft loss. Conclusions: The present study confirms previous findings of an increased risk of overall but not death-censored renal graft loss in renal transplant recipients with PTDM. Longstanding diabetes might increase the risk of acute rejections.

    Pérez-Sáez, María José; Herrera, Sabina; Prieto-Alhambra, Daniel; Nogués, Xavier; Vera, María; Redondo-Pachón, Dolores; Mir, Marisa; Güerri, Roberto; Crespo, Marta; Díez-Pérez, Adolfo; Pascual, Julio
    Bone Density, Microarchitecture, and Tissue Quality Long-term After Kidney Transplant
    Background: Bone mineral density (BMD) measured by dual-energy x-ray absorptiometry is used to assess bone health in kidney transplant recipients (KTR). Trabecular bone score and in vivo microindentation are novel techniques that directly measure trabecular microarchitecture and mechanical properties of bone at a tissue level and independently predict fracture risk. We tested the bone status of long-term KTR using all 3 techniques. Methods: This cross-sectional study included 40 KTR with more than 10 years of follow-up and 94 healthy nontransplanted subjects as controls. Bone mineral density was measured at the lumbar spine and the hip. Trabecular bone score was measured by specific software on the dual-energy x-ray absorptiometry scans of the lumbar spine in 39 KTR and 77 controls. Microindentation was performed at the anterior tibial face with a reference-point indenter device. Bone measurements were standardized as percentage of a reference value, expressed as bone material strength index (BMSi) units. Multivariable (age, sex, and body mass index-adjusted) linear regression models were fitted to study the association between KTR and BMD/BMSi/trabecular bone score. Results: Bone mineral density was lower at the lumbar spine (0.925 ± 0.15 vs 0.982 ± 0.14; P = 0.025), total hip (0.792 ± 0.14 vs 0.902 ± 0.13; P < 0.001), and femoral neck (0.667 ± 0.13 vs 0.775 ± 0.12; P < 0.001) in KTR than in controls. BMSi was also lower in KTR (79.1 ± 7.7 vs 82.9 ± 7.8; P = 0.012), although this difference disappeared after adjustment (P = 0.145). Trabecular bone score was borderline lower (1.21 ± 0.14 vs 1.3 ± 0.15; adjusted P = 0.072) in KTR. Conclusions: Despite a persistent decrease in BMD, trabecular microarchitecture and tissue quality remain normal in long-term KTR, suggesting important recovery of bone health.

    Hellström, Vivan; Lorant, Tomas; Döhler, Bernd; Tufveson, Gunnar; Enblad, Gunilla
    High Posttransplant Cancer Incidence in Renal Transplanted Patients With Pretransplant Cancer
    Background: Patients with previous cancer have increasingly been accepted for renal transplantation. Posttransplant cancer risk and survival rates of these patients are unknown. Our objective was to assess the risk of posttransplant cancer in this patient group. Methods: In this retrospective, nested case-control study, we assessed the outcome of all (n = 95) renal transplanted patients with pretransplant cancer diagnoses in the Uppsala-Örebro region, Sweden. One control group was obtained from the Collaborative Transplant Study registry and included European patients without pretransplant cancer. The other control group comprised the entire renal transplanted population in Uppsala. Development of recurrent cancer, de novo cancer, and patient survival were determined. Results: Patients with pretransplant cancer showed a higher incidence of posttransplant cancers and shorter survival than the control groups (P < 0.001). No obvious pattern in malignant diagnoses was observed. Death-censored graft survival was unaffected. Conclusions: Despite previously adequate cancer treatments and favorable prognoses, almost half of the patients experienced a posttransplant cancer. These observations do not justify withholding transplantation from all patients with previous malignancies, because more than 50% of the patients survive 10 years posttransplantation. Careful oncological surveillance both pretransplant and posttransplant is recommended.

    Kang, Woosun; Sampaio, Marcelo Santos; Huang, Edmund; Bunnapradist, Suphamai
    Association of Pretransplant Skin Cancer With Posttransplant Malignancy, Graft Failure and Death in Kidney Transplant Recipients
    Background: Posttransplant malignancy (PTM) is one of the leading causes of late death in kidney recipients. Those with a cancer history may be more prone to develop a recurrent or a new cancer. We studied the association between pretransplant skin cancer, PTM, death, and graft failure. Methods: Primary adult kidney recipients transplanted between 2005 and 2013 were included. Malignancy information was obtained from Organ Procurement Kidney Transplant Network/United Network for Organ Sharing registration and follow-up forms. Posttransplant malignancy was classified into skin cancer, solid tumor, and posttransplant lymphoproliferative disorder (PTLD). Competing risk and survival analysis with adjustment for confounders were used to calculate risk for PTM, death and graft failure in recipients with pretransplant skin cancer compared with those without cancer. Risk was reported in hazard ratios (HR) with 95% confidence interval (CI). Results: The cohort included 1671 recipients with and 102 961 without pretransplant skin malignancy. The 5-year cumulative incidence of PTM in patients with and without a pretransplant skin cancer history was 31.6% and 7.4%, respectively (P < 0.001). Recipients with pretransplant skin cancer had increased risk of PTM (sub-HR [SHR], 2.60; 95% CI, 2.27-2.98), posttransplant skin cancer (SHR, 2.92; 95% CI, 2.52-3.39), PTLD (SHR, 1.93; 95% CI, 1.01-3.66), solid tumor (SHR, 1.44; 95% CI, 1.04-1.99), death (HR, 1.20; 95% CI, 1.07-1.34), and graft failure (HR, 1.17; 95% CI, 1.05-1.30) when compared with those without pretransplant malignancy. Conclusions: Pretransplant skin cancer was associated with an increased risk of posttransplant skin cancer, PTLD, solid organ cancer, death and graft failure.

    Watanabe, Takuya; Seguchi, Osamu; Yanase, Masanobu; Fujita, Tomoyuki; Murata, Yoshihiro; Sato, Takuma; Sunami, Haruki; Nakajima, Seiko; Kataoka, Yu; Nishimura, Kunihiro; Hisamatsu, Eriko; Kuroda, Kensuke; Okada, Norihiro; Hori, Yumiko; Wada, Kyoichi; Hata, Hiroki; Ishibashi-Ueda, Hatsue; Miyamoto, Yoshihiro; Fukushima, Norihide; Kobayashi, Junjiro; Nakatani, Takeshi
    Donor-Transmitted Atherosclerosis Associated With Worsening Cardiac Allograft Vasculopathy After Heart Transplantation: Serial Volumetric Intravascular Ultrasound Analysis
    Background: The influence of preexisting donor-transmitted atherosclerosis (DA) on cardiac allograft vasculopathy (CAV) development remains unclear. Methods: We performed 3-dimensional intravascular ultrasound (3D-IVUS) analysis in 42 heart transplantation (HTx) recipients at 2.1 ± 0.9 months (baseline) and 12.2 ± 0.4 months post-HTx, as well as consecutive 3D-IVUS analyses up to 3 years post-HTx in 35 of the 42 recipients. Donor-transmitted atherosclerosis was defined as a maximal intimal thickness of 0.5 mm or greater at baseline. Changes in volumetric IVUS parameters were compared in recipients with (DA group) and without DA (DA-free group) at baseline, 1 year, and 3 years post-HTx. Results: Donor-transmitted atherosclerosis was observed in 57.1% of 42 recipients. The DA group exhibited a significantly greater increase in plaque volume at 1 year post-HTx (P < 0.001), leading to increased percent plaque volume (plaque volume/vessel volume, %) (P < 0.001) and decreased luminal volume (P = 0.021). Donor-transmitted atherosclerosis was independently associated with a greater increase in percent plaque volume during the first post-HTx year (P = 0.011). From 1 to 3 years post-HTx, the DA group underwent continuous reduction in luminal volume (P = 0.022). These changes resulted in a higher incidence of angiographic CAV at 3 years post-HTx in the DA group (58.8% vs 5.6%, P = 0.002). Conclusions: This volumetric IVUS study suggests that DA correlates with worsening CAV several years post-HTx. Donor-transmitted atherosclerosis recipients may require more aggressive treatment to prevent subsequent CAV progression.

    Hernández, Domingo; Castro de la Nuez, Pablo; Muriel, Alfonso; Ruiz-Esteban, Pedro; Rudas, Edisson; González-Molina, Miguel; Burgos, Dolores; Cabello, Mercedes; Palma, Eulalia; Gutiérrez, Elena; Alonso, Manuel
    Peripheral Vascular Disease and Death in Southern European Kidney Transplant Candidates: A Competing Risk Modeling Approach
    Background: The association between peripheral vascular disease (PVD) and survival among kidney transplant (KT) candidates is uncertain. Methods: We assessed 3851 adult KT candidates from the Andalusian Registry between 1984 and 2012; 1975 patients received a KT and were censored at transplantation, and 1876 remained on the waiting list. Overall median waitlist time was 21.2 months (interquartile range, 11-37.4). We assessed the association between PVD and mortality in waitlisted patients using a multivariate Cox regression model, with a competing risk approach as a sensitivity analysis. Results: Peripheral vascular disease was present in 308 KT candidates at waitlist entry. The prevalence of PVD among nondiabetic and diabetic patients was 4.5% and 25.3%, respectively (P < 0.0001). All-cause mortality was higher in candidates with PVD (45% vs 21%; P < 0.0001). Among waitlisted patients (n = 1876) who died (n = 446; 24%), 272 (61%) died within 2 years after listing. Cumulative incidence of all-cause mortality at 2 years in patients with and without PVD was 23% and 6.4%, respectively (P < 0.0001); similar differences were observed in patients with and without diabetes. In competing risk models, PVD was associated with a 1.9-fold increased risk of mortality (95% confidence interval [95% CI], 1.4-2.5). This association was stronger in waitlisted patients without cardiac disease (subhazard ratio, 2.2; 95% CI, 1.6-3.1) than in those with cardiac disorders (subhazard ratio, 1.5; 95% CI, 0.9-2.5). No other significant interactions were observed. Similar results were seen after excluding diabetics. Conclusions: Peripheral vascular disease is a strong predictor of mortality in KT candidates. Identification of PVD at list entry may help optimize targeted therapeutic interventions and prioritize high-risk KT candidates.
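The "cumulative incidence at 2 years" reported with a competing risk approach is typically estimated nonparametrically: at each event time, the probability of the cause of interest is the current event-free probability times the cause-specific hazard. A simplified Aalen-Johansen-style sketch, with four invented patients rather than registry data:

```python
def cumulative_incidence(times, events, cause, horizon):
    """Nonparametric cumulative incidence of one event type in the
    presence of competing risks (simplified Aalen-Johansen estimator).
    events: 0 = censored, other integer codes = event types."""
    data = sorted(zip(times, events))
    n = len(data)
    surv, cif, i = 1.0, 0.0, 0
    while i < n and data[i][0] <= horizon:
        t, at_risk = data[i][0], n - i
        d_cause = d_all = 0
        while i < n and data[i][0] == t:          # group tied times
            d_cause += data[i][1] == cause
            d_all += data[i][1] != 0
            i += 1
        cif += surv * d_cause / at_risk           # prob. of this cause now
        surv *= 1 - d_all / at_risk               # still event-free overall
    return cif

# 4 waitlisted patients: death (1), transplant (2, competing), censored (0)
times, events = [1, 2, 3, 4], [1, 0, 2, 1]
print(cumulative_incidence(times, events, cause=1, horizon=4))  # 0.625
```

Note how treating transplantation as a competing event (rather than simple censoring) keeps the cause-specific incidences from summing above 1.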

    Kaboré, Rémi; Couchoud, Cécile; Macher, Marie-Alice; Salomon, Rémi; Ranchin, Bruno; Lahoche, Annie; Roussey-Kesler, Gwenaelle; Garaix, Florentine; Decramer, Stéphane; Pietrement, Christine; Lassalle, Mathilde; Baudouin, Véronique; Cochat, Pierre; Niaudet, Patrick; Joly, Pierre; Leffondré, Karen; Harambat, Jérôme
    Age-Dependent Risk of Graft Failure in Young Kidney Transplant Recipients
    Background: The risk of graft failure in young kidney transplant recipients has been found to increase during adolescence and early adulthood. However, this question has not so far been addressed outside the United States. Our objective was to investigate whether the hazard of graft failure also increases during this age period in France, irrespective of age at transplantation. Methods: Data on all first kidney transplantations performed before 30 years of age between 1993 and 2012 were extracted from the French kidney transplant database. The hazard of graft failure was estimated at each current age using a 2-stage modelling approach that accounted for both age at transplantation and time since transplantation. Hazard ratios comparing the risk of graft failure during adolescence or early adulthood with other periods were estimated from time-dependent Cox models. Results: A total of 5983 renal transplant recipients were included. The risk of graft failure increased from around the age of 13 years until the age of 21 years, and decreased thereafter. Results from the Cox model indicated that the hazard of graft failure during the age period 13 to 23 years was almost twice as high as during the age period 0 to 12 years, and 25% higher than after 23 years. Conclusions: Among first kidney transplant recipients younger than 30 years in France, those currently in adolescence or early adulthood have the highest risk of graft failure.

    Harrison, Jennifer J.; Badr, Souzi; Hamandi, Bassem; Kim, Sang Joseph
    Randomized Controlled Trial of a Computer-Based Education Program in the Home for Solid Organ Transplant Recipients: Impact on Medication Knowledge, Satisfaction, and Adherence
    Background: De novo solid organ transplant recipients (SOTR) have a steep learning curve to acquire medication knowledge. Without adequate knowledge, SOTR are at risk of nonadherence and poor transplant outcomes. Methods: In this nonblinded, randomized controlled trial, de novo SOTR received standard teaching with or without postdischarge computer-based education (CBE) at home. Primary outcomes were change in knowledge (quiz and recall) and satisfaction, assessed by questionnaires at baseline and 3 months. Adherence was evaluated via self-report and immunosuppressant levels. Results: Two hundred forty-six patients were randomized and 209 completed the 3-month analysis. In the intervention arm, 73 (57.9%) used the CBE program. Change in knowledge quiz score did not differ between groups (4.9% vs 0.6%; P = 0.084), despite a significant increase within the intervention (72.4% vs 77.3%, P = 0.007) but not the control (76.0% vs 76.6%, P = 0.726) arms. Both groups had a significant improvement in recall (intervention, 56.7% vs 82.1%, P < 0.001; control, 51.3% vs 79.7%, P < 0.001), with similar changes in scores (25.4% vs 28.4%, P = 0.55). Change in satisfaction differed between groups (intervention, 1.2% vs control, −4.9%; P = 0.050). There was a significant decline in satisfaction within the control group (88.4% vs 83.5%, P = 0.035), whereas satisfaction was maintained with the intervention (85.6% vs 86.8%, P = 0.55). Adherence was similar in both groups. Conclusions: Knowledge improved over the study period in both groups, with no incremental benefit for the intervention. Patient satisfaction was maintained with the CBE program. More research is needed to identify barriers to uptake of CBE at home and to develop effective strategies for posttransplant education.

    Frank, Susan J.; Walter, William R.; Latson, Larry Jr; Cohen, Hillel W.; Koenigsberg, Mordecai
    New Dimensions in Renal Transplant Sonography: Applications of 3-Dimensional Ultrasound
    Background: The aim of this study was to demonstrate the usefulness of adding 3-dimensional (3D) ultrasound to the evaluation of renal transplant vasculature, compared with 2-dimensional (2D) duplex ultrasound. Methods: One hundred thirteen consecutive renal transplant 2D and 3D ultrasound examinations were performed and retrospectively reviewed by 2 board-certified radiologists and a radiology resident individually; each reviewed 2D and then 3D images, including color and spectral Doppler. They recorded their ability to visualize the surgical anastomosis and rated visualization on a subjective scale. Interobserver agreement was evaluated. Variant anastomosis anatomy was recorded. Tortuosity or stenosis was evaluated if localized Doppler velocity elevation was present. Results: The reviewers directly visualized the anastomosis more often with 3D ultrasound (97.5%) than with 2D (54.5%) [difference in means (DM) = 43%; 95% confidence interval (CI), 36%-50%; P < 0.001]. The reviewers visualized the anastomosis more clearly with 3D ultrasound (P < 0.001) [differences in medians = 0.5, 1.0, and 1.0; 95% CI, 0.5-1.0, 0.5-1.0, and 1.0-1.5]. Detection of variant anatomy improved with 3D ultrasound for 2 reviewers [DM = 7.1% and 8.9%; 95% CI, 1%-13% and 4%-14%, respectively; P < 0.05]. There was high interobserver agreement (95.3%; 95% CI, 91.9%-98.7%) regarding anastomosis visualization among reviewers with wide-ranging experience. Conclusions: Direct visualization of the entire anastomosis was improved with 3D ultrasound. Three-dimensional evaluation improved detection of anatomic variants and identified tortuosity as the likely cause of borderline localized elevation in Doppler velocity. The data added by 3D ultrasound may obviate confirmatory testing with magnetic resonance angiography or computed tomographic angiography after equivocal 2D ultrasound results.

    Molnar, Miklos Z.; Nguyen, Danh V.; Chen, Yanjun; Ravel, Vanessa; Streja, Elani; Krishnan, Mahesh; Kovesdy, Csaba P.; Mehrotra, Rajnish; Kalantar-Zadeh, Kamyar
    Predictive Score for Posttransplantation Outcomes
    Background: Most current scoring tools to predict allograft and patient survival after kidney transplantation are based on variables collected posttransplantation. We developed a novel score to predict posttransplant outcomes using pretransplant information, including routine laboratory data available before or at the time of transplantation. Methods: Linking the 5-year patient data of a large dialysis organization to the Scientific Registry of Transplant Recipients, we identified 15 125 hemodialysis patients who underwent a first deceased donor transplantation. Prediction models were developed using Cox models for (a) mortality, (b) allograft loss (death censored), and (c) combined death or transplant failure. The cohort was randomly divided into a two-thirds set (Nd = 10 083) for model development and a one-third set (Nv = 5042) for validation. Model predictive discrimination was assessed using the index of concordance, or C statistic, which accounts for censoring in time-to-event models (a-c). We used the bootstrap method to assess model overfitting and calibration in the development dataset. Results: Patients were 50 ± 13 years of age and included 39% women, 15% African Americans, and 36% persons with diabetes. For prediction of posttransplant mortality and graft loss, 10 predictors were used (recipient's age, cause and duration of end-stage renal disease, hemoglobin, albumin, selected comorbidities, race, and type of insurance, as well as donor age, diabetes status, extended criteria donor kidney, and number of HLA mismatches). The new model (www.TransplantScore.com) showed the overall best discrimination (C statistic, 0.70; 95% confidence interval [95% CI], 0.67-0.73 for mortality; 0.63; 95% CI, 0.60-0.66 for graft failure; 0.63; 95% CI, 0.61-0.66 for the combined outcome).
Conclusions: The new prediction tool, using data available before the time of transplantation, predicts relevant clinical outcomes and may perform better to predict patients' graft survival than currently used tools.
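The index of concordance mentioned in the Methods (Harrell's C) can be sketched in a few lines; the data below are hypothetical and for illustration only:

```python
from itertools import combinations

def harrell_c(times, events, risk_scores):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the shorter observed time ends in an
    event; a comparable pair is concordant when the higher-risk patient
    fails first.
    """
    concordant = tied = comparable = 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:       # order so i has the shorter time
            i, j = j, i
        if times[i] == times[j] or not events[i]:
            continue                  # censored first or tied times: skip
        comparable += 1
        if risk_scores[i] > risk_scores[j]:
            concordant += 1
        elif risk_scores[i] == risk_scores[j]:
            tied += 1
    return (concordant + 0.5 * tied) / comparable

# hypothetical example: higher score should mean earlier failure
times  = [2, 5, 7, 9]          # months to death or censoring
events = [1, 1, 0, 1]          # 1 = death observed, 0 = censored
scores = [0.9, 0.6, 0.5, 0.2]  # model risk scores
print(harrell_c(times, events, scores))  # perfectly concordant -> 1.0
```

A C statistic of 0.70, as reported for the mortality model, means the model correctly orders 70% of comparable patient pairs.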

    Date posted: Thursday, 01 January 1970
    David-Neto, Elias; Romano, Paschoalina; Kamada Triboni, Ana Heloisa; Ramos, Fernanda; Agena, Fabiana; Almeida Rezende Ebner, Persio; Altona, Marcelo; Zocoler Galante, Nelson; Brambate Carvalhinho Lemos, Francine
    Longitudinal Pharmacokinetics of Tacrolimus in Elderly Compared With Younger Recipients in the First 6 Months After Renal Transplantation
    Background: Elderly (Eld) (≥60 years) recipients are receiving renal transplants more frequently, yet pharmacokinetic (PK) studies of immunosuppressive drugs in healthy volunteers rarely include older subjects. Methods: We studied 208 twelve-hour tacrolimus (TAC) PK profiles (sampling at 0, 20, 40, 60, 90, 120, 180, 240, 360, 480, 600, and 720 min) in 44 Eld recipients (65 ± 3 years) and compared the results with 31 younger control (Ctrl) recipients (35 ± 6 years), all taking oral TAC/mycophenolate sodium (MPS)/prednisone, at 4 different timepoints: PK1 (8 ± 2 days; n = 72), PK2 (31 ± 4 days; n = 61), PK3 (63 ± 6 days; n = 44), and PK4 (185 ± 10 days; n = 31). Tacrolimus concentrations were measured by ultraperformance liquid chromatography coupled to a mass spectrometer, and noncompartmental PKs were analyzed using Phoenix WinNonlin. Results: Mean TAC dose was lower in the Eld than in the Ctrl group at all timepoints, whether expressed as total daily dose or adjusted (Adj) for body weight. Mean TAC trough level (Cmin), used to adjust the daily dose, did not differ between the 2 groups at any timepoint. AdjCmax and the Adj TAC area under the curve over the dosing interval were both higher in the Eld than in the Ctrl group at PKs 1, 3, and 4. Estimated total body clearance normalized by dose and weight was lower in the Eld than in the Ctrl group in all PKs, and significantly lower at PKs 1 and 3. As in younger recipients, the TAC trough level correlated strongly (R2 = 0.76) with the area under the curve over the dosing interval. Conclusions: These data indicate that Eld recipients have a lower TAC clearance and therefore need a lower TAC dose than younger recipients.
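The area under the curve over the dosing interval used above is conventionally obtained by noncompartmental analysis; a minimal sketch of the linear trapezoidal rule, using the study's sampling times but hypothetical concentrations:

```python
def auc_trapezoid(times_min, concentrations):
    """Linear trapezoidal AUC over the sampling interval.

    Times are in minutes and converted to hours, so with concentrations
    in ng/mL the result is in ng*h/mL.
    """
    auc = 0.0
    for k in range(1, len(times_min)):
        dt_h = (times_min[k] - times_min[k - 1]) / 60.0
        auc += dt_h * (concentrations[k] + concentrations[k - 1]) / 2.0
    return auc

# hypothetical 12-h tacrolimus profile at the study's sampling times
times = [0, 20, 40, 60, 90, 120, 180, 240, 360, 480, 600, 720]        # min
conc  = [5.0, 9.0, 14.0, 16.0, 13.0, 11.0, 9.0, 8.0, 7.0, 6.5, 6.0, 5.5]  # ng/mL
print(round(auc_trapezoid(times, conc), 1))  # -> 95.4
```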

    Date posted: Thursday, 01 January 1970
    Schold, Jesse D.; Miller, Charles M.; Henry, Mitchell L.; Buccini, Laura D.; Flechner, Stuart M.; Goldfarb, David A.; Poggio, Emilio D.; Andreoni, Kenneth A.
    Evaluation of Flagging Criteria of United States Kidney Transplant Center Performance: How to Best Define Outliers?
    Background: Scientific Registry of Transplant Recipients (SRTR) report cards of US organ transplant center performance are publicly available and used for quality oversight. Low-performance (LP) evaluations are associated with changes in practice, including reduced transplant rates and increased waitlist removals. In 2014, the SRTR implemented a new Bayesian methodology to evaluate performance, which was not adopted by the Centers for Medicare and Medicaid Services (CMS). In May 2016, CMS altered its performance criteria, reducing the likelihood of LP evaluations. Methods: Our aims were to evaluate the incidence, survival rates, and volume of LP centers under the Bayesian, historical (old-CMS), and new-CMS criteria, using 6 consecutive program-specific reports (PSRs, January 2013 to July 2015) for adult kidney transplant centers. Results: Bayesian, old-CMS, and new-CMS criteria identified 13.4%, 8.3%, and 6.1% LP PSRs, respectively. Over the 3-year period, 31.9% (Bayesian), 23.4% (old-CMS), and 19.8% (new-CMS) of centers had 1 or more LP evaluations. For small centers (<83 transplants/PSR), there were 4-fold more LP evaluations (52 vs 13 PSRs) for 1-year mortality with Bayesian versus new-CMS criteria. For large centers (>183 transplants/PSR), there were 3-fold more LP evaluations for 1-year mortality with Bayesian versus new-CMS criteria, with median differences in observed and expected patient survival of −1.6% and −2.2%, respectively. Conclusions: A significant proportion of kidney transplant centers are identified as low performing despite relatively small differences between observed and expected survival. Bayesian criteria have significantly higher flagging rates, and the new-CMS criteria modestly reduce flagging.
Critical appraisal of performance criteria is needed to assess whether quality oversight is meeting intended goals and whether further modifications could reduce risk aversion, more efficiently allocate resources, and increase transplant opportunities.

    Date posted: Thursday, 01 January 1970
    Aycart, Mario A.; Alhefzi, Muayyad; Sharma, Gaurav; Krezdorn, Nicco; Bueno, Ericka M.; Talbot, Simon G.; Carty, Matthew J.; Tullius, Stefan G.; Pomahac, Bohdan
    Outcomes of Solid Organ Transplants After Simultaneous Solid Organ and Vascularized Composite Allograft Procurements: A Nationwide Analysis
    Background: Current knowledge of the impact of facial vascularized composite allograft (VCA) procurement on the transplantation outcomes of the concomitantly recovered solid organs is limited to isolated case reports and short-term results. Here we report a nationwide analysis of facial allograft donor surgery and the long-term outcomes of the concomitantly recovered solid organs and their recipients. Methods: There were 10 facial VCA procurements in organ donors between December 2008 and October 2014. We identified the recipients of solid organs from these 10 donors using the Scientific Registry of Transplant Recipients and retrospectively reviewed operative characteristics, intraoperative parameters, and postoperative outcomes. Results: Six of 10 donor surgeries were performed at outside institutions, all on brain-dead donors. Mean operative duration for facial VCA recovery was 6.9 hours (range, 4-13.25 hours). A total of 36 solid organs were recovered and transplanted into 35 recipients. Survival rates for kidney and liver recipients were 100% and 90% at a median follow-up of 33 and 27.5 months, respectively (range, 6-72 months). Graft survival rates for kidneys and livers were 15 of 16 (94%) and 9 of 10 (90%), respectively. Recipient and graft survival rates for hearts and lungs were 75% (n = 4) and 100% (n = 3) at a mean follow-up of 14.75 and 16 months, respectively. A liver recipient died at 22 months of unknown causes, and a heart recipient died of leukemia at 10 months. Conclusions: Facial VCA procurement does not appear to adversely affect the outcomes of transplant recipients of concomitantly recovered solid organ allografts.

    Date posted: Thursday, 01 January 1970
    Matthaei, Mario; Sandhaeger, Heike; Hermel, Martin; Adler, Werner; Jun, Albert S.; Cursiefen, Claus; Heindl, Ludwig M.
    Changing Indications in Penetrating Keratoplasty: A Systematic Review of 34 Years of Global Reporting
    Background: Penetrating keratoplasty (PK) ranks among the oldest and most common forms of human tissue transplantation. Based on the hypothesis that reported indications for PK vary significantly between global regions and over time, the present systematic review aimed to provide a thorough overview of global PK indications as reported in peer-reviewed manuscripts. Methods: A literature search of PubMed and MEDLINE was conducted to retrieve articles published from January 1980 to May 2014. Indications for PK within 7 global regions were compared using a modified classification system for PK indications and analyzed via multivariate regression. Results: A total of 141 publications from 37 countries were included, recording 180 865 PK cases. Postcataract surgery edema was the predominant indication in North America (28.0%) and ranked second in Europe (20.6%), Australia (21.1%), the Middle East (13.6%), Asia (15.5%), and South America (18.6%). Keratoconus was the leading indication in Europe (24.2%), Australia (33.2%), the Middle East (32.8%), Africa (32.4%), and South America (22.8%); it ranked third in North America (14.2%). Keratitis was the primary indication in Asia (32.3%). Fuchs endothelial corneal dystrophy was the fourth most common indication in North America (12.9%) and Europe (10.2%) and fifth in South America (3.8%). Multivariate analysis supported these results and revealed individual regional changes over time. Conclusions: Systematic analysis reveals characteristic chronological and regional differences in reported global PK indications. Leading reported indications for PK between 1980 and 2014 were keratoconus (Europe, Australia, the Middle East, Africa, and South America), pseudophakic bullous keratopathy/aphakic bullous keratopathy (North America), and keratitis (Asia).

    Date posted: Thursday, 01 January 1970
    Crespo, Elena; Roedder, Silke; Sigdel, Tara; Hsieh, Szu-Chuan; Luque, Sergio; Cruzado, Josep Maria; Tran, Tim Q.; Grinyó, Josep Maria; Sarwal, Minnie M.; Bestard, Oriol
    Molecular and Functional Noninvasive Immune Monitoring in the ESCAPE Study for Prediction of Subclinical Renal Allograft Rejection
    Background: Subclinical acute rejection (sc-AR) is a main cause of functional decline and kidney graft loss and can only be detected through surveillance biopsies. Methods: The predictive capacity of 2 novel noninvasive blood biomarkers, the transcriptional kidney Solid Organ Response Test (kSORT) and the IFN-γ enzyme-linked immunosorbent spot (ELISPOT) assay, was assessed in the Evaluation of Sub-Clinical Acute rejection PrEdiction (ESCAPE) Study in 75 consecutive kidney transplant recipients who received 6-month protocol biopsies. Both assays were run individually and in combination to optimize their use in predicting sc-AR risk. Results: Subclinical acute rejection was observed in 22 (29.3%) patients (17 T cell–mediated subclinical rejection [sc-TCMR], 5 antibody-mediated subclinical rejection [sc-ABMR]), whereas 53 (70.7%) showed a noninjured, preserved (stable [STA]) parenchyma. High-risk (HR), low-risk, and indeterminate-risk kSORT scores were observed in 15 (20%), 50 (66.7%), and 10 (13.3%) patients, respectively. The ELISPOT assay was positive in 31 (41.3%) and negative in 44 (58.7%) patients. The kSORT assay showed high accuracy in predicting sc-AR (specificity, 98%; positive predictive value, 93%; all sc-ABMR and 58% of sc-TCMR cases showed HR-kSORT), whereas the ELISPOT accurately ruled out sc-TCMR (specificity, 70%; negative predictive value, 92.5%) but, unlike kSORT, could not predict sc-ABMR. The predictive probabilities for sc-AR, sc-TCMR, and sc-ABMR were significantly higher when combining both biomarkers (area under the curve > 0.85; P < 0.001), which independently predicted the risk of 6-month sc-AR in a multivariate regression analysis. Conclusions: Combining a molecular and an immune cell functional assay may help identify patients at high risk of sc-AR and distinguish between the different alloimmune effector mechanisms driving it.
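For reference, the specificity and predictive values quoted above follow from a standard 2×2 confusion matrix; a sketch with hypothetical counts chosen only to be consistent with the cohort size (22 sc-AR, 53 STA), not the study's raw data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # positive predictive value
        "npv": tn / (tn + fn),  # negative predictive value
    }

# hypothetical counts: 14 test-positive of 75 patients, 13 true positives
m = diagnostic_metrics(tp=13, fp=1, fn=9, tn=52)
print(round(m["specificity"], 2), round(m["ppv"], 2))  # -> 0.98 0.93
```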

    Date posted: Thursday, 01 January 1970
    García-Carro, Clara; Dörje, Christina; Åsberg, Anders; Midtvedt, Karsten; Scott, Helge; Reinholt, Finn P.; Holdaas, Hallvard; Seron, Daniel; Reisæter, Anna V.
    Inflammation in Early Kidney Allograft Surveillance Biopsies With and Without Associated Tubulointerstitial Chronic Damage as a Predictor of Fibrosis Progression and Development of De Novo Donor Specific Antibodies
    Background: Interstitial fibrosis and tubular atrophy (IFTA) associated with interstitial inflammation in nonscarred areas (IFTA+i) carries a poorer graft outcome than inflammation without IFTA or IFTA without inflammation. Methods: We evaluated whether histological categories at week 6 could predict the development of interstitial fibrosis and de novo donor-specific anti-HLA antibodies (dnDSA) at 1 year. Biopsies were classified according to Banff criteria as normal (i+t≤1 and ci+ct≤1), inflammation (i+t≥2 and ci+ct≤1), IFTA (i+t≤1 and ci+ct≥2), or IFTA+i (i+t≥2 and ci+ct≥2). Results: We analyzed 598 recipients at standard immunological risk. The histological diagnoses at 6 weeks were: normal (n = 206), inflammation (n = 29), IFTA (n = 255), and IFTA+i (n = 108). Moderate/severe interstitial fibrosis (ci≥2) at 1 year was observed in 4.2% of patients with prior (6-week) normal histology, in 3.4% with inflammation, in 13.8% with IFTA, and in 24.5% with IFTA+i (P = 0.0001). Fifty-three recipients (8.9%) had dnDSA at 1 year. Independent predictors of dnDSA development at 1 year were HLA-DR mismatches (odds ratio [OR], 1.95; 95% confidence interval [95% CI], 1.09-3.49) and the presence of inflammation (OR, 5.49; 95% CI, 1.67-18.03) or IFTA+i (OR, 4.09; 95% CI, 1.67-10.0) in the 6-week surveillance biopsy. Conclusions: Early subclinical inflammation in surveillance biopsies, with or without tubulointerstitial chronic lesions, is associated with an increased risk of dnDSA development.
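The Banff cut-offs quoted in the Methods translate directly into a four-way decision rule; a minimal sketch (function name hypothetical, categories as in the abstract):

```python
def classify_biopsy(i, t, ci, ct):
    """Classify a surveillance biopsy from Banff scores using the
    abstract's cut-offs: inflammation if i+t >= 2, chronic
    tubulointerstitial damage (IFTA) if ci+ct >= 2."""
    inflamed = (i + t) >= 2
    scarred = (ci + ct) >= 2
    if inflamed and scarred:
        return "IFTA+i"
    if scarred:
        return "IFTA"
    if inflamed:
        return "inflammation"
    return "normal"

print(classify_biopsy(i=1, t=1, ci=1, ct=2))  # -> IFTA+i
print(classify_biopsy(i=0, t=1, ci=0, ct=1))  # -> normal
```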

    Date posted: Thursday, 01 January 1970
    Masutani, Kosuke; Tsuchimoto, Akihiro; Kurihara, Kei; Okabe, Yasuhiro; Kitada, Hidehisa; Okumi, Masayoshi; Tanabe, Kazunari; Nakamura, Masafumi; Kitazono, Takanari; Tsuruya, Kazuhiko; The Japan Academic Consortium of Kidney Transplantation (JACK) investigators
    Histological Analysis in ABO-Compatible and ABO-Incompatible Kidney Transplantation by Performance of 3- and 12-Month Protocol Biopsies
    Background: ABO-incompatible (ABO-I) kidney transplantation (KTx) is an established procedure to expand living donor sources. Although graft and patient survival rates are comparable between ABO-compatible (ABO-C) and ABO-I KTx, several studies have suggested that ABO-I KTx is associated with infection. Additionally, the histological findings and the incidence of antibody-mediated rejection under desensitization with rituximab and plasmapheresis remain unclear. Methods: We reviewed 327 patients who underwent living-donor KTx without preformed donor-specific antibodies (ABO-C, n = 226; ABO-I, n = 101). Patients who underwent ABO-I KTx received 200 mg/body of rituximab and plasmapheresis, and protocol biopsy (PB) was planned at 3 and 12 months. We compared the PB findings, the cumulative incidence of acute rejection in both PBs and indication biopsies, infection, and patient and graft survival. Results: The 3- and 12-month PBs were performed in 85.0% and 79.2% of the patients, respectively. Subclinical acute rejection occurred in 6.9% and 9.9% of patients in the ABO-C and ABO-I groups at 3 months (P = 0.4) and in 12.4% and 10.1% at 12 months, respectively (P = 0.5). The cumulative incidence of acute rejection determined by both PBs and indication biopsies was 20.5% and 19.6%, respectively (P = 0.8). The degrees of microvascular inflammation and interstitial fibrosis/tubular atrophy were comparable. Polyomavirus BK nephropathy was found in 2.7% and 3.0% of patients in the ABO-C and ABO-I groups, respectively (P = 1.0). The incidence of other infections and the graft/patient survival rates did not differ. Conclusions: Analyses using 3- and 12-month PBs suggested comparable allograft pathology between ABO-C and ABO-I KTx under desensitization with low-dose rituximab and plasmapheresis.

    Date posted: Thursday, 01 January 1970
    Ishihara, Hiroki; Ishida, Hideki; Unagami, Kohei; Hirai, Toshihito; Okumi, Masayoshi; Omoto, Kazuya; Shimizu, Tomokazu; Tanabe, Kazunari
    Evaluation of Microvascular Inflammation in ABO-Incompatible Kidney Transplantation
    Background: In ABO-incompatible kidney transplantation, the diagnostic criteria for antibody-mediated rejection remain controversial because C4d deposition is commonly observed. We therefore investigated microvascular inflammation (MVI score ≥ 2) within 1 year as a predictor of graft outcome. Methods: A total of 148 recipients without preformed or de novo donor-specific anti-HLA antibodies were stratified by MVI score less than 2 (n = 117) or MVI score of 2 or greater (n = 31). Results: Five-year graft survival was significantly lower (P = 0.0129) in patients with MVI (89.8%) than in patients without MVI (97.0%). Graft function, characterized by serum estimated glomerular filtration rate, was also significantly worse in patients with MVI than in patients without MVI between 3 months and 10 years after transplantation (P = 0.048). Multivariate analysis indicated that HLA class II mismatch (P = 0.0085) was an independent marker of MVI. Conclusions: A microvascular inflammation score of 2 or greater is significantly associated with poor graft outcome after ABO-incompatible kidney transplantation. We suggest that an MVI score of 2 or greater in ABO-incompatible transplantation be used as a basis for diagnosing antibody-mediated rejection.

    Date posted: Thursday, 01 January 1970
    Chow, Kevin V.; Flint, Shaun M.; Shen, Angeline; Landgren, Anthony; Finlay, Moira; Murugasu, Anand; Masterson, Rosemary; Hughes, Peter; Cohney, Solomon J.
    Histological and Extended Clinical Outcomes After ABO-Incompatible Renal Transplantation Without Splenectomy or Rituximab
    Background: Excellent short-term results have been reported in ABO-incompatible (ABOi) renal transplant recipients managed solely with antibody removal and conventional immunosuppression. However, long-term clinical outcomes with this regimen and predictive information from protocol biopsies are lacking. Methods: We compared outcome data in ABOi and ABO-compatible (ABOc) recipients receiving this regimen approximately 4 years posttransplant, and histology from biopsies approximately 12 months posttransplant. Results: Patient and graft survival among 54 ABOi recipients were 98.1% and 90.7%, respectively, at 4 years. Graft function was similar between ABOi (creatinine, 140.3 μmol/L) and ABOc recipients (creatinine, 140.2 μmol/L) (P = 0.99), with no significant change over the study period in either group (Δcreatinine, −0.83 vs 6.6 μmol/L) (P = 0.59). There was no transplant glomerulopathy in biopsies from either group. Interstitial fibrosis (IF) and tubular atrophy (TA) were present in 7 (28%) of 25 ABOi compared with 7 (20.6%) of 34 ABOc biopsies (P = 0.52). Progression of IF/TA from implantation was noted in 6 (24%) of 25 ABOi and 6 (17.6%) of 34 ABOc biopsies, respectively. C4d staining without antibody-mediated rejection was present by immunohistochemistry in 13 (52%) of 25 early posttransplant biopsies from ABOi recipients, but in only 4 (16%) of 25 at 12 months. Conclusions: ABO-incompatible renal transplantation performed with antibody removal and conventional immunosuppression continues to provide excellent patient and graft survival and stable renal function over 4 years. Coupled with absent transplant glomerulopathy and low rates of progressive IF/TA on earlier biopsies, this suggests that ABOi transplantation with conventional immunosuppression and antibody removal, without rituximab or splenectomy, can achieve long-term outcomes comparable to ABO-compatible transplantation.

    Date posted: Thursday, 01 January 1970
    Alves, Fábio de Abreu; Gale, Gita; Vivas, Ana Paula Molina; Porta, Gilda; Costa, Felipe D'Almeida; Warfwinge, Gunnar; Jontell, Mats; Saalman, Robert
    Immunohistopathology of the Newly Discovered Giant Papillae Tongue Disorder in Organ-Transplanted Children
    Background: Giant papillae tongue disorder (GPTD) is a newly discovered, long-lasting clinical disorder that may develop in pediatric organ transplant recipients. The key feature of this disorder is a unique tongue lesion comprising swollen fungiform papillae. The aim of this study was to characterize the immunohistopathology of this novel inflammatory condition. Methods: Six organ-transplanted children with GPTD were included in the study. Routine histopathology and immunohistochemical staining for CD3, CD4, CD8, CD25, FOXP3, CD20, CD138, CD68, CD1a, CD15, CD23, and mast cell tryptase were performed. Results: Immunohistochemical analyses of the oral lesions revealed a subepithelial infiltrate primarily composed of CD3- and CD4-positive T cells, CD20-expressing B cells, macrophages, and CD138-positive plasma cells. The CD20-positive cells did not display typical B-cell morphology, having in general a more dendritic cell-like appearance. The CD138-expressing plasma cells were distinctly localized as a dense infiltrate beneath the accumulation of T cells and B cells. Increased numbers of CD1a-expressing Langerhans cells were detected in both the epithelium and the connective tissue. Because no granulomas were observed and only single lesional eosinophils were detected, GPTD does not resemble a granulomatous or eosinophilic condition. Conclusions: We describe for the first time the immunopathological characteristics of a novel inflammatory disorder of the oral cavity that may develop after solid organ transplantation in children.

    Date posted: Thursday, 01 January 1970
    Natori, Yoichiro; Humar, Atul; Husain, Shahid; Rotstein, Coleman; Renner, Eberhard; Singer, Lianne; Kim, S. Joseph; Kumar, Deepali
    Recurrence of CMV Infection and the Effect of Prolonged Antivirals in Organ Transplant Recipients
    Background: Although initial therapy for cytomegalovirus (CMV) is usually successful, a significant subset of patients have recurrent viremia. The epidemiology and risk factors for recurrence have not been fully defined, nor has the utility of prolonged antivirals after initial clearance. Methods: Solid organ transplant patients with a first episode of CMV disease or asymptomatic viremia (≥1000 IU/mL) requiring treatment were identified by chart review, and clinical and virologic data were collected. The primary outcome was recurrence of CMV viremia or disease within 6 months of treatment discontinuation. Results: The first episode of CMV viremia requiring antiviral therapy was assessed in 282 patients (147 CMV disease and 135 asymptomatic viremia). Cytomegalovirus occurred at 5.6 (range, 0.63-27.7) months posttransplant. Recurrent CMV occurred in 30.5% of patients at a median of 51 (range, 0-160) days after discontinuation of therapy. Factors predictive of recurrence were treatment-phase viral kinetics (P = 0.005), lung transplant (P = 0.002), CMV donor (D)+/recipient (R)− serostatus (P = 0.04), and recent acute rejection (P = 0.02). Prolonged antiviral therapy was given to 226 (80.1%) of 282 patients. Recurrence occurred in 73 (32.3%) of 226 patients who received prolonged antivirals versus 13 (23.2%) of 56 who did not (P = 0.19). Conclusions: Recurrent CMV occurs in a significant percentage of patients after treatment of the first episode of CMV viremia/disease. CMV D+/R− serostatus, lung transplant, and treatment-phase viral kinetics were significant predictors of recurrence. Continuation of antivirals beyond initial clearance was not associated with a reduced risk of recurrence.

    Date posted: Thursday, 01 January 1970
    Bonani, Marco; Pereira, Rahja M.; Misselwitz, Benjamin; Fehr, Thomas; Wüthrich, Rudolf P.; Franzen, Daniel
    Chronic Norovirus Infection as a Risk Factor for Secondary Lactose Maldigestion in Renal Transplant Recipients: A Prospective Parallel Cohort Pilot Study
    Background: Chronic norovirus infection is an emerging challenge in the immunocompromised host, in whom it may be asymptomatic or present as chronic diarrhea. The mechanisms of diarrhea in chronic norovirus infection are not well understood, but by analogy with Giardia lamblia and rotavirus infections, secondary lactose maldigestion (LM) might be implicated. Methods: Adult renal transplant recipients with symptomatic chronic norovirus infection and diarrhea were asked to participate in this prospective parallel cohort study. Renal transplant recipients with otherwise unexplained chronic diarrhea but no infection served as the control group. In both groups, a lactose hydrogen breath test and a lactose tolerance test were performed after exclusion of primary LM by a negative lactase gene test. Results: Of approximately 800 patients in the cohort of renal transplant recipients at our institution, 15 subjects were included in the present study. Of these, 7 had chronic symptomatic norovirus infection with diarrhea (noro group) and 8 had diarrhea in the absence of norovirus (control group). The lactose hydrogen breath test and lactose tolerance test were positive in all 7 patients (100%) in the noro group, whereas only 1 (12.5%) of 8 patients in the control group had a positive test. Thus, secondary LM was highly prevalent in the noro group compared with the control group, with an odds ratio of 75.0 (95% confidence interval, 2.6-2153; P = 0.01). Conclusions: This is the first report showing a positive association between chronic norovirus infection and secondary LM. Further studies with larger patient numbers and longer follow-up are needed to test a causative relationship between the two entities.
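The reported odds ratio can be reproduced from the 2×2 table in the Results (7/7 test-positive in the noro group vs 1/8 in controls). Because one cell is zero, a Haldane-Anscombe correction of 0.5 per cell is needed, which yields exactly OR = 75.0; a sketch (the correction and the Woolf logit confidence interval are standard methods, assumed here rather than stated in the abstract):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table [[a, b], [c, d]],
    with a 0.5 Haldane-Anscombe correction when any cell is zero."""
    if 0 in (a, b, c, d):
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# noro group: 7 of 7 LM-positive; controls: 1 of 8 positive
or_, lo, hi = odds_ratio_ci(7, 0, 1, 7)
print(f"OR = {or_:.1f} (95% CI {lo:.1f}-{hi:.0f})")
```

This reproduces the reported OR of 75.0 with a lower bound of 2.6; the upper bound lands near the reported 2153, small rounding differences aside.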

    Date posted: Thursday, 01 January 1970
    Bialasiewicz, Seweryn; Hart, Gareth; Oliver, Kimberly; Agnihotri, Shruti P.; Koralnik, Igor J.; Viscidi, Raphael; Nissen, Michael D.; Sloots, Theo P.; Burke, Michael T.; Isbel, Nicole M.; Burke, John
    A Difficult Decision: Atypical JC Polyomavirus Encephalopathy in a Kidney Transplant Recipient
    Background: A number of cerebral manifestations are associated with JC polyomavirus (JCPyV); these are diagnosed by detection of JCPyV in cerebrospinal fluid (CSF), often with the support of cerebral imaging. Here we present an unusual case of a kidney transplant patient with progressive neurological deterioration attributed to JCPyV encephalopathy. Methods: Quantitative polymerase chain reaction for JCPyV was used prospectively and retrospectively to track the viral load in the patient's blood, urine, CSF, and kidney sections. A JCPyV VP1 enzyme-linked immunosorbent assay was used to measure patient and donor antibody titers. Immunohistochemical staining was used to identify active JCPyV infection within the kidney allograft. Results: JC polyomavirus was detected in the CSF at the time of presentation. JC polyomavirus was not detected in pretransplant serum; however, viral loads increased with time, peaking during the height of the neurological symptoms (1.5 × 10^9 copies/mL). No parenchymal brain lesions were evident on imaging, but transient cerebral venous sinus thrombosis was present. Progressive decline in neurological function necessitated cessation of immunosuppression and allograft removal, which led to decreasing serum viral loads and resolution of the neurological symptoms. JC polyomavirus was detected within the graft's collecting duct cells by quantitative polymerase chain reaction and immunohistochemical staining. The patient was JCPyV naive pretransplant but showed high antibody titers during the neurological symptoms, with the IgM decrease paralleling the viral load after graft removal. Conclusions: We report a case of atypical JCPyV encephalopathy associated with cerebral venous sinus thrombosis and disseminated primary JCPyV infection originating from the kidney allograft. Clinical improvement followed removal of the allograft and cessation of immunosuppression.

    Date posted: Thursday, 01 January 1970
    Simkins, Jacques; Abbo, Lilian Margarita; Camargo, Jose Fernando; Rosa, Rossana; Morris, Michele Ileana
    Twelve-Week Rifapentine Plus Isoniazid Versus 9-Month Isoniazid for the Treatment of Latent Tuberculosis in Renal Transplant Candidates
    Background: Renal transplant candidates (RTC) with latent tuberculosis infection (LTBI) are at significant risk for tuberculosis reactivation. Twelve-week rifapentine (RPT)/isoniazid (INH) is effective for LTBI, but clinical experience in RTC is scarce. Methods: We conducted a retrospective study of RTC with LTBI treated with either 12-week RPT/INH or 9-month INH from March 1, 2012, through February 28, 2014. We evaluated both groups for differences in rates of treatment completion, monthly follow-up visit compliance, transaminase elevations, and adverse reactions leading to discontinuation of LTBI treatment. The utility of weekly reminders was also evaluated for the 12-week regimen. Directly observed therapy was not performed in our study. Results: Of 153 patients, 43 (28%) and 110 (72%) were started on 12-week RPT/INH and 9-month INH, respectively. Treatment completion (40 [93%] vs 52 [47%]; P < 0.001) and monthly follow-up visit compliance (11/40 [28%] vs 13/104 [13%]; P = 0.03) were higher in the 12-week RPT/INH group. Transaminase elevations were not observed in the RPT/INH group but occurred in 6 (5%) patients in the INH group. There were no differences in adverse reactions leading to discontinuation of LTBI treatment. Conclusions: Twelve-week RPT/INH appears to be an excellent choice for LTBI in RTC: it has a higher treatment completion rate and causes fewer transaminase elevations, and weekly reminders may be an alternative when directly observed therapy is not feasible.

    Date posted: Thursday, 01 January 1970
    Bamoulid, Jamal; Courivaud, Cécile; Coaquette, Alain; Crépin, Thomas; Carron, Clémence; Gaiffe, Emilie; Roubiou, Caroline; Rebibou, Jean-Michel; Ducloux, Didier
    Late Persistent Positive EBV Viral Load and Risk of Solid Cancer in Kidney Transplant Patients
    Background: Recent studies have reported that posttransplant Epstein-Barr virus (EBV) replication is frequent and indicates overimmunosuppression. We hypothesized that long-term EBV replication may identify overimmunosuppressed patients at higher risk of cancer. Methods: We analyzed a prospective cohort of renal transplant recipients undergoing routine EBV PCR surveillance. All cancers (except EBV-related neoplasia) were recorded. Results: Mean follow-up was 94 ± 23 months. A total of 8412 samples were available from 669 patients. Three hundred eighty-eight of the 669 patients (58%) had at least 1 positive viremia during follow-up. Epstein-Barr virus D+/R− patients (P = 0.046), as well as those having received antithymocyte globulin (P < 0.001), were more likely to develop persistent EBV viremia. Eighty-six patients (12.9%) developed a cancer during follow-up. The cumulative incidence of cancer was higher in patients with persistent high EBV replication (22.4% vs 10.2%, P = 0.005). The effect of persistent EBV infection remained significant even after adjustment for all confounding factors (hazard ratio, 1.69; 95% confidence interval, 1.10-2.61; P = 0.018). Age, history of antithymocyte globulin use, smoking, and history of cancer were also associated with cancer occurrence. Conclusions: Persistent high EBV viral load is associated with the occurrence of solid cancer. In this setting, more intensive screening and/or minimization of immunosuppressive treatment are probably required.

    Date posted: Thursday, 01 January 1970
    Schaenman, Joanna M.; Korin, Yael; Sidwell, Tiffany; Kandarian, Fadi; Harre, Nicholas; Gjertson, David; Lum, Erik L.; Reddy, Uttam; Huang, Edmund; Pham, Phuong T.; Bunnapradist, Suphamai; Danovitch, Gabriel M.; Veale, Jefferey; Gritsch, H. Albin; Reed, Elaine F.
    Increased Frequency of BK Virus-Specific Polyfunctional CD8+ T Cells Predict Successful Control of BK Viremia After Kidney Transplantation
    Background: BK virus infection remains an important cause of loss of allograft function after kidney transplantation. We sought to determine whether polyfunctional T cells secreting multiple cytokines simultaneously, which have been shown to be associated with viral control, could be detected early after the start of BK viremia, which would provide insight into the mechanism of successful antiviral control. Methods: Peripheral blood mononuclear cells collected during episodes of BK viral replication were evaluated by multiparameter flow cytometry after stimulation with overlapping peptide pools of BK virus antigen to determine the frequency of CD8+ and CD4+ T cells expressing 1 or more cytokines simultaneously, as well as markers of T-cell activation, exhaustion, and maturation. Results: BK virus controllers, defined as those with episodes of BK viremia of 3 months or less, had an 11-fold higher frequency of polyfunctional CD8+ T cells expressing multiple cytokines compared with patients with prolonged episodes of BK viremia. Patients with only low-level BK viremia expressed low frequencies of polyfunctional T cells. Polyfunctional T cells were predominantly of the effector memory maturation subtype and expressed the cytotoxicity marker CD107a. Conclusions: Noninvasive immune assessment of peripheral blood can provide insight into the mechanism of control of BK virus replication and may allow for future patient risk stratification and customization of immune suppression at the onset of BK viremia.

    Date posted: Thursday, 01 January 1970
    Liu, Sandy; Chaudhry, Muhammad R.; Berrebi, Alexander A.; Papadimitriou, John C.; Drachenberg, Cinthia B.; Haririan, Abdolreza; Alexiev, Borislav A.
    Polyomavirus Replication and Smoking Are Independent Risk Factors for Bladder Cancer After Renal Transplantation
    Background: Solid organ transplant recipients are at increased risk for developing malignancies. Polyomaviruses (PV) have been historically associated with experimental tumor development and recently described in association with renourinary malignancies in transplant patients. The aim of this study was to investigate the relationship between PV replication and smoking, and the development of malignant neoplasms in kidney transplant recipients. Methods: A retrospective case-control study was conducted for PV replication in all kidney biopsies and urine cytologies performed between 1998 and 2014 from kidney transplant recipients at the University of Maryland Medical Center. Polyomavirus-positive patients (n = 943) were defined as having any of the following: a kidney biopsy with PV-associated nephropathy, any urine cytology demonstrating “decoy” cells, and/or significant polyomavirus BK viremia. Polyomavirus-negative matched patients (n = 943) were defined as lacking any evidence of PV replication. The incidence of malignancy (excluding nonmelanoma skin tumors) was determined in these 1886 patients and correlated with demographic data and history of smoking. Results: There was a 7.9% incidence of malignant tumors after a mean posttransplant follow-up of 7.9 ± 5.4 years. Among all cancer subtypes, only bladder carcinoma was significantly associated with PV replication. By multivariate analysis, only PV replication and smoking independently increased the risk of bladder cancer (relative risk, 11.7 [P = 0.0013] and 5.6 [P = 0.0053], respectively). Conclusions: The findings in the current study indicate that kidney transplant recipients with PV replication and a history of smoking are at particular risk of developing bladder carcinomas, and they support the need for long-term cancer surveillance in these patients.

    Date posted: Thursday, 01 January 1970
    Abend, Johanna R.; Changala, Marguerite; Sathe, Atul; Casey, Fergal; Kistler, Amy; Chandran, Sindhu; Howard, Abigail; Wojciechowski, David
    Correlation of BK Virus Neutralizing Serostatus With the Incidence of BK Viremia in Kidney Transplant Recipients
    Background: BK virus (BKV)-associated nephropathy is the second leading cause of graft loss in kidney transplant recipients. Due to the high prevalence of persistent infection with BKV in the general population, either the transplant recipient or the donor may act as the source of virus resulting in viruria and viremia. Although several studies suggest a correlation between donor-recipient serostatus and the development of BK viremia, specific risk factors for BKV-related complications in the transplant setting remain to be established. Methods: We retrospectively determined the pretransplant BKV neutralizing serostatus of 116 donor (D)-recipient (R) pairs using infectious BKV neutralization assays with representatives from the 4 major viral serotypes. The neutralizing serostatus of donors and recipients was then correlated with the incidence of BK viremia during the first year posttransplantation. Results: There were no significant differences in baseline demographics or transplant data among the 4 neutralizing serostatus groups, with the exception of calculated panel-reactive antibody, which was lowest in the D+/R− group. Recipients of kidneys from donors with significant serum neutralizing activity (D+) had elevated risk for BK viremia, regardless of recipient serostatus (D+ versus D−: odds ratio, 5.0; 95% confidence interval, 1.9-12.7; P = 0.0008). Furthermore, donor-recipient pairs with D+/R− neutralizing serostatus had the greatest risk for BK viremia (odds ratio, 4.9; 95% confidence interval, 1.7-14.6; P = 0.004). Conclusions: Donor neutralizing serostatus correlates significantly with the incidence of posttransplant BK viremia. Determination of donor-recipient neutralizing serostatus may be useful in assessing the risk of BKV infection in kidney transplant recipients.