The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators and trainees. The data itself is retrieved directly from NCBI's PubMed and is automatically updated on a weekly basis to ensure accuracy and completeness.
Dysregulated iron transport and a compromised blood-brain barrier are implicated in HIV-associated neurocognitive disorders (HAND). We quantified the levels of proteins involved in iron transport and/or angiogenesis (ceruloplasmin, haptoglobin, and vascular endothelial growth factor [VEGF]), as well as biomarkers of neuroinflammation, in cerebrospinal fluid (CSF) from 405 individuals with HIV infection and comprehensive neuropsychiatric assessments. Associations with HAND [defined by a Global Deficit Score (GDS) ≥ 0.5, GDS as a continuous measure (cGDS), or by Frascati criteria] were evaluated for the highest versus lowest tertile of each biomarker, adjusting for potential confounders. Higher CSF VEGF was associated with GDS-defined impairment [odds ratio (OR) 2.17, p = 0.006] and cGDS in unadjusted analyses and remained associated with GDS impairment after adjustment (p = 0.018). GDS impairment was also associated with higher CSF ceruloplasmin (p = 0.047) and with higher ceruloplasmin and haptoglobin in persons with minimal comorbidities (ORs 2.37 and 2.13, respectively; both p = 0.043). In persons with minimal comorbidities, higher ceruloplasmin and haptoglobin were associated with HAND by Frascati criteria (both p < 0.05), and higher ceruloplasmin predicted worse impairment (higher cGDS values, p < 0.01). In the subgroup with undetectable viral load and minimal comorbidity, CSF ceruloplasmin and haptoglobin were strongly associated with GDS impairment (ORs 5.57 and 2.96, respectively; both p < 0.01) and HAND (both p < 0.01). Concurrently measured CSF IL-6 and TNF-α were only weakly correlated with these three biomarkers. Higher CSF ceruloplasmin, haptoglobin, and VEGF are associated with a significantly greater likelihood of HAND, suggesting that interventions aimed at disordered iron transport and angiogenesis may be beneficial in this disorder.
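The tertile-based odds ratios reported above compare the highest and lowest biomarker tertiles; a minimal sketch of an unadjusted odds ratio with a Wald 95% confidence interval, using hypothetical 2x2 counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = impaired, highest tertile;  b = unimpaired, highest tertile;
    c = impaired, lowest tertile;   d = unimpaired, lowest tertile."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts, for illustration only
or_, (lo, hi) = odds_ratio_ci(40, 60, 25, 80)
```

The published ORs were additionally adjusted for confounders (e.g., via logistic regression), which this unadjusted sketch does not attempt.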
RATIONALE - Intensive care unit (ICU) delirium is highly prevalent and a potentially avoidable hospital complication. The current cost of ICU delirium is unknown.
OBJECTIVES - To estimate the association between the daily occurrence of delirium in the ICU and the costs of ICU care, accounting for time-varying illness severity and death.
RESEARCH DESIGN - We performed a prospective cohort study within medical and surgical ICUs in a large academic medical center.
SUBJECTS - We analyzed critically ill patients (N=479) with respiratory failure and/or shock.
MEASURES - Covariates included baseline factors (age, insurance, cognitive impairment, comorbidities, and Acute Physiology and Chronic Health Evaluation II score) and time-varying factors (Sequential Organ Failure Assessment score, mechanical ventilation, and severe sepsis). The primary analysis used a novel 3-stage regression method: first, the cumulative cost of delirium over 30 ICU days was estimated; costs were then separated into those attributable to increased resource utilization among survivors and those avoided on account of delirium's association with early mortality in the ICU.
RESULTS - The patient-level 30-day cumulative cost of ICU delirium attributable to increased resource utilization was $17,838 (95% confidence interval, $11,132-$23,497). A combination of professional, dialysis, and bed costs accounted for the largest share of the incremental costs associated with ICU delirium. The 30-day cumulative incremental costs of ICU delirium that were avoided due to delirium-associated early mortality were $4,654 (95% confidence interval, $2,056-$7,869).
CONCLUSIONS - Delirium is associated with substantial costs after accounting for time-varying illness severity; these costs would be roughly 20% higher (∼$22,500) if not for delirium's association with early ICU mortality.
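The decomposition in the conclusions can be checked arithmetically from the two reported point estimates: the survivor-attributable cost plus the mortality-avoided cost gives the counterfactual total of about $22,500.

```python
# Point estimates reported above (US dollars)
attributable = 17838  # 30-day cost from increased resource utilization
avoided = 4654        # cost avoided via delirium-associated early mortality

# Counterfactual total had there been no early-mortality offset (~$22,500)
counterfactual_total = attributable + avoided
```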
RATIONALE & OBJECTIVE - Inflammation, cardiac remodeling, and fibrosis may explain in part the excess risk for cardiovascular disease (CVD) in patients with chronic kidney disease (CKD). Growth differentiation factor 15 (GDF-15), galectin 3 (Gal-3), and soluble ST2 (sST2) are possible biomarkers of these pathways in patients with CKD.
STUDY DESIGN - Observational cohort study.
SETTING & PARTICIPANTS - Individuals with CKD enrolled in either of 2 multicenter CKD cohort studies: the Seattle Kidney Study or C-PROBE (Clinical Phenotyping and Resource Biobank Study).
EXPOSURES - Circulating GDF-15, Gal-3, and sST2 measured at baseline.
OUTCOMES - Primary outcome was all-cause mortality. Secondary outcomes included hospitalization for physician-adjudicated heart failure and the atherosclerotic CVD events of myocardial infarction and cerebrovascular accident.
ANALYTIC APPROACH - Cox proportional hazards models were used to test the association of each biomarker with each outcome, adjusting for demographics, CVD risk factors, and kidney function.
RESULTS - Among 883 participants, mean estimated glomerular filtration rate was 49 ± 19 mL/min/1.73 m². Higher GDF-15 (adjusted HR [aHR] per 1-SD higher, 1.87; 95% CI, 1.53-2.29), Gal-3 (aHR per 1-SD higher, 1.51; 95% CI, 1.36-1.78), and sST2 (aHR per 1-SD higher, 1.36; 95% CI, 1.17-1.58) concentrations were significantly associated with mortality. Only GDF-15 level was also associated with heart failure events (HR per 1-SD higher, 1.56; 95% CI, 1.12-2.16). There were no detectable associations between GDF-15, Gal-3, or sST2 concentrations and atherosclerotic CVD events.
LIMITATIONS - Event rates for heart failure and atherosclerotic CVD were low.
CONCLUSIONS - Adults with CKD and higher circulating GDF-15, Gal-3, and sST2 concentrations experienced greater mortality. Elevated GDF-15 concentration was also associated with an increased rate of heart failure. Further work is needed to elucidate the mechanisms linking these circulating biomarkers with CVD in patients with CKD.
Copyright © 2018 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
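The "per 1-SD higher" hazard ratios above come from rescaling a Cox model's per-unit log-hazard coefficient by the biomarker's standard deviation; a minimal arithmetic sketch, in which the coefficient and SD are hypothetical values, not estimates from the study:

```python
import math

def hr_per_sd(beta_per_unit, sd):
    """Hazard ratio per 1-SD increase in the biomarker:
    exp(beta * SD), where beta is the per-unit log-hazard coefficient."""
    return math.exp(beta_per_unit * sd)

# Hypothetical: beta = 0.0005 per pg/mL, biomarker SD = 1250 pg/mL
hr = hr_per_sd(0.0005, 1250)
```

Reporting per-SD hazard ratios puts biomarkers measured on very different scales on a comparable footing.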
BACKGROUND - Late-life depression (LLD) is associated with a fragile antidepressant response and high recurrence risk. This study examined what measures predict recurrence in remitted LLD.
METHODS - Individuals aged 60 years or older with a Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV) diagnosis of major depressive disorder were enrolled in the Neurocognitive Outcomes of Depression in the Elderly study. Participants received manualized antidepressant treatment and were followed longitudinally for an average of 5 years. Study analyses included participants who remitted. Measures included demographic and clinical measures, medical comorbidity, disability, life stress, social support, and neuropsychological testing. A subset underwent structural magnetic resonance imaging (MRI).
RESULTS - Of 241 remitted elders followed for approximately 4 years, 137 (56.8%) experienced recurrence and 104 (43.2%) maintained remission. In the final model, greater recurrence risk was associated with female sex (hazard ratio [HR] = 1.536; confidence interval [CI] = 1.027-2.297), younger age of onset (HR = 0.990; CI = 0.981-0.999), higher perceived stress (HR = 1.121; CI = 1.022-1.229), disability (HR = 1.060; CI = 1.005-1.119), and less support with activities (HR = 0.885; CI = 0.812-0.963). Recurrence risk was also associated with higher Montgomery-Asberg Depression Rating Scale (MADRS) scores prior to censoring (HR = 1.081; CI = 1.033-1.131) and baseline symptoms of suicidal thoughts by MADRS (HR = 1.175; CI = 1.002-1.377) and sadness by the Center for Epidemiologic Studies-Depression scale (HR = 1.302; CI = 1.080-1.569). Sex, age of onset, and suicidal thoughts were no longer associated with recurrence in a model incorporating report of multiple prior episodes (HR = 2.107; CI = 1.252-3.548). Neither neuropsychological test performance nor MRI measures of aging pathology were associated with recurrence.
CONCLUSIONS - Over half of the depressed elders who remitted experienced recurrence, mostly within 2 years. Multiple clinical and environmental measures predict recurrence risk. Work is needed to develop instruments that stratify risk.
© 2018 Wiley Periodicals, Inc.
OBJECTIVE - The traditional fee-for-service approach to healthcare can lead to the management of a patient's conditions in a siloed manner, inducing various negative consequences. It has been recognized that a bundled approach to healthcare - one that manages a collection of health conditions together - may enable greater efficacy and cost savings. However, it is not always evident which sets of conditions should be managed in a bundled manner. In this study, we investigate if a data-driven approach can automatically learn potential bundles.
METHODS - We designed a framework to infer health condition collections (HCCs) based on the similarity of their clinical workflows, according to electronic medical record (EMR) utilization. We evaluated the framework with data from over 16,500 inpatient stays from Northwestern Memorial Hospital in Chicago, Illinois. The plausibility of the inferred HCCs for bundled care was assessed through an online survey of a panel of five experts, whose responses were analyzed via an analysis of variance (ANOVA) at a 95% confidence level. We further assessed the face validity of the HCCs using evidence in the published literature.
RESULTS - The framework inferred four HCCs, indicative of (1) fetal abnormalities, (2) late pregnancies, (3) prostate problems, and (4) chronic diseases, with congestive heart failure featuring prominently. Each HCC was substantiated with evidence in the literature and was deemed plausible for bundled care by the experts at a statistically significant level.
CONCLUSIONS - The findings suggest that an automated, EMR data-driven framework can provide a basis for discovering bundled care opportunities. Still, translating such findings into actual care management will require further refinement, implementation, and evaluation.
Copyright © 2017 Elsevier Inc. All rights reserved.
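The expert panel's plausibility ratings above were compared with an analysis of variance; a self-contained sketch of the one-way ANOVA F statistic, run here on hypothetical ratings rather than the survey's actual responses:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic:
    between-group mean square / within-group mean square."""
    all_vals = [x for g in groups for x in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical 1-5 plausibility ratings for three candidate bundles
f = one_way_anova_f([[4, 5, 4, 5, 4], [3, 3, 4, 3, 3], [2, 2, 3, 2, 2]])
```

A large F relative to the F distribution's critical value at the chosen confidence level indicates that mean ratings differ across groups.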
RATIONALE - The epidemiology and prognostic impact of increased pulmonary pressure among HIV-infected individuals in the antiretroviral therapy era is not well described.
OBJECTIVES - To examine the prevalence, clinical features, and outcomes of increased echocardiographic pulmonary pressure in HIV-infected and -uninfected individuals.
METHODS - This study evaluated 8,296 veterans referred for echocardiography with reported pulmonary artery systolic pressure (PASP) estimates from the Veterans Aging Cohort Study, an observational cohort of HIV-infected and -uninfected veterans matched by age, sex, race/ethnicity, and clinical site. The primary outcome was adjusted mortality by HIV status.
MEASUREMENTS AND MAIN RESULTS - PASP was reported in 2,831 HIV-infected and 5,465 HIV-uninfected veterans (follow-up [mean ± SD], 3.8 ± 2.6 yr). As compared with uninfected veterans, HIV-infected veterans with HIV viral load greater than 500 copies/ml (odds ratio, 1.27; 95% confidence interval [CI], 1.05-1.54) and those with CD4 cell count less than 200 cells/μl (odds ratio, 1.28; 95% CI, 1.02-1.60) had a higher prevalence of PASP greater than or equal to 40 mm Hg. As compared with uninfected veterans with a PASP less than 40 mm Hg, HIV-infected veterans with a PASP greater than or equal to 40 mm Hg had an increased risk of death (adjusted hazard ratio, 1.78; 95% CI, 1.57-2.01). This risk persisted even among participants without prevalent comorbidities (adjusted hazard ratio, 3.61; 95% CI, 2.17-6.01). The adjusted risk of mortality in HIV-infected veterans was higher at all PASP values than in uninfected veterans, including at values currently considered to be normal.
CONCLUSIONS - HIV-infected people with high HIV viral loads or low CD4 cell counts have a higher prevalence of increased PASP than uninfected people. Mortality risk in HIV-infected veterans increases at lower values of PASP than previously recognized and is present even among those without prevalent comorbidities. These findings may inform clinical decision-making regarding screening and surveillance of pulmonary hypertension in HIV-infected individuals.
AIMS - Psychosocial factors amplify symptoms of interstitial cystitis/bladder pain syndrome (IC/BPS). While psychosocial self-management is efficacious in other pain conditions, its impact on an IC/BPS population has rarely been studied. The objectives of this review are to determine the prevalence and impact of psychosocial factors in IC/BPS, assess baseline psychosocial characteristics, and offer recommendations for assessment and treatment.
METHOD - Following PRISMA guidelines, the primary information sources were PubMed (including MEDLINE), Embase, CINAHL, and Google Scholar. Inclusion criteria were: (i) a clearly defined cohort with IC/BPS, or with chronic pelvic pain syndrome provided the IC/BPS cohort was delineated with quantitative results from the main cohort; (ii) all genders and regions; (iii) studies written in English from 1995 to April 14, 2017; and (iv) quantitative report of psychosocial factors as outcome measures or, at minimum, as baseline characteristics.
RESULTS - Thirty-four of an initial 642 articles were reviewed. Quantitative analyses demonstrate the magnitude of psychosocial difficulties in IC/BPS, which are worse than average on all measures and fall into areas of clinical concern for 7 of 10 measures. Meta-analysis shows a mean Mental Component Summary score of the Short-Form 12 Health Survey (MCS) of 40.80 (SD 6.25, N = 2912), where <36 is consistent with severe psychological impairment. Averaged across studies, the population scored in the range seen in clinical depression (CES-D 19.89, SD 13.12, N = 564) and generalized anxiety disorder (HADS-A 8.15, SD 4.85, N = 465).
CONCLUSION - The psychological impact of IC/BPS is pervasive and severe. Evidence on treatment remains limited but suggests that self-management interventions may be helpful.
© 2017 Wiley Periodicals, Inc.
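Pooled scores like the MCS mean above are typically sample-size-weighted averages of study-level means; a minimal sketch with hypothetical study means and sample sizes (not the review's actual study-level data):

```python
def pooled_mean(means, ns):
    """Sample-size-weighted mean across studies."""
    return sum(m * n for m, n in zip(means, ns)) / sum(ns)

# Hypothetical study-level MCS means and Ns summing to N = 2912
m = pooled_mean([39.0, 42.0, 41.0], [500, 1200, 1212])
```

More formal meta-analyses weight by inverse variance rather than raw N; the simple N-weighting shown here is the coarser of the two approaches.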
Acute kidney injury (AKI) is associated with subsequent chronic kidney disease (CKD), but the mechanism is unclear. To clarify this, we examined the association of AKI with new-onset or worsening proteinuria during the 12 months following hospitalization in a national retrospective cohort of United States veterans hospitalized between 2004 and 2012. Patients with and without AKI were matched using baseline demographics, comorbidities, proteinuria, estimated glomerular filtration rate, blood pressure, angiotensin-converting enzyme inhibitor or angiotensin II receptor blocker (ACEI/ARB) use, and inpatient exposures linked to AKI. The distribution of proteinuria over one year post-discharge in the matched cohort was compared using inverse probability sampling weights. Subgroup analyses were based on diabetes, pre-admission ACEI/ARB use, and AKI severity. Among the 90,614 matched AKI and non-AKI pairs, the median estimated glomerular filtration rate was 62 mL/min/1.73 m². The prevalences of diabetes and hypertension were 48% and 78%, respectively. The odds of having 1+ or greater dipstick proteinuria were significantly higher during each month of follow-up in patients with AKI than in patients without AKI (odds ratio range, 1.20-1.39). Odds were higher in patients with Stage II or III AKI (odds ratios 1.32-1.81) than in those with Stage I AKI (odds ratios 1.18-1.32), using non-AKI as the reference group. Results were consistent regardless of diabetes status or baseline ACEI/ARB use. Thus, AKI is a risk factor for incident or worsening proteinuria, suggesting a possible mechanism linking AKI and future CKD. The type of proteinuria, its physiology, and its clinical significance warrant further study as a potentially modifiable risk factor in the pathway from AKI to CKD.
Published by Elsevier Inc.
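The matched-cohort comparison above used inverse probability sampling weights; a minimal sketch of how such weights are formed from a propensity score, using hypothetical scores rather than the study's model:

```python
def ipw_weights(exposed, propensity):
    """Inverse probability weights: 1/p for exposed patients (AKI),
    1/(1-p) for unexposed, where p = P(AKI | covariates)."""
    return [1 / p if e else 1 / (1 - p) for e, p in zip(exposed, propensity)]

# Hypothetical: two AKI patients and two matched non-AKI patients
w = ipw_weights([1, 1, 0, 0], [0.25, 0.5, 0.25, 0.5])
```

Weighting up patients who were unlikely to receive their observed exposure makes the weighted groups resemble a cohort in which exposure is independent of the measured covariates.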
BACKGROUND - Perforation during colonoscopy is a rare but well-recognized complication with significant morbidity and mortality. We aimed to systematically review the currently available literature concerning the care and outcomes of colonic perforation and to create an algorithm to guide practitioners in managing this challenging clinical scenario.
DATA SOURCES - A systematic review of the literature based on PRISMA-P guidelines was performed. We evaluated 31 articles, focusing on findings from the past 10 years.
CONCLUSION - Colonoscopic perforation is a rare event, and published management techniques are marked by their heterogeneity. Reliable conclusions are limited by the nature of the data available, mainly single-institution, retrospective studies. Consensus findings include a higher rate of perforation from therapeutic than from diagnostic colonoscopy and the sigmoid colon as the most common site of perforation. Mortality appears driven by pre-existing conditions. Treatment must be tailored to the patient's comorbidities and clinical status, as well as the specific circumstances of the colonoscopy that led to the perforation.
Copyright © 2017 Elsevier Inc. All rights reserved.