The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators and trainees. The data itself is retrieved directly from NCBI's PubMed and is automatically updated on a weekly basis to ensure accuracy and completeness.
BACKGROUND - Mucus cytokines have been linked to baseline metrics of quality of life and olfactory function in patients with chronic rhinosinusitis (CRS). However, their potential utility in predicting postoperative outcomes has not been assessed. Therefore, in this study we evaluated the role of mucus cytokines in predicting 22-item Sino-Nasal Outcomes Test (SNOT-22) scores after endoscopic sinus surgery (ESS) in a prospective cohort of CRS patients.
METHODS - One hundred forty-seven patients with CRS electing surgical therapy were enrolled in a longitudinal cohort study. Mucus was collected intraoperatively from the middle meatus and tested for interleukin (IL)-1β, IL-2, -4, -5, -6, -7, -8, -9, -10, -12, -13, -17A, and -21; tumor necrosis factor (TNF)-α; interferon-γ; eotaxin; and RANTES (regulated-on-activation, normal T-cell expressed and secreted) expression using a multiplex flow-cytometric bead assay. Sixty-two patients were followed postoperatively (average, 10.2 months) with baseline and follow-up SNOT-22 surveys. Stepwise multivariate linear regression was used to model relationships between baseline cytokines, phenotype, and average postoperative SNOT-22 total and domain scores. A machine learning approach using a random forest algorithm was also used to investigate potential nonlinear relationships.
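The stepwise multivariate linear regression described above can be sketched as forward selection over candidate cytokine predictors, adding at each step the predictor that most improves adjusted R². This is a minimal illustration on synthetic data; the cytokine names and effect directions echo the abstract for flavor only, and the data are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for mucus cytokine levels (hypothetical data).
n = 62
X_all = rng.normal(size=(n, 5))
names = ["IL-5", "IL-2", "IL-13", "TNF-a", "IL-6"]
# Outcome: change in SNOT-22, driven here by IL-5 and IL-2 for illustration.
y = -8.8 * X_all[:, 0] + 7.0 * X_all[:, 1] + rng.normal(scale=5.0, size=n)

def adj_r2(X, y):
    """Adjusted R^2 for an OLS fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    n_obs, k = A.shape
    return 1 - (ss_res / (n_obs - k)) / (ss_tot / (n_obs - 1))

# Forward stepwise selection: add the predictor that most improves adjusted R^2,
# stop when no candidate improves the current model.
selected = []
best = -np.inf
improved = True
while improved:
    improved = False
    for j in range(X_all.shape[1]):
        if j in selected:
            continue
        score = adj_r2(X_all[:, selected + [j]], y)
        if score > best:
            best, best_j, improved = score, j, True
    if improved:
        selected.append(best_j)

print([names[j] for j in selected])
```

The two predictors with real effects dominate the selection; weak or null predictors enter only if they happen to nudge adjusted R² upward.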
RESULTS - IL-5 was an independent predictor of postoperative total SNOT-22 improvement (β = -8.8, p < 0.0001), whereas IL-2 levels predicted postoperative worsening (β = 6.97, p = 0.0015). Similar relationships were also seen for postoperative SNOT-22 domain scores. The overall model was also a significant fit for the data (adjusted R² = 0.398, p < 0.0001). The random forest model similarly identified IL-5, TNF-α, IL-13, and IL-2 as major predictors of postoperative SNOT-22 scores.
CONCLUSION - Mucus cytokine profiles may help identify CRS patients who are likely to obtain postoperative improvement after ESS.
© 2019 ARS-AAOA, LLC.
Greater understanding of the determinants of skeletal fragility is highly sought because bone-affecting diseases and fractures place a great burden on economies, societies, and health care systems. Being a complex, hierarchical composite of collagen type-I and non-stoichiometric substituted hydroxyapatite, bone derives toughness from its organic phase. In this study, we tested whether the early observation of a strong correlation between bone collagen integrity, as measured by thermomechanical methods, and work to fracture holds in a more general and heterogeneous sampling of the population. Neighboring uniform specimens from an established, highly characterized, and previously published collection of human cortical bone samples (femur mid-shaft) were decalcified in EDTA. Fifty-four of the original 62 donors were included (26 males and 28 females; ages 21-101 years; aging, osteoporosis, diabetes, and cancer). Following decalcification, bone collagen was tested using hydrothermal isometric tension (HIT) testing in order to measure the collagen's thermal stability (denaturation temperature, T) and network connectivity (maximum rate of isometric tension generation, Max.Slope). We used linear regression and general linear models (GLMs) with several explanatory variables to determine whether relationships between HIT parameters and generally accepted bone quality factors (e.g., cortical porosity, pentosidine content [pen], pyridinoline content [pyd]), age, and measures of fracture toughness (crack initiation fracture toughness, Kinit, and total energy release/dissipation rate evaluated at the point of unstable fast fracture, J-int) were significant. Bone collagen connectivity (Max.Slope) correlated well with the measures of fracture toughness (R² = 24-35%), and to a lesser degree with bound water fraction (BW; R² = 7.9%) and pore water fraction (PW; R² = 9.1%).
Significant correlations with age, apparent volumetric bone mineral density (vBMD), and mature enzymatic [pyd] and non-enzymatic [pen] collagen crosslinks were not detected. GLMs found that Max.Slope and vBMD (or BW), with or without age as an additional covariate, all significantly explained the variance in Kinit (adjusted R² = 36.7-49.0%). The best-fit model for J-int (adjusted R² = 35.7%) included only age and Max.Slope as explanatory variables, with Max.Slope contributing twice as much as age. Max.Slope and BW without age were also significant predictors of J-int (adjusted R² = 35.5%). In conclusion, bone collagen integrity as measured by thermomechanical methods is a key factor in cortical bone fracture toughness. This study further demonstrates that greater attention should be paid to degradation of the overall organic phase, rather than to a specific biomarker (e.g., [pen]), when seeking to understand elevated fracture rates in aging and disease.
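The GLM comparisons above (e.g., whether age adds explanatory power beyond Max.Slope and vBMD for Kinit) amount to nested-model tests. A minimal sketch using a partial F-test on synthetic stand-in data; the variable names mirror the abstract, but the values are invented, not the study's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the covariates above (hypothetical values).
n = 54
max_slope = rng.normal(size=n)          # HIT network connectivity
vbmd = rng.normal(size=n)               # apparent volumetric BMD
age = rng.uniform(21, 101, size=n)      # donor age, years
k_init = 0.6 * max_slope + 0.4 * vbmd + rng.normal(scale=0.5, size=n)

def rss(X, y):
    """Residual sum of squares and parameter count for an OLS fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r), A.shape[1]

# Partial F-test: does adding age improve the Max.Slope + vBMD model for Kinit?
rss_small, p_small = rss(np.column_stack([max_slope, vbmd]), k_init)
rss_big, p_big = rss(np.column_stack([max_slope, vbmd, age]), k_init)
F = ((rss_small - rss_big) / (p_big - p_small)) / (rss_big / (n - p_big))
print(f"partial F for age: {F:.3f}")
```

A small F (compared against the F distribution with 1 and n - p degrees of freedom) indicates the extra covariate contributes little beyond the smaller model.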
Copyright © 2018 Elsevier Inc. All rights reserved.
BACKGROUND - Life purpose in acute low back pain patients is not well described in published literature.
METHODS/PURPOSE - We used linear regression models to describe the relationship of life purpose with perceived functional disability and depression in persons with acute low back pain (N = 42) participating in a randomized clinical trial to prevent transition to chronic low back pain.
RESULTS - In our predominantly female sample (81.8%) with a mean age of 53 years (SD = 11.6 years), 52% worked full-time. Adjusting for age, gender, and working status, life purpose was a significant correlate of depression (p = .007). For every 10-unit increase in life purpose score, the estimated depression score decreased by almost 2.5 points. A significant relationship between life purpose and perceived functional disability was not identified.
CONCLUSION - Life purpose is likely a modifiable risk factor for depression in acute low back pain patients.
BACKGROUND - Mechanisms underlying the association between age-related arterial stiffening and poor brain health remain elusive. Cerebral blood flow (CBF) homeostasis may be implicated. This study evaluates how aortic stiffening relates to resting CBF and cerebrovascular reactivity (CVR) in older adults.
METHODS - Vanderbilt Memory & Aging Project participants free of clinical dementia, stroke, and heart failure were studied, including older adults with normal cognition (n=155; age, 72±7 years; 59% male) or mild cognitive impairment (n=115; age, 73±7 years; 57% male). Aortic pulse wave velocity (PWV; meters per second) was quantified from cardiac magnetic resonance. Resting CBF (milliliters per 100 g per minute) and CVR (CBF response to hypercapnic normoxia stimulus) were quantified from pseudocontinuous arterial spin labeling magnetic resonance imaging. Linear regression models related aortic PWV to regional CBF, adjusting for age, race/ethnicity, education, Framingham Stroke Risk Profile (diabetes mellitus, smoking, left ventricular hypertrophy, prevalent cardiovascular disease, atrial fibrillation), hypertension, body mass index, apolipoprotein E4 (APOE ε4) status, and regional tissue volume. Models were repeated testing PWV × APOE ε4 interactions. Sensitivity analyses excluded participants with prevalent cardiovascular disease and atrial fibrillation.
RESULTS - Among participants with normal cognition, higher aortic PWV related to lower frontal lobe CBF (β=-0.43; P=0.04) and higher CVR in the whole brain (β=0.11; P=0.02), frontal lobes (β=0.12; P<0.05), temporal lobes (β=0.11; P=0.02), and occipital lobes (β=0.14; P=0.01). Among APOE ε4 carriers with normal cognition, findings were more pronounced with higher PWV relating to lower whole-brain CBF (β=-1.16; P=0.047), lower temporal lobe CBF (β=-1.81; P=0.004), and higher temporal lobe CVR (β=0.26; P=0.08), although the last result did not meet the a priori significance threshold. Results were similar in sensitivity models. Among participants with mild cognitive impairment, higher aortic PWV related to lower CBF in the occipital lobe (β=-0.70; P=0.02), but this finding was attenuated when participants with prevalent cardiovascular disease and atrial fibrillation were excluded. Among APOE ε4 carriers with mild cognitive impairment, findings were more pronounced with higher PWV relating to lower temporal lobe CBF (β=-1.20; P=0.02).
CONCLUSIONS - Greater aortic stiffening relates to lower regional CBF and higher CVR in cognitively normal older adults, especially among individuals with increased genetic predisposition for Alzheimer's disease. Central arterial stiffening may contribute to reductions in regional CBF despite preserved cerebrovascular reserve capacity.
We report a web-based tool for analysis of experiments using indirect calorimetry to measure physiological energy balance. CalR simplifies the process of importing raw data files, generating plots, and determining the most appropriate statistical tests for interpretation. Analysis using the generalized linear model (which includes ANOVA and ANCOVA) allows for flexibility in interpreting diverse experimental designs, including those of obesity and thermogenesis. Users also may produce standardized output files for an experiment that can be shared and subsequently re-evaluated using CalR. This framework will provide the transparency necessary to enhance consistency, rigor, and reproducibility. The CalR analysis software will greatly increase the speed and efficiency with which metabolic experiments can be organized, analyzed per accepted norms, and reproduced and will likely become a standard tool for the field. CalR is accessible at https://CalRapp.org/.
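ANCOVA, as noted above, is a special case of the generalized linear model: a group effect estimated while adjusting for a continuous covariate such as body mass. A minimal sketch on synthetic calorimetry-like data; this is not CalR's implementation, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical indirect-calorimetry example: energy expenditure (EE) compared
# between two groups with body mass as a covariate, i.e. ANCOVA expressed as
# an ordinary linear model.
n = 20
group = np.repeat([0, 1], n)               # 0 = control, 1 = treatment
mass = rng.normal(30.0, 3.0, size=2 * n)   # body mass (arbitrary units)
ee = 0.5 * mass + 2.0 * group + rng.normal(scale=1.0, size=2 * n)

# Design matrix: intercept, group indicator, mass covariate.
A = np.column_stack([np.ones(2 * n), group, mass])
beta, *_ = np.linalg.lstsq(A, ee, rcond=None)
print(f"mass-adjusted group effect on EE: {beta[1]:.2f}")
```

The fitted group coefficient recovers the simulated treatment effect after removing the mass-driven variation, which is exactly why ANCOVA is preferred over ratio-based normalization in energy-balance studies.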
Copyright © 2018 Elsevier Inc. All rights reserved.
PURPOSE - We sought to determine whether women with overactive bladder who required third line therapy would demonstrate greater central sensitization, indexed by temporal summation to heat pain stimuli, than women with overactive bladder who did not require such therapy.
MATERIALS AND METHODS - We recruited 39 women with overactive bladder from the urology clinic who were planning to undergo interventional therapy for medication refractory overactive bladder with onabotulinumtoxinA bladder injection or sacral neuromodulation. We also recruited 55 women with overactive bladder who were newly seen at our urology clinic or who responded to advertisements for study participation. Participants underwent quantitative sensory testing using a thermal temporal summation protocol. The primary study outcome was the degree of temporal summation as reflected in the magnitude of positive slope of the line fit to the series of 10 stimuli at a 49°C target temperature. We compared the degree of temporal summation between the study groups using linear regression.
RESULTS - Women in the group undergoing third line therapy showed significantly higher standardized temporal summation slopes than those in the nontreatment group (β = 1.57, 95% CI 0.18-2.96, t = 2.25, p = 0.027). On exploratory analyses a history of incontinence surgery or hysterectomy was associated with significantly greater temporal summation.
CONCLUSIONS - In this study the degree of temporal summation was elevated in women undergoing third line overactive bladder therapy compared to women with overactive bladder who were not undergoing that therapy. These findings suggest there may be pathophysiological differences, specifically in afferent nerve function and processing, in some women with overactive bladder.
Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
Diastolic dysfunction (DD), an abnormality in cardiac left ventricular (LV) chamber compliance, is associated with increased morbidity and mortality. Although DD has been extensively studied in older populations, co-morbidity patterns are less well characterized in middle-aged subjects. We screened 156,434 subjects with transthoracic echocardiogram reports available through Vanderbilt's electronic health record and identified 6,612 subjects 40 to 55 years old with an LV ejection fraction ≥50% and diastolic function staging. We tested 452 incident and prevalent clinical diagnoses for associations with early-stage DD (n = 1,676) versus normal function. There were 44 co-morbid diagnoses associated with grade 1 DD including hypertension (odds ratio [OR] = 2.02, 95% confidence interval [CI] 1.78 to 2.28, p < 5.3 × 10⁻²⁹), type 2 diabetes (OR 1.96, 95% CI 1.68 to 2.29, p = 2.1 × 10⁻¹⁷), tachycardia (OR 1.38, 95% CI 0.53 to 2.19, p = 2.9 × 10⁻⁶), obesity (OR 1.76, 95% CI 1.51 to 2.06, p = 1.7 × 10⁻¹²), and clinical end points, including end-stage renal disease (OR 3.29, 95% CI 2.19 to 4.96, p = 1.2 × 10⁻⁸) and stroke (OR 1.5, 95% CI 1.12 to 2.02, p = 6.9 × 10⁻³). Among the 60 incident diagnoses associated with DD, heart failure with preserved ejection fraction (OR 4.63, 95% CI 3.39 to 6.32, p = 6.3 × 10⁻²²) had the most significant association. Among subjects with normal diastolic function and blood pressure at baseline, a blood pressure measurement in the hypertensive range at the time of the second echocardiogram was associated with progression to stage 1 DD (p = 0.04). In conclusion, DD was common among subjects 40 to 55 years old and was associated with a heavy burden of co-morbid disease.
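The odds ratios above summarize 2×2 associations between a diagnosis and diastolic dysfunction. A minimal sketch of an odds ratio with a Woolf 95% confidence interval; the counts below are made up for illustration, not taken from the study:

```python
import math

# 2x2 table of hypothetical counts:
#                  DD     normal
exposed = (400, 600)      # e.g. diagnosis present
unexposed = (1276, 4336)  # diagnosis absent
a, b = exposed
c, d = unexposed

# Odds ratio and Woolf (log-scale normal approximation) 95% CI.
or_ = (a * d) / (b * c)
se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_) - 1.96 * se_log)
hi = math.exp(math.log(or_) + 1.96 * se_log)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A CI that excludes 1 indicates a significant association at the 0.05 level, matching how the comorbidity results above are reported.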
Copyright © 2018 Elsevier Inc. All rights reserved.
BACKGROUND - Delirium during critical illness results from numerous insults, which might be interconnected and yet individually contribute to long-term cognitive impairment. We sought to describe the prevalence and duration of clinical phenotypes of delirium (ie, phenotypes defined by clinical risk factors) and to understand associations between these clinical phenotypes and severity of subsequent long-term cognitive impairment.
METHODS - In this multicentre, prospective cohort study, we included adult (≥18 years) medical or surgical ICU patients with respiratory failure, shock, or both as part of two parallel studies: the Bringing to Light the Risk Factors and Incidence of Neuropsychological Dysfunction in ICU Survivors (BRAIN-ICU) study, and the Delirium and Dementia in Veterans Surviving ICU Care (MIND-ICU) study. We assessed patients at least once a day for delirium using the Confusion Assessment Method-ICU and identified a priori-defined, non-mutually exclusive phenotypes of delirium according to the presence of hypoxia, sepsis, sedative exposure, or metabolic (eg, renal or hepatic) dysfunction. We considered delirium in the absence of hypoxia, sepsis, sedation, and metabolic dysfunction to be unclassified. At 3 and 12 months after discharge, we assessed cognition with the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS). We used multiple linear regression to separately analyse associations between the duration of each phenotype of delirium and RBANS global cognition scores at 3-month and 12-month follow-up, adjusting for potential confounders.
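The a priori, non-mutually exclusive phenotype assignment can be sketched as tagging each delirium day with every risk factor present that day, falling back to "unclassified" when none apply. The function and field names below are illustrative, not the study's actual variables:

```python
def delirium_phenotypes(hypoxia: bool, sepsis: bool,
                        sedation: bool, metabolic: bool) -> set:
    """Tag one delirium day with every applicable phenotype.

    Phenotypes are non-mutually exclusive: a single day can carry several
    tags, and a day with no identified risk factor is 'unclassified'.
    """
    tags = set()
    if hypoxia:
        tags.add("hypoxic")
    if sepsis:
        tags.add("septic")
    if sedation:
        tags.add("sedative-associated")
    if metabolic:
        tags.add("metabolic")
    return tags or {"unclassified"}

# A day with both hypoxia and sedative exposure counts toward both phenotypes,
# which is why phenotype-days in the abstract sum to more than delirium days.
print(delirium_phenotypes(True, False, True, False))
```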
FINDINGS - Between March 14, 2007, and May 27, 2010, 1048 participants were enrolled, eight of whom could not be analysed. Of 1040 participants, 708 survived to 3 months of follow-up and 628 to 12 months. Delirium was common, affecting 740 (71%) of 1040 participants at some point during the study and occurring on 4187 (31%) of all 13 434 participant-days. A single delirium phenotype was present on only 1355 (32%) of all 4187 participant-delirium days, whereas two or more phenotypes were present during 2832 (68%) delirium days. Sedative-associated delirium was most common (present during 2634 [63%] delirium days), and a longer duration of sedative-associated delirium predicted a worse RBANS global cognition score 12 months later, after adjusting for covariates (difference in score comparing 3 days vs 0 days: -4·03, 95% CI -7·80 to -0·26). Similarly, longer durations of hypoxic delirium (-3·76, 95% CI -7·16 to -0·37), septic delirium (-3·67, -7·13 to -0·22), and unclassified delirium (-4·70, -7·16 to -2·25) also predicted worse cognitive function at 12 months, whereas duration of metabolic delirium did not (1·14, -0·12 to 3·01).
INTERPRETATION - Our findings suggest that clinicians should consider sedative-associated, hypoxic, and septic delirium, which often co-occur, as distinct indicators of acute brain injury and seek to identify all potential risk factors that may affect long-term cognitive impairment, especially those that are iatrogenic and potentially modifiable, such as sedation.
FUNDING - National Institutes of Health and the Department of Veterans Affairs.
Copyright © 2018 Elsevier Ltd. All rights reserved.
Background - Trauma-related hospitalizations drive a high percentage of health care expenditure and inpatient resource consumption, which is directly related to length of stay (LOS). Robust and reliable interactions among health care employees can reduce LOS. However, little is known about whether certain patterns of interactions exist and how they relate to LOS and its variability. The objective of this study is to learn interaction patterns and quantify their relationship to LOS within a mature trauma system and a long-standing electronic medical record (EMR).
Methods - We adapted a spectral co-clustering methodology to infer the interaction patterns of health care employees based on the EMR of 5588 hospitalized adult trauma survivors. The relationship between interaction patterns and LOS was assessed via a negative binomial regression model. We further assessed the influence of potential confounders: age, number of health care encounters to date, number of access action types care providers committed to patient EMRs, month of admission, phenome-wide association study codes, procedure codes, and insurance status.
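Spectral co-clustering of a patient-by-employee interaction matrix can be sketched with Dhillon's bipartite spectral method: normalize the matrix by row and column degrees and split rows and columns by the sign of the second singular vectors. A minimal two-cluster illustration on a synthetic block-structured matrix; the real analysis operates on EMR access logs, not this toy data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic patient-by-employee interaction counts with planted structure:
# the first 10 patients interact mostly with the first 6 employees, the
# rest with the remaining 6.
A = rng.random((20, 12)) * 0.05
A[:10, :6] += 1.0
A[10:, 6:] += 1.0

# Degree-normalize: An = D1^{-1/2} A D2^{-1/2}.
d1 = np.sqrt(A.sum(axis=1))
d2 = np.sqrt(A.sum(axis=0))
An = A / d1[:, None] / d2[None, :]

# The second left/right singular vectors partition rows (patients) and
# columns (employees) into co-clusters by sign.
U, s, Vt = np.linalg.svd(An)
row_labels = (U[:, 1] > 0).astype(int)
col_labels = (Vt[1] > 0).astype(int)
print(row_labels, col_labels)
```

With more than two clusters, one clusters the coordinates from several singular vectors (e.g., with k-means) instead of splitting by sign.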
Results - Three types of interaction patterns were discovered. The first pattern exhibited the most collaboration between employees and was associated with the shortest LOS. Compared to this pattern, LOS for the second and third patterns was 0.61 days (P = 0.014) and 0.43 days (P = 0.037) longer, respectively. Although the 3 interaction patterns dealt with different numbers of patients in each admission month, our results suggest that care was provided for similar patients.
Discussion - The results of this study indicate there is an association between LOS and the extent to which health care employees interact in the care of an injured patient. The findings further suggest that there is merit in ascertaining the content of these interactions and the factors that induce these differences in interaction patterns within a trauma system.
Computational protein design has been successful in modeling fixed backbone proteins in a single conformation. However, when modeling large ensembles of flexible proteins, current methods in protein design have been insufficient. Large barriers in the energy landscape are difficult to traverse while redesigning a protein sequence, and as a result current design methods only sample a fraction of available sequence space. We propose a new computational approach that combines traditional structure-based modeling using the Rosetta software suite with machine learning and integer linear programming to overcome limitations in the Rosetta sampling methods. We demonstrate the effectiveness of this method, which we call BROAD, by benchmarking its performance on increasing the predicted breadth of anti-HIV antibodies. We use this novel method to increase the predicted breadth of the naturally occurring antibody VRC23 against a panel of 180 divergent HIV viral strains and achieve 100% predicted binding against the panel. In addition, we compare the performance of this method to state-of-the-art multistate design in Rosetta and show that we can outperform the existing method significantly. We further demonstrate that sequences recovered by this method recover known binding motifs of broadly neutralizing anti-HIV antibodies. Finally, our approach is general and can be extended easily to other protein systems. Although our modeled antibodies were not tested in vitro, we predict that these variants would have greatly increased breadth compared to the wild-type antibody.
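The selection step that BROAD formulates as an integer linear program, choosing one residue per position to maximize a learned score subject to constraints, can be illustrated on a toy instance small enough to solve by exhaustive enumeration instead of an ILP solver. The positions, scores, and mutation budget below are invented for illustration only:

```python
from itertools import product

# Toy sequence-selection problem: pick one residue per position to maximize
# a (hypothetical) learned per-position score, subject to at most `max_mut`
# changes from wild type. For this tiny instance, brute-force enumeration
# yields the same optimum an ILP solver would.
wild_type = "AKG"
choices = {0: "AV", 1: "KR", 2: "GE"}      # allowed residues per position
score = {(0, "A"): 0.0, (0, "V"): 0.8,
         (1, "K"): 0.0, (1, "R"): 0.5,
         (2, "G"): 0.0, (2, "E"): 0.9}
max_mut = 1

best_seq, best_score = None, float("-inf")
for seq in product(*(choices[i] for i in range(len(wild_type)))):
    muts = sum(a != w for a, w in zip(seq, wild_type))
    if muts > max_mut:
        continue  # violates the mutation-budget constraint
    s = sum(score[(i, a)] for i, a in enumerate(seq))
    if s > best_score:
        best_seq, best_score = "".join(seq), s

print(best_seq, best_score)  # best single mutation under the budget
```

At realistic scale the search space is exponential in sequence length, which is why an ILP formulation with binary residue-choice variables and a linear mutation-budget constraint is used instead of enumeration.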