The publication data currently available has been vetted by Vanderbilt faculty, staff, administrators and trainees. The data itself is retrieved directly from NCBI's PubMed and is automatically updated on a weekly basis to ensure accuracy and completeness.
Importance - Polygenic risk scores comprising millions of single-nucleotide polymorphisms (SNPs) could be useful for population-wide coronary heart disease (CHD) screening.
Objective - To determine whether a polygenic risk score improves prediction of CHD compared with a guideline-recommended clinical risk equation.
Design, Setting, and Participants - Retrospective cohort study assessing the predictive accuracy of a previously validated polygenic risk score among 4847 adults of white European ancestry, aged 45 through 79 years, participating in the Atherosclerosis Risk in Communities (ARIC) study and 2390 participating in the Multi-Ethnic Study of Atherosclerosis (MESA) from 1996 through December 31, 2015, the final day of follow-up. The performance of the polygenic risk score was compared with that of the 2013 American College of Cardiology and American Heart Association pooled cohort equations.
Exposures - Genetic risk was computed for each participant by summing the product of the weights and allele dosage across 6 630 149 SNPs. Weights were based on an international genome-wide association study.
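The score described here is a weighted allele-dosage sum, later standardized so that hazard ratios can be reported per SD increment. A minimal sketch (identifiers and data shapes are illustrative, not from the study):

```python
from statistics import mean, stdev

def polygenic_risk_score(dosages, weights):
    """Sum of GWAS effect-size weight x effect-allele dosage across SNPs.

    `dosages` maps SNP id -> dosage (0-2 copies of the effect allele,
    possibly fractional when imputed); `weights` maps SNP id -> weight.
    """
    return sum(weights[snp] * dose for snp, dose in dosages.items())

def per_sd_increments(scores):
    """Standardize raw scores to SD units, matching hazard ratios
    reported per SD increment of the risk score."""
    m, s = mean(scores), stdev(scores)
    return [(x - m) / s for x in scores]
```

In practice the sum runs over the 6 630 149 SNPs described above rather than a small dictionary, but the arithmetic is the same.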
Main Outcomes and Measures - Prediction of 10-year first CHD events (including myocardial infarctions, fatal coronary events, silent infarctions, revascularization procedures, or resuscitated cardiac arrest) assessed using measures of model discrimination, calibration, and net reclassification improvement (NRI).
Results - The study population included 4847 adults from the ARIC study (mean [SD] age, 62.9 [5.6] years; 56.4% women) and 2390 adults from the MESA cohort (mean [SD] age, 61.8 [9.6] years; 52.2% women). Incident CHD events occurred in 696 participants (14.4%) and 227 participants (9.5%), respectively, over median follow-up of 15.5 years (interquartile range [IQR], 6.3 years) and 14.2 years (IQR, 2.5 years). The polygenic risk score was significantly associated with 10-year CHD incidence, with hazard ratios per SD increment of 1.24 (95% CI, 1.15 to 1.34) in ARIC and 1.38 (95% CI, 1.21 to 1.58) in MESA. Addition of the polygenic risk score to the pooled cohort equations did not significantly increase the C statistic in either cohort (ARIC, change in C statistic, -0.001; 95% CI, -0.009 to 0.006; MESA, 0.021; 95% CI, -0.0004 to 0.043). At the 10-year risk threshold of 7.5%, the addition of the polygenic risk score to the pooled cohort equations did not provide significant improvement in reclassification in either ARIC (NRI, 0.018; 95% CI, -0.012 to 0.036) or MESA (NRI, 0.001; 95% CI, -0.038 to 0.076). The polygenic risk score did not significantly improve calibration in either cohort.
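The categorical NRI reported here counts reclassification across the 7.5% risk threshold, crediting upward moves for events and downward moves for non-events. A minimal sketch of the standard formula (names are illustrative, not the study's code):

```python
def categorical_nri(events, nonevents):
    """Net reclassification improvement over ordered risk categories.

    `events` and `nonevents` are lists of (old_category, new_category)
    pairs, e.g. 0 = below and 1 = above the 7.5% 10-year risk threshold.
    """
    def net_up(pairs):
        up = sum(new > old for old, new in pairs)
        down = sum(new < old for old, new in pairs)
        return (up - down) / len(pairs)
    # Events should move up in predicted risk; non-events should move down.
    return net_up(events) - net_up(nonevents)
```

An NRI near zero, as observed in both cohorts, means the new model moves roughly as many people in the wrong direction as in the right one.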
Conclusions and Relevance - In this analysis of 2 cohorts of US adults, the polygenic risk score was associated with incident coronary heart disease events but did not significantly improve discrimination, calibration, or risk reclassification compared with conventional predictors. These findings suggest that a polygenic risk score may not enhance risk prediction in a general, white middle-aged population.
HIV and hepatitis C virus (HCV) coinfection is associated with poor health outcomes. This study was designed to assess risk factors for and mortality with coinfection before direct-acting antiviral treatment availability in a state with an evolving opioid epidemic. HCV infection was determined from review of the medical record at two clinics serving the majority of people living with HIV (PLWH) in care in Middle Tennessee from 2004 to 2013. Association of potential risk factors with HCV-positivity was assessed using logistic regression. Association of HCV-positivity with mortality was assessed with a Cox proportional hazards model, adjusting for selected covariates. A total of 3,501 patients were included: 24% female; 51% men who have sex with men; 47% white; 44% African American/black; median age of 38 at their first visit; median most recent CD4 count 502 cells/μL (301-716); and HIV viral load 47 copies/mL (39-605); followed for a median of 3.0 (1-5) years. Prevalence of HCV was 13%. Those with a history of injection drug use (IDU) demonstrated the highest odds of HCV-positivity [odds ratio 12.94; 95% confidence interval (CI) 9.39-17.83]. There were 305 deaths; median age at death was 47 years (40-53). HCV coinfection was associated with greater mortality (hazard ratio 1.61; 95% CI 1.20-2.17; P < .001). Among PLWH, HCV coinfection was associated with IDU and was an independent predictor of mortality. These results affirm the importance of HCV coinfection and inform interventions targeting the continuum of HCV care, uptake of HCV treatment, and the impact of drug use in this population.
The stress response system is disrupted in individuals with major depressive disorder (MDD) as well as in those at elevated risk for developing MDD. We examined whether DNA methylation (DNAm) levels of CpG sites within HPA-axis genes predict the onset of MDD. Seventy-seven girls, approximately half (n = 37) of whom were at familial risk for MDD, were followed longitudinally. Saliva samples were taken in adolescence (M age = 13.06 years [SD = 1.52]) when participants had no current or past MDD diagnosis. Diagnostic interviews were administered approximately every 18 months until the first onset of MDD or early adulthood (M age of last follow-up = 19.23 years [SD = 2.69]). We quantified DNAm in saliva samples using the Illumina EPIC chip and examined CpG sites within six key HPA-axis genes (NR3C1, NR3C2, CRH, CRHR1, CRHR2, FKBP5) alongside 59 genotypes for tagging SNPs capturing cis genetic variability. DNAm levels within CpG sites in NR3C1, CRH, CRHR1, and CRHR2 were associated with risk for MDD across adolescence and young adulthood. To rule out the possibility that findings were merely due to the contribution of genetic variability, we re-analyzed the data controlling for cis genetic variation within these candidate genes. Importantly, methylation levels in these CpG sites continued to significantly predict the onset of MDD, suggesting that variation in the epigenome, independent of proximal genetic variants, prospectively predicts the onset of MDD. These findings suggest that variation in the HPA axis at the level of the methylome may predict the development of MDD.
BACKGROUND - It remains unclear whether sepsis-related cardiovascular complications have an adverse impact on survival independent of pre-existing comorbidities. To investigate the survival impact of post-sepsis cardiovascular complications among sepsis survivors, we conducted a population-based study using the National Health Insurance Database of Taiwan.
METHODS - We identified sepsis patients from the National Health Insurance Research Database of Taiwan using ICD-9-CM codes involving infection and organ dysfunction between 2000 and 2011. Post-sepsis incident myocardial infarction (MI) and stroke were ascertained by ICD-9-CM codes and antiplatelet treatment. We constructed a non-sepsis comparison cohort using propensity score matching to ascertain the association between sepsis and cardiovascular complications. Furthermore, we compared the 180-day mortality and 365-day mortality between patients surviving sepsis with or without post-sepsis MI or stroke within 70 days of hospital discharge. We constructed Cox regression models adjusting for pre-existing comorbidities to evaluate the independent survival impact of post-sepsis MI or stroke among sepsis survivors.
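The 1:1 propensity score matching described above pairs each sepsis patient with a non-sepsis patient of similar estimated propensity. One common variant is greedy nearest-neighbour matching within a caliper; the sketch below uses that approach with illustrative parameters, not the authors' actual algorithm:

```python
def greedy_match_1to1(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour match on propensity score.

    `treated` and `controls` are lists of (id, propensity) pairs.
    Returns (treated_id, control_id) pairs; each control is used once.
    The caliper and the high-to-low matching order are illustrative
    choices, not taken from the paper.
    """
    pool = dict(controls)  # control id -> propensity, still available
    pairs = []
    for tid, ts in sorted(treated, key=lambda p: p[1], reverse=True):
        if not pool:
            break
        cid = min(pool, key=lambda c: abs(pool[c] - ts))
        if abs(pool[cid] - ts) <= caliper:
            pairs.append((tid, cid))
            del pool[cid]  # matched controls leave the pool
    return pairs
```

Treated patients with no available control inside the caliper go unmatched, which is why the matched cohort (42,151) can be slightly smaller than the full sepsis cohort (42,316).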
RESULTS - We identified 42,316 patients hospitalized for sepsis, of whom 42,151 were matched 1:1 with 42,151 patients hospitalized without sepsis. Compared to patients hospitalized without sepsis, patients hospitalized with sepsis had an increased risk of MI or stroke (adjusted odds ratio 1.72, 95% CI 1.60-1.85). Among 42,316 patients hospitalized for sepsis, 486 (1.15%) patients developed incident stroke and 108 (0.26%) developed incident MI within 70 days of hospital discharge. Compared to sepsis survivors without cardiovascular complications, sepsis survivors with incident MI or stroke had a higher mortality rate at 180 days (11.68% vs. 4.44%, P = 0.003) and at 365 days (16.75% vs. 7.11%, P = 0.005). Adjusting for age, sex, and comorbidities, post-sepsis MI or stroke was independently associated with increased 180-day (adjusted hazard ratio [HR] 2.16, 95% CI 1.69-2.76) and 365-day (adjusted HR 1.90, 95% CI 1.54-2.32) mortality.
CONCLUSIONS - Compared to sepsis patients without incident MI or stroke, sepsis patients with incident MI or stroke following hospital discharge had an increased risk of mortality for up to 365 days of follow-up. This increased risk cannot be explained by pre-sepsis comorbidities.
STUDY OBJECTIVE - The potential for maternal antidepressant use to influence the risk of spontaneous abortion, one of the most important adverse pregnancy outcomes, is not clear. We aimed to assess whether first trimester antidepressant exposure was associated with an increased risk of spontaneous abortion.
DESIGN - Community-based prospective cohort study (Right from the Start).
SETTING - Eight metropolitan areas in North Carolina, Tennessee, and Texas.
PARTICIPANTS - A total of 5451 women (18 years of age or older) who were planning to conceive or were pregnant (before 12 weeks of completed gestation) and were enrolled in the study between 2000 and 2012; of those women, 223 used antidepressants (selective serotonin reuptake inhibitors [SSRIs] only, SSRIs and non-SSRIs, and non-SSRIs only) during their first trimester, and 5228 did not (never users).
MEASUREMENTS AND MAIN RESULTS - First trimester antidepressant use was determined during a first trimester telephone interview. Spontaneous abortion was self-reported and verified by medical records. The association of first trimester antidepressant use and spontaneous abortion was assessed by using Cox proportional hazards regression. Among the 5451 women enrolled, 223 (4%) reported first trimester antidepressant use, and 659 (12%) experienced a spontaneous abortion. SSRIs were the most common class of antidepressants used (179 [80%]). Compared with women who never used antidepressants during the first trimester of pregnancy, women who reported antidepressant use were 34% (adjusted hazard ratio [aHR] 1.34, 95% confidence interval [CI] 0.97-1.85) more likely to experience a spontaneous abortion after adjusting for covariates. Women who reported ever using SSRIs were 45% (aHR 1.45, 95% CI 1.02-2.06) more likely to experience a spontaneous abortion compared with never users. When time of loss relative to the time of interview was taken into consideration, the association between first trimester SSRI use and spontaneous abortion was significant only among those with losses before the interview (aHR 1.49, 95% CI 1.04-2.13) but was not significant among those with losses after the interview (aHR 0.43, 95% CI 0.06-3.15).
CONCLUSION - The association between use of first trimester antidepressants, particularly SSRI use, and spontaneous abortion was significant only among women whose exposure status was assessed after loss. In this instance, reporting bias may create a spurious association. Future studies should take the timing of data collection relative to the timing of loss into consideration.
© 2019 Pharmacotherapy Publications, Inc.
BACKGROUND - Circulating biomarkers can facilitate diagnosis and risk stratification for complex conditions such as heart failure (HF). Newer molecular platforms can accelerate biomarker discovery, but they require significant resources for data and sample acquisition.
OBJECTIVES - The purpose of this study was to test a pragmatic biomarker discovery strategy integrating automated clinical biobanking with proteomics.
METHODS - Using the electronic health record, the authors identified patients with and without HF, retrieved their discarded plasma samples, and screened these specimens using a DNA aptamer-based proteomic platform (1,129 proteins). Candidate biomarkers were validated in 3 different prospective cohorts.
RESULTS - In an automated manner, plasma samples from 1,315 patients (31% with HF) were collected. Proteomic analysis of a 96-patient subset identified 9 candidate biomarkers (p < 4.42 × 10⁻⁵). Two proteins, angiopoietin-2 and thrombospondin-2, were associated with HF in 3 separate validation cohorts. In an emergency department-based registry of 852 dyspneic patients, the 2 biomarkers improved discrimination of acute HF compared with a clinical score (p < 0.0001) or clinical score plus B-type natriuretic peptide (p = 0.02). In a community-based cohort (n = 768), both biomarkers predicted incident HF independent of traditional risk factors and N-terminal pro-B-type natriuretic peptide (hazard ratio per SD increment: 1.35 [95% confidence interval: 1.14 to 1.61; p = 0.0007] for angiopoietin-2, and 1.37 [95% confidence interval: 1.06 to 1.79; p = 0.02] for thrombospondin-2). Among 30 advanced HF patients, concentrations of both biomarkers declined by 80% to 84% following cardiac transplant (p < 0.001 for both).
CONCLUSIONS - A novel strategy integrating electronic health records, discarded clinical specimens, and proteomics identified 2 biomarkers that robustly predict HF across diverse clinical settings. This approach could accelerate biomarker discovery for many diseases.
Copyright © 2019 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Using an ORF kinome screen in MCF-7 cells treated with the CDK4/6 inhibitor ribociclib plus fulvestrant, we identified FGFR1 as a mechanism of drug resistance. FGFR1-amplified/ER+ breast cancer cells and MCF-7 cells transduced with FGFR1 were resistant to fulvestrant ± ribociclib or palbociclib. This resistance was abrogated by treatment with the FGFR tyrosine kinase inhibitor (TKI) lucitanib. Addition of the FGFR TKI erdafitinib to palbociclib/fulvestrant induced complete responses of FGFR1-amplified/ER+ patient-derived xenografts. Next-generation sequencing of circulating tumor DNA (ctDNA) in 34 patients after progression on CDK4/6 inhibitors identified FGFR1/2 amplification or activating mutations in 14/34 (41%) post-progression specimens. Finally, ctDNA from patients enrolled in MONALEESA-2, the registration trial of ribociclib, showed that patients with FGFR1 amplification exhibited shorter progression-free survival compared with patients with wild-type FGFR1. Thus, we propose that breast cancers with FGFR pathway alterations should be considered for trials using combinations of ER, CDK4/6, and FGFR antagonists.
BACKGROUND - Incidence of oral tongue squamous cell carcinoma (OTC) is rising among those under age 50 years. The etiology is unknown.
METHODS - A total of 395 cases of OTC diagnosed and/or treated at Vanderbilt University Medical Center between 2000 and 2017 were identified. Of those, 113 (28.6%) were early onset (age < 50 years). Logistic regression was used to identify factors associated with early onset OTC. Cox proportional hazards models evaluated survival and recurrence.
RESULTS - Compared to typical onset patients, patients with early onset OTC were more likely to receive multimodality treatment (surgery and radiation; adjusted odds ratio [aOR], 2.7; 95% confidence interval [CI], 1.2-6.3) and report a history of snuff use (aOR, 5.4; 95% CI, 1.8-15.8) and were less likely to report a history of cigarette use (aOR, 0.5; 95% CI, 0.2-0.9). Early onset patients had better overall survival (adjusted hazard ratio, 0.6).
CONCLUSIONS - This is the largest study to evaluate factors associated with early onset OTC and the first to report an association with snuff.
© 2019 Wiley Periodicals, Inc.
OBJECTIVES - To investigate the association between green tea intake and incident kidney stones in two large prospective cohorts.
METHODS - We examined self-reported incident kidney stone risk in the Shanghai Men's Health Study (n = 58 054; baseline age 40-74 years) and the Shanghai Women's Health Study (n = 69 166; baseline age 40-70 years). Information on the stone history and tea intake was collected by in-person surveys. Multivariable Cox proportional hazards models were adjusted for baseline demographic variables, medical history and dietary intakes including non-tea oxalate from a validated food frequency questionnaire.
RESULTS - During 319 211 and 696 950 person-years of follow-up, respectively, 1202 men and 1451 women reported incident stones. Approximately two-thirds of men and one-quarter of women were tea drinkers at baseline, and green tea was the primary type consumed (95% in men, 88% in women). Tea drinkers (men: hazard ratio 0.78, 95% confidence interval 0.69-0.88; women: hazard ratio 0.8, 95% confidence interval 0.77-0.98) and specifically green tea drinkers (men: hazard ratio 0.78, 95% confidence interval 0.69-0.88; women: hazard ratio 0.84, 95% confidence interval 0.74-0.95) had a lower incident risk than never/former drinkers. Compared with never/former drinkers, a stronger dose-response trend was observed for the amount of dried tea leaf consumed per month by men (hazard ratio 0.67, 95% confidence interval 0.56-0.80, P < 0.001) than by women (hazard ratio 0.87, 95% confidence interval 0.70-1.08, P = 0.041).
CONCLUSIONS - Green tea intake is associated with a lower risk of incident kidney stones, and the benefit is observed more strongly among men.
© 2018 The Japanese Urological Association.
AIMS - Well-differentiated small intestinal neuroendocrine tumours (SI-NETs) are often multifocal, and this has been suggested to impart worse disease-free survival. Practice guidelines have not been established for World Health Organisation (WHO) grading of multiple primary lesions.
METHODS AND RESULTS - We identified 68 patients with ileal/jejunal SI-NET for a combined total of 207 primary lesions. Each case was evaluated for patient age and sex; size of all tumours; presence of lymph node metastases, mesenteric tumour deposits or distant metastases; and disease-specific outcome. Ki67 staining was performed on all 207 primary lesions. The relationship between multifocality and clinicopathological factors was compared using Fisher's exact test. Outcome was tested using Cox proportional hazard regression. Forty-two patients had unifocal disease, and 26 had multifocal disease (median five lesions, range = 2-32). Most tumours were WHO grade 1 (201 of 207, 97%). Of the five patients with grades 2/3 tumours, three patients had unifocal disease, one patient had two subcentimetre grade 2 lesions (including the largest) and eight subcentimetre grade 1 lesions, and one patient had one 1.6-cm grade 3 lesion and one subcentimetre grade 1 lesion. There was a positive correlation between tumour size and Ki67 index (coefficient 0.28; 95% confidence interval 0.05-0.52, P = 0.017). There was no significant association between multifocality and nodal metastases, mesenteric tumour deposits, distant metastases or disease-specific survival.
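The 2×2 comparisons above (e.g. multifocality vs. nodal metastases) use Fisher's exact test. The standard two-sided version sums hypergeometric probabilities of all tables at least as extreme as the observed one; a pure-Python sketch (illustrative, not the authors' code):

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_table(x):
        # Hypergeometric probability of x in cell (1,1), margins fixed
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 + col1 - n)  # smallest feasible cell (1,1) count
    hi = min(row1, col1)          # largest feasible cell (1,1) count
    # Sum over all tables no more probable than the observed one
    # (small tolerance guards against floating-point ties).
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)
```

With only 68 patients split between unifocal and multifocal disease, an exact test is preferable to a chi-squared approximation for these sparse tables.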
CONCLUSIONS - In patients with multifocal SI-NET, unless a particular lesion has a high mitotic rate, only staining the largest lesion for Ki67 should serve to grade almost all cases accurately. Multifocality does not appear to significantly impact patient survival.
© 2018 John Wiley & Sons Ltd.