BACKGROUND - Patients hospitalized with acute kidney injury (AKI) are at increased risk for accelerated loss of kidney function, morbidity, and mortality. We sought to inform efforts at improving post-AKI outcomes by describing the receipt of renal-specific laboratory test surveillance among a large high-risk cohort.
METHODS - We acquired clinical data from the electronic health records (EHRs) of 5 Veterans Affairs (VA) hospitals to identify patients hospitalized with AKI from January 1, 2002 to December 31, 2009, and followed these patients for 1 year or until death, enrollment in palliative care, or improvement in renal function to an estimated GFR (eGFR) ≥ 60 mL/min/1.73 m². Using demographic data, administrative codes, and laboratory test data, we evaluated the receipt and timing of outpatient testing for serum creatinine concentration as well as the quantitative proteinuria assessment recommended for chronic kidney disease (CKD) risk stratification. Additionally, we reported the rates of the phosphorus and parathyroid hormone (PTH) monitoring recommended for patients with CKD.
RESULTS - A total of 10,955 patients admitted with AKI were discharged with an eGFR < 60 mL/min/1.73 m². During outpatient follow-up at 90 and 365 days, respectively, creatinine was measured in 69% and 85% of patients, quantitative proteinuria in 6% and 12%, and PTH or phosphorus in 10% and 15%.
CONCLUSIONS - Measurement of creatinine was common among all patients following AKI. However, patients with AKI were infrequently monitored with assessments of quantitative proteinuria or mineral metabolism disorders, even those with baseline kidney disease.
PURPOSE - To examine potential modifying effects of body weight and bilateral oophorectomy on the association of hormone replacement therapy (HRT) with risk of breast cancer, overall and by subtypes according to status of estrogen receptor (ER), progesterone receptor (PR), and human epidermal growth factor receptor 2 (HER2), among postmenopausal women.
EXPERIMENTAL DESIGN - This analysis included 2,510 postmenopausal white women recruited in the Nashville Breast Health Study, a population-based case-control study of breast cancer. Multivariable logistic regression was used to estimate odds ratios (ORs) and 95% confidence intervals (CIs) for associations between HRT use and risk of breast cancer, overall and by subtype, adjusted for age and education.
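To illustrate the form of the estimates reported below, the following sketch computes an unadjusted odds ratio with a Wald 95% CI from a hypothetical 2×2 exposure table. The counts are invented for illustration only (they are not the study's data), and the published ORs were additionally covariate-adjusted via logistic regression:

```python
import math

# Hypothetical 2x2 table (NOT the study's data); exposure = ever-use of HRT
cases_exposed, cases_unexposed = 120, 80        # breast cancer cases
controls_exposed, controls_unexposed = 100, 130  # controls

# Unadjusted odds ratio from the cross-product of the table
odds_ratio = (cases_exposed * controls_unexposed) / (cases_unexposed * controls_exposed)

# Wald 95% CI computed on the log-odds scale
se_log_or = math.sqrt(sum(1 / n for n in (cases_exposed, cases_unexposed,
                                          controls_exposed, controls_unexposed)))
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR, {odds_ratio:.2f}; 95% CI, {ci_low:.2f}-{ci_high:.2f}")
```

An OR above 1 with a CI excluding 1 (as in this sketch) corresponds to the "increased risk" language used in the results.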
RESULTS - Among women with natural menopause and body mass index (BMI) < 25 kg/m², ever-use of HRT was associated with increased breast cancer risk (OR, 1.95; 95% CI, 1.32-2.88), and risk increased with duration of HRT use (P for trend = 0.002). Similar patterns of association were found for the ER+, ER+PR+, and luminal A subtypes, but not for ER-, ER-PR-, and triple-negative cancer. In contrast, ever-use of HRT in overweight women (BMI ≥ 25 kg/m²) showed no association with breast cancer risk overall or by subtype; interaction tests for the modifying effect of BMI were statistically significant. Ever-use of HRT was associated with decreased breast cancer risk (OR, 0.70; 95% CI, 0.38-1.31) among women with prior bilateral oophorectomy but with elevated risk (OR, 1.45; 95% CI, 0.92-2.29) among those with hysterectomy without bilateral oophorectomy (P for interaction = 0.057). Similar associations were seen for virtually all breast cancer subtypes, although interaction tests were statistically significant only for ER+ and luminal A.
CONCLUSION - Body weight and bilateral oophorectomy modify associations between HRT use and breast cancer risk, especially the risk of hormone receptor-positive tumors.
OBJECTIVE - We sought to determine the impact of rotavirus vaccine implementation on gastroenteritis (GE)-related calls to a large telephone triage service in Tennessee.
METHODS - Total and GE-related calls received by the Vanderbilt Telephone Triage Program for children <5 years of age were examined from May 1, 2004 to April 30, 2010. Time-series adapted Poisson regression models were used to compare weekly GE-related call proportions between the prevaccine (May 2004 to April 2007) and postlicensure (May 2007 to April 2010) periods. Separate models compared GE-related call proportions in the historical rotavirus (February to April) and nonrotavirus (May to January) seasons. Associations between call data and laboratory-confirmed rotavirus detections and regionally reported norovirus activity were also assessed.
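The headline comparison reduces to a ratio of GE-related call proportions between periods. A minimal sketch with hypothetical aggregate counts (not the study's data; the published models also accounted for time-series structure and report confidence intervals):

```python
# Hypothetical aggregate call counts (NOT the study's data)
pre_ge, pre_total = 3250, 26000     # prevaccine period: GE-related / all calls
post_ge, post_total = 2990, 26000   # postlicensure period

pre_prop = pre_ge / pre_total       # weekly-average GE-related call proportion
post_prop = post_ge / post_total

# Ratio of GE-related call proportions, postlicensure vs. prevaccine;
# a ratio below 1 corresponds to a percentage decline
ratio = post_prop / pre_prop
decline_pct = (1 - ratio) * 100
print(f"proportion ratio {ratio:.2f}; decline {decline_pct:.0f}%")
```

With these illustrative counts the ratio is 0.92, i.e., an 8% decline in the GE-related call proportion, the same scale on which the study's estimates are reported.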
RESULTS - There were 156,362 total calls and 19,731 GE-related calls. Annual GE-related call proportions declined by 8% (95% confidence interval, 3%-12%) in the postlicensure period; declines ranging from 23% to 31% occurred during the historical rotavirus season in all 3 postlicensure years. No declines occurred in the nonrotavirus season. After vaccine licensure, reductions in laboratory-confirmed rotavirus activity were associated with declines in GE-related call proportions. Peak GE-related call proportions in the postlicensure period occurred earlier than in prevaccine years and were not strongly associated with laboratory-confirmed rotavirus but instead correlated well with norovirus outbreaks.
CONCLUSIONS - A decline in GE-related call proportions among young children after rotavirus vaccine licensure was documented by using a novel surveillance platform that captures mild GE not detected in other surveillance systems. Since rotavirus vaccine licensure, peak call proportions correlate with regional norovirus activity, highlighting the role of that pathogen in community GE.