OBJECTIVES - Blood culture contamination is a common problem in the emergency department (ED) that leads to unnecessary patient morbidity and health care costs. The study objective was to develop and evaluate the effectiveness of a quality improvement (QI) intervention for reducing blood culture contamination in an ED.
METHODS - The authors developed a QI intervention to reduce blood culture contamination in the ED and then evaluated its effectiveness in a prospective interrupted time series study. The QI intervention involved changing the technique of blood culture specimen collection from the traditional clean procedure to a new sterile procedure, with standardized use of sterile gloves and a new materials kit containing a 2% chlorhexidine skin antisepsis device, a sterile fenestrated drape, a sterile needle, and a procedural checklist. The intervention was implemented in a university-affiliated ED, and its effect on blood culture contamination was evaluated by comparing the biweekly percentages of blood cultures contaminated during a 48-week baseline period (clean technique) and a 48-week intervention period (sterile technique), using segmented regression analysis with adjustment for secular trends and first-order autocorrelation. The goal was to achieve and maintain a contamination rate below 3%.
RESULTS - During the baseline period, 321 of 7,389 (4.3%) cultures were contaminated, compared to 111 of 6,590 (1.7%) during the intervention period (p < 0.001). In the segmented regression model, the intervention was associated with an immediate 2.9% (95% confidence interval [CI] = 2.2% to 3.2%) absolute reduction in contamination. The contamination rate was maintained below 3% during each biweekly interval throughout the intervention period.
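The segmented regression described above can be illustrated on synthetic biweekly data. This is a minimal sketch, not the study's model: the series, noise level, and coefficients are invented, and the study's adjustment for first-order autocorrelation is omitted for brevity.

```python
# Sketch of segmented (interrupted time series) regression on synthetic
# biweekly contamination percentages. All numbers are illustrative.
import numpy as np

n = 48                                   # 24 baseline + 24 intervention intervals
t = np.arange(n)
post = (t >= 24).astype(float)           # intervention-period indicator
t_post = np.where(post == 1, t - 24, 0)  # time since intervention start

rng = np.random.default_rng(0)
# Synthetic series: ~4.3% baseline, immediate 2.9-point drop, small noise
y = 4.3 - 2.9 * post + rng.normal(0, 0.2, n)

# Design matrix: intercept, secular trend, level change, slope change
X = np.column_stack([np.ones(n), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

level_change = beta[2]  # estimated immediate change at the intervention
print(round(level_change, 1))
```

The coefficient on the intervention indicator captures the immediate level change while the trend terms absorb secular drift, which is the core idea behind the "immediate absolute reduction" reported above.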
CONCLUSIONS - A QI assessment of ED blood culture contamination led to development of a targeted intervention to convert the process of blood culture collection from a clean to a fully sterile procedure. Implementation of this intervention led to an immediate and sustained reduction of contamination in an ED with a high baseline contamination rate.
© 2013 by the Society for Academic Emergency Medicine.
Biofilm formation constitutes an alternative lifestyle in which microorganisms adopt a multicellular behavior that facilitates and/or prolongs survival in diverse environmental niches. Biofilms form on biotic and abiotic surfaces both in the environment and in the healthcare setting. In hospital wards, the formation of biofilms on vents and medical equipment enables pathogens to persist as reservoirs that can readily spread to patients. Inside the host, biofilms allow pathogens to subvert innate immune defenses and are thus associated with long-term persistence. Here we provide a general review of the steps leading to biofilm formation on surfaces and within eukaryotic cells, highlighting several medically important pathogens, and discuss recent advances on novel strategies aimed at biofilm prevention and/or dissolution.
BACKGROUND CONTEXT - Postoperative spine infections have been reported to occur in 1% to 15% of patients and subsequently lead to significant morbidity and cost, with an elevated risk for instrumented cases. Every effort should be made to minimize the risk of intraoperative wound contamination. Consequently, certain practices are followed in the operating room to prevent contamination, many of which are not evidence based. Conversely, certain objects believed to be sterile are frequently overlooked as potential sources of contamination.
PURPOSE - To assess to what degree contamination of spinal implants occurs during spine surgery and evaluate whether coverage of implants alters the rate of contamination.
STUDY DESIGN - Prospective study.
STUDY SAMPLE - This study included 105 consecutive noninfection surgical cases performed by a single spine surgeon that required the use of instrumentation.
OUTCOME MEASURE - Spinal implant contamination.
METHODS - Cases were randomized to have all implant trays either remain uncovered (n=54) or covered (n=51) with sterile surgical towels on opening until implants were required for the case. After the last implant was placed, a sterile culture swab was used to obtain a sample from all open implants that had been present at the start of the case. The paper outer wraps of the implant trays were sampled in each case as a positive control, and an additional 105 swabs were capped immediately after they were opened to obtain negative controls. Swab samples were assessed for bacterial growth on 5% sheep blood Columbia agar plates. Of note, only departmental funding was used and no applicable financial relationships exist with any author.
RESULTS - No growth was observed on any of the 105 negative controls, whereas 99.1% of positive controls demonstrated obvious contamination. Cultures from implant samples demonstrated a 9.5% overall rate of contamination, with 2.0% (n=1) of covered implants versus 16.7% (n=9) of uncovered implants demonstrating contamination. Length of time implant trays were open before sample collection, implant type (plates vs. rods vs. polyetheretherketone), number of scrubbed personnel, and number of implants used were not significantly associated with implant contamination (p>.05). However, coverage of implants significantly reduced the implant contamination rate (p=.016).
CONCLUSIONS - Contamination of sterile implants was found to occur during spine surgery, independent of the amount of time the implant trays remained open. Coverage of implants significantly reduced this contamination. Therefore, regardless of the expected duration of a case, implant coverage is a simple, modifiable way to reduce the risk of intraoperative wound contamination and potentially reduce postoperative infections.
Copyright © 2013 Elsevier Inc. All rights reserved.
STUDY DESIGN - Prospective study.
OBJECTIVE - Assess the contamination rates of sterile microscope drapes after spine surgery.
SUMMARY OF BACKGROUND DATA - The use of the operating microscope has become more prevalent in certain spine procedures, providing superior magnification, visualization, and illumination of the operative field. However, it may represent an additional source of bacterial contamination and increase the risk of developing a postoperative infection.
METHODS - This study included 25 surgical spine cases performed by a single spine surgeon that required the use of the operative microscope. Sterile culture swabs were used to obtain samples from 7 defined locations on the microscope drape after its use during the operation. The undraped technician's console was sampled in each case as a positive control, and an additional 25 microscope drapes were swabbed immediately after they were applied to the microscope to obtain negative controls. Swab samples were assessed for bacterial growth on 5% sheep blood Columbia agar plates using a semiquantitative technique.
RESULTS - No growth was observed on any of the 25 negative control drapes. In contrast, 100% of preoperative and 96% of postoperative positive controls demonstrated obvious contamination. In the postoperative group, all 7 sites of evaluation were found to be contaminated with rates of 12% to 44%. Four of the 7 evaluated locations were found to have significant contamination rates compared with negative controls, including the shafts of the optic eyepieces on the main surgeon side (24%, P = 0.022), "forehead" portion on both the main surgeon (24%, P = 0.022) and assistant sides (28%, P = 0.010), and "overhead" portion of the drape (44%, P = 0.0002).
CONCLUSION - Bacterial contamination of the operative microscope was found to be significant after spine surgery. Contamination was more common around the optic eyepieces, likely due to inadvertent touching of unsterile portions. Similarly, all regions above the eyepieces have a propensity for contamination because of unnoticed contact with unsterile parts of the surgeon. Therefore, we believe that changing gloves after adjusting the optic eyepieces and avoiding handling of any portion of the drape above the eyepieces may decrease the risk of intraoperative contamination and possibly postoperative infection as well.
BACKGROUND AND OBJECTIVES - Higher mortality risk reported with reuse versus single use of dialyzers is potentially related to reuse reagents that modify membrane surface characteristics and the blood-membrane interface. A key mechanism may involve stimulation of an inflammatory response.
DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS - In a prospective crossover design, laboratory markers and mortality from 23 hemodialysis facilities abandoning reuse with peracetic acid mixture were tracked. C-reactive protein (CRP), white blood cell (WBC) count, albumin, and prealbumin were measured for 2 consecutive months before abandoning reuse and subsequently within 3 and 6 months on single use. Survival models were utilized to compare the 6-month period before abandoning reuse (baseline) and the 6-month period on single use of dialyzers after a 3-month "washout period."
RESULTS - Patients from the baseline and single-use periods had a mean age of approximately 63 years; 44% were female, 54% were diabetic, 60% were white, and the mean dialysis vintage was approximately 3.2 years. The unadjusted hazard ratio for death was 0.70, and after case-mix adjustment was 0.74, for single use compared with reuse. Among patients with CRP≥5 mg/L during reuse (mean CRP=26.6 mg/L in April), mean CRP declined on single use to 20.2 mg/L by August and 20.4 mg/L by November. WBC count declined slightly during single use, but nutritional markers were unchanged.
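As a rough aid to reading the hazard ratios above: when hazards are approximately constant over follow-up, an unadjusted hazard ratio is close to the incidence rate ratio (events per unit person-time). A minimal sketch with hypothetical numbers, not the study's data:

```python
# Hedged sketch: approximating an unadjusted hazard ratio by the
# incidence rate ratio under roughly constant hazards.
# All counts and person-times below are hypothetical.
def rate_ratio(events_a, persontime_a, events_b, persontime_b):
    """Incidence rate ratio of group A relative to group B."""
    return (events_a / persontime_a) / (events_b / persontime_b)

# e.g., 70 deaths over 1,000 patient-years (single use) versus
# 100 deaths over 1,000 patient-years (reuse)
irr = rate_ratio(70, 1000, 100, 1000)
print(irr)
```

A ratio below 1 indicates lower mortality in the first group; the study's actual estimates came from survival models with case-mix adjustment, which this simple ratio does not capture.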
CONCLUSIONS - Abandonment of peracetic-acid-based reuse was associated with improved survival and lower levels of inflammatory but not nutritional markers. Further study is needed to evaluate a potential link between dialyzer reuse, inflammation, and mortality.
BACKGROUND - Patients receiving long-term home parenteral nutrition (HPN) have catheter-related infections, reactive depression, and other recurrent problems that decrease their quality of life. The aim of this study was to evaluate the Interactive Educational Videotaped Interventions (IEVI) designed to prevent HPN complications of catheter-related bloodstream infection (CR-BSI), to prevent reactive depression (from Diagnostic and Statistical Manual of Mental Disorders, 4th Edition definition), and to increase patients' frequency of problem-solving with professionals.
METHODS - A randomized placebo-controlled clinical trial was used to test IEVI that engaged patients in infection and depression prevention and problem-solving activities with professionals. The primary outcome measure was CR-BSIs, while reactive depression and problem solving were secondary outcomes. Quality of life and satisfaction with interventions, also secondary outcomes, were evaluated at 18 months.
RESULTS - Compared with controls, the experimental group had a lower frequency of CR-BSIs (chi2 = 4.82, p = .03), reactive depression (chi2 = 4.50, p = .03), and rehospitalization for CR-BSIs (chi2 = 5.73, p = .01). There was greater use of problem solving in the experimental group (chi2 = 4.33, p = .038). These differences occurred at the primary endpoint of 6 months after administration of the interventions. At the 18-month follow-up, there were fewer CR-BSIs (chi2 = 4.42, p = .035) and fewer hospitalizations for infection (chi2 = 5.729, p = .01).
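The group comparisons above are Pearson chi-square tests on 2x2 tables. For a 2x2 table the statistic has a closed form, sketched below with illustrative counts (not the study's data):

```python
# Hedged sketch: Pearson chi-square statistic (no continuity correction)
# for a 2x2 table [[a, b], [c, d]]. Counts below are illustrative only.
def chi2_2x2(a, b, c, d):
    """Closed-form Pearson chi-square for a 2x2 contingency table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# e.g., 3/30 events in an experimental group vs 10/30 in a control group
stat = chi2_2x2(3, 27, 10, 20)
print(round(stat, 2))
```

With 1 degree of freedom, a statistic above about 3.84 corresponds to p < .05, which is how the reported chi2 values map to the quoted p-values.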
CONCLUSIONS - The IEVI reduced CR-BSIs and reactive depression and increased problem solving with professionals. IEVI use can also result in fewer hospitalizations and improved quality of life. Long-term improvement did not occur for reactive depression and problem-solving outcomes because patients used these activities less often over time.
In October 2001, two inhalational anthrax and four cutaneous anthrax cases, resulting from the processing of Bacillus anthracis-containing envelopes at a New Jersey mail facility, were identified. Subsequently, we initiated stimulated passive hospital-based and enhanced passive surveillance for anthrax-compatible syndromes. From October 24 to December 17, 2001, hospitals reported 240,160 visits and 7,109 intensive-care unit admissions in the surveillance area (population 6.7 million persons). Following a change of reporting criteria on November 8, the average number of possible inhalational anthrax reports decreased 83%, from 18 to 3 per day; the proportion of reports requiring follow-up increased from 37% (105/286) to 41% (47/116). Clinical follow-up was conducted on 214 of 464 possible inhalational anthrax patients and 98 possible cutaneous anthrax patients; 49 had additional laboratory testing. No additional cases were identified. Surveillance was essential for verifying the limited scope of the outbreak, though it was labor-intensive. The flexibility of the system allowed interim evaluation, thus improving surveillance efficiency.
BACKGROUND - Procedure instruction for physicians-in-training is usually nonstandardized. The authors observed that during insertion of central venous catheters (CVCs), few physicians used full-size sterile drapes (an intervention proven to reduce the risk for CVC-related infection).
OBJECTIVE - To improve standardization of infection control practices and techniques during invasive procedures.
DESIGN - Nonrandomized pre-post observational trial.
SETTING - Six intensive care units and one step-down unit at Wake Forest University Baptist Medical Center, Winston-Salem, North Carolina.
PARTICIPANTS - Third-year medical students and physicians completing their first postgraduate year.
INTERVENTION - A 1-day course on infection control practices and procedures given in June 1996 and June 1997.
MEASUREMENTS - Surveys assessing physician attitudes toward use of sterile techniques during insertion of CVCs were administered during the baseline year and just before, immediately after, and 6 months after the first course. Preintervention and postintervention use of full-size sterile drapes was measured, and surveillance for vascular catheter-related infection was performed.
RESULTS - The perceived need for full-size sterile drapes was 22% in the year before the course and 73% 6 months after the course (P < 0.001). The perceived need for small sterile towels at the insertion site decreased reciprocally (P < 0.001). Documented use of full-size sterile drapes increased from 44% to 65% (P < 0.001). The rate of catheter-related infection decreased from 4.51 infections per 1000 patient-days before the first course to 2.92 infections per 1000 patient-days 18 months after the first course (average decrease, 3.23 infections per 1000 patient-days; P < 0.01). The estimated cost savings of this 28% decrease was at least $63,000 and may have exceeded $800,000.
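The infection rates above use the standard incidence metric of events per 1000 patient-days. A minimal worked sketch (the counts and patient-day denominators below are hypothetical round numbers chosen only to illustrate the calculation, not the study's raw data):

```python
# Hedged sketch: computing an infection rate per 1000 patient-days.
# Numerators and denominators are hypothetical, for illustration only.
def rate_per_1000_patient_days(infections, patient_days):
    """Incidence rate expressed per 1000 patient-days of exposure."""
    return infections / patient_days * 1000.0

before = rate_per_1000_patient_days(45, 9978)  # ~4.51 per 1000 patient-days
after = rate_per_1000_patient_days(29, 9932)   # ~2.92 per 1000 patient-days
print(round(before, 2), round(after, 2))
```

Normalizing by patient-days rather than patient counts lets periods with different census sizes be compared directly, which is why surveillance studies report rates in this form.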
CONCLUSIONS - Standardization of infection control practices through a course is a cost-effective way to decrease related adverse outcomes. If these findings can be reproduced, this approach may serve as a model for physicians-in-training.