INTRODUCTION - Few valid and reliable grading checklists have been published for evaluating performance during simulated high-stakes perioperative event management. The purposes of this study were therefore to construct valid scoring checklists for a variety of perioperative emergencies and to determine the reliability of the scores these checklists produce during continuous video review.
METHODS - A group of anesthesiologists, intensivists, and educators created a set of simulation grading checklists for the assessment of the following scenarios: severe anaphylaxis, cerebrovascular accident, hyperkalemic arrest, malignant hyperthermia, and acute coronary syndrome. Checklist items were coded as critical or noncritical. Nonexpert raters evaluated 10 simulation videos in a random order, with each video being graded 4 times. A group of faculty experts also graded the videos to create a reference standard to which nonexpert ratings were compared. P < 0.05 was considered significant.
RESULTS - The expert panel scored team leaders in the simulation videos as having performed 56.5% of all checklist items (range, 43.8%-84.0%) and 67.2% of the critical items (range, 30.0%-100%). Nonexpert raters agreed with the expert assessment 89.6% of the time (95% confidence interval, 87.2%-91.6%). No learning curve was observed with repeated video assessment or checklist use. The κ values comparing nonexpert ratings to the reference standard averaged 0.76 (95% confidence interval, 0.71-0.81).
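The abstract reports both raw percent agreement and κ; the κ statistic here refers to Cohen's kappa, which discounts agreement expected by chance. A minimal sketch of how both quantities can be computed for binary checklist items (performed/not performed) is shown below; the rating vectors are illustrative examples, not study data.

```python
# Sketch (not the study's actual analysis code): percent agreement and
# Cohen's kappa between one nonexpert rater and the expert reference
# standard, with each checklist item coded 1 = performed, 0 = not performed.

def cohens_kappa(a, b):
    """Cohen's kappa for two binary rating vectors of equal length."""
    n = len(a)
    # Observed proportion of items on which the two raters agree.
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    # Marginal probability that each rater marks an item as performed.
    pa1 = sum(a) / n
    pb1 = sum(b) / n
    # Agreement expected by chance from the marginals.
    p_expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical item-level ratings for one video.
expert    = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
nonexpert = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]

agreement = sum(x == y for x, y in zip(expert, nonexpert)) / len(expert)
kappa = cohens_kappa(expert, nonexpert)
print(agreement)  # raw percent agreement (here 0.8)
print(kappa)
```

Because both raters mark some items as performed at similar rates, chance agreement is substantial, and κ is noticeably lower than raw agreement, which is why the study's mean κ of 0.76 is a stricter reliability measure than the 89.6% agreement figure.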
CONCLUSIONS - The findings indicate that the grading checklists described here are valid and reliable and could be used in the assessment of perioperative crisis management.