• Simul Healthc · Oct 2014

    Validity and reliability assessment of detailed scoring checklists for use during perioperative emergency simulation training.

    • Matthew D McEvoy, William R Hand, Cory M Furse, Larry C Field, Carlee A Clark, Vivek K Moitra, Paul J Nietert, Michael F O'Connor, and Mark E Nunnally.
    • From the Department of Anesthesiology (M.D.M.), Vanderbilt University Medical Center, Nashville, TN; Departments of Anesthesia and Perioperative Medicine (W.R.H., C.M.F., L.C.F., C.A.C.), and Public Health Sciences (P.J.N.), Medical University of South Carolina, Charleston, SC; Department of Anesthesiology (V.K.M.), Columbia University Medical Center, New York, NY; and Section of Critical Care Medicine (M.F.O.) Department of Anesthesia and Critical Care (M.E.N.), University of Chicago, Chicago, IL.
    • Simul Healthc. 2014 Oct 1;9(5):295-303.

    Introduction: Few valid and reliable grading checklists have been published for the evaluation of performance during simulated high-stakes perioperative event management. As such, the purposes of this study were to construct valid scoring checklists for a variety of perioperative emergencies and to determine the reliability of scores produced by these checklists during continuous video review.

    Methods: A group of anesthesiologists, intensivists, and educators created a set of simulation grading checklists for the assessment of the following scenarios: severe anaphylaxis, cerebrovascular accident, hyperkalemic arrest, malignant hyperthermia, and acute coronary syndrome. Checklist items were coded as critical or noncritical. Nonexpert raters evaluated 10 simulation videos in a random order, with each video being graded 4 times. A group of faculty experts also graded the videos to create a reference standard to which nonexpert ratings were compared. P < 0.05 was considered significant.

    Results: Team leaders in the simulation videos were scored by the expert panel as having performed 56.5% of all items on the checklist (range, 43.8%-84.0%) and 67.2% of the critical items (range, 30.0%-100%). Nonexpert raters agreed with the expert assessment 89.6% of the time (95% confidence interval, 87.2%-91.6%). No learning curve development was found with repetitive video assessment or checklist use. The κ values comparing nonexpert rater assessments to the reference standard averaged 0.76 (95% confidence interval, 0.71-0.81).

    Conclusions: The findings indicate that the grading checklists described are valid and reliable and could be used in perioperative crisis management assessment.
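The agreement statistic reported above (κ comparing nonexpert raters to the expert reference standard) is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. The sketch below shows how such a value is computed; the item-level ratings are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of Cohen's kappa for two raters scoring binary
# checklist items (1 = performed, 0 = not performed).
# The rating lists below are hypothetical, not taken from the study.

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal frequencies,
    # summed over all categories.
    categories = set(rater_a) | set(rater_b)
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of 10 checklist items by a nonexpert rater
# and by the expert reference standard.
nonexpert = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
reference = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]
print(round(cohens_kappa(nonexpert, reference), 2))  # → 0.52
```

Here raw agreement is 80%, but kappa is lower (0.52) because both raters mark most items "performed," making some agreement expected by chance; the study's average κ of 0.76 is conventionally read as substantial agreement.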
