- Roy Phitayakorn, Rebecca Minehart, May C M Pian-Smith, Maureen W Hemingway, Tanya Milosh-Zinkus, Danika Oriol-Morway, and Emil Petrusa.
- Department of Surgery, Massachusetts General Hospital, Boston, Massachusetts; MGH Learning Laboratory, Massachusetts General Hospital, Boston, Massachusetts. Electronic address: rphitayakorn@mgh.harvard.edu.
- J. Surg. Res. 2014 Jul 1;190(1):22-8.
Background

High-quality teamwork among operating room (OR) professionals is key to efficient and safe practice. Quantifying teamwork facilitates feedback, assessment, and improvement. Several valid and reliable instruments are available for assessing separate OR disciplines and teams. We sought to determine the most feasible approach for routine documentation of teamwork in in situ OR simulations. We compared rater agreement, hypothetical training costs, and feasibility ratings from five clinicians and two nonclinicians using instruments for the assessment of separate OR groups and teams.

Materials and Methods

Five teams of anesthesia or surgery residents and OR nurses (RNs) or surgical technicians were videotaped in simulations of an epigastric hernia repair in which the patient develops malignant hyperthermia. Two anesthesiologists, one OR clinical RN specialist, one educational psychologist, one simulation specialist, and one general surgeon discussed and then independently completed Anesthesiologists' Non-Technical Skills (ANTS), Non-Technical Skills for Surgeons (NOTSS), Scrub Practitioners' List of Intraoperative Non-Technical Skills (SPLINTS), and Observational Teamwork Assessment for Surgery (OTAS) forms to rate the nontechnical performance of anesthesiologists, surgeons, nurses, technicians, and the whole team.

Results

Intraclass correlations of agreement ranged from 0.17 to 0.85. Clinicians' agreement did not differ from nonclinicians'. Published rater training took 4 h for ANTS and SPLINTS, 2.5 h for NOTSS, and 15.5 h for OTAS. Estimated costs to train one rater to use all instruments ranged from $442 for a simulation specialist to $6006 for a general surgeon.

Conclusions

Additional training is needed to achieve higher levels of agreement; however, costs may be prohibitive. The most cost-effective model for real-time OR teamwork assessment may be to use a simulation technician combined with one clinical rater to allow complete documentation of all participants.

Copyright © 2014 Elsevier Inc. All rights reserved.