Observational Study
Interrater reliability of qualitative ultrasound assessment of gastric content in the third trimester of pregnancy.
- C Arzola, J Cubillos, A Perlas, K Downey, and J C A Carvalho.
- Department of Anesthesia and Pain Management, Mount Sinai Hospital, University of Toronto, 600 University Avenue, Room 19-104, Toronto, ON, Canada M5G 1X5 carzola@mtsinai.on.ca.
- Br J Anaesth. 2014 Dec 1;113(6):1018-23.
Background: Pulmonary aspiration of gastric contents in pregnant women undergoing general anaesthesia is one of the most feared complications in obstetric anaesthesia. Bedside gastric ultrasonography is a feasible imaging tool for assessing gastric content. The purpose of this study was to investigate the reliability of qualitative bedside assessment of gastric content performed by anaesthesiologists in third-trimester pregnant women.

Methods: Pregnant women (≥32 weeks gestational age) were randomized to undergo ultrasound (US) assessment of the stomach in a fasting state (>8 h), after ingestion of clear fluids only, or after ingestion of solid food. Three anaesthesiologists trained in gastric ultrasonography performed the assessments using a low-frequency curved-array US transducer (5-2 MHz). The primary outcome was the consistency of raters in diagnosing the correct status of the gastric content, which was used to determine the interrater reliability among the three anaesthesiologists. Secondary outcomes were the overall proportion of correct and incorrect diagnoses and the specific proportions of correct diagnosis across the three gastric content groups.

Results: We analysed 32 pregnant women. The interrater reliability displayed a kappa statistic of 0.74 (bias-corrected 95% CI: 0.68-0.84). The overall proportion of correct diagnosis was 87.5% (84 of 96). The odds of correct diagnosis for 'solid contents' were 16.7 times the odds for 'empty', and 14.3 times the odds for 'clear fluid'.

Conclusions: Our results show the consistency of the qualitative US assessment of gastric contents of pregnant women in the third trimester by anaesthesiologists. A kappa of 0.74 suggests substantial agreement in terms of interrater reliability for this diagnostic measurement.

Clinical Trial Registration: ClinicalTrials.gov identifier NCT01564030.
Notes
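The abstract reports interrater reliability as a kappa statistic across three raters and three gastric-content categories (empty, clear fluid, solid). The exact kappa variant and software are not stated in the abstract, so the sketch below is only an illustration of how a Fleiss-type multirater kappa could be computed on hypothetical ratings; it is not a reproduction of the authors' analysis (the bias-corrected confidence interval, for instance, points to a bootstrap step that is omitted here).

```python
# Illustrative only: a Fleiss-type multirater kappa for three raters classifying
# each stomach as 'empty', 'clear fluid', or 'solid'. The ratings below are
# hypothetical, not the study data, and the authors' exact method may differ.
import numpy as np

def fleiss_kappa(ratings, categories):
    """ratings: one list of labels per subject (one label per rater)."""
    n_subjects, n_raters = len(ratings), len(ratings[0])
    # counts[i, j] = number of raters who assigned subject i to category j
    counts = np.zeros((n_subjects, len(categories)))
    for i, subject in enumerate(ratings):
        for label in subject:
            counts[i, categories.index(label)] += 1
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)    # category proportions
    P_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), np.sum(p_j ** 2)             # observed vs. chance agreement
    return (P_bar - P_e) / (1 - P_e)

cats = ["empty", "clear fluid", "solid"]
example = [                      # 6 hypothetical subjects, 3 raters each
    ["empty", "empty", "empty"],
    ["clear fluid", "clear fluid", "empty"],
    ["solid", "solid", "solid"],
    ["clear fluid", "clear fluid", "clear fluid"],
    ["solid", "solid", "clear fluid"],
    ["empty", "empty", "empty"],
]
print(f"Fleiss' kappa = {fleiss_kappa(example, cats):.2f}")  # ~0.66 for this toy data
```

On the conventional Landis and Koch scale, kappa values between 0.61 and 0.80 are described as substantial agreement, which is how the authors characterise their observed value of 0.74.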