• Journal of Critical Care · Aug 2024

    Multicenter Study · Comparative Study

    Assessment of the comparative agreement between chest radiographs and CT scans in intensive care units.

    • Daniel Brooks, Stephen E Wright, Anna Beattie, Nadia McAllister, Niall H Anderson, Alistair I Roy, Philip Gonsalves, Bryan Yates, Sara Graziadio, Alasdair Mackie, John Davidson, Sandeep Vijaya Gopal, Robert Whittle, Asef Zahed, Lorna Barton, Mathew Elameer, John Tuckett, Rob Holmes, Alexandra Sutcliffe, Nuria Santamaria, Luke la Hausse de Lalouviere, Sanjay Gupta, Jeevan Subramaniam, Janaki A Pearson, Matthew Brandwood, Richard Burnham, Anthony J Rostron, and A John Simpson.
    • Translational and Clinical Research Institute, Newcastle University, Newcastle Upon Tyne NE2 4HH, UK; Emergency Department, John Hunter Hospital, New Lambton Heights, NSW 2305, Australia.
    • J Crit Care. 2024 Aug 1; 82: 154760.

    Purpose: Chest radiographs in critically ill patients can be difficult to interpret due to technical and clinical factors. We sought to determine the agreement between chest radiographs and CT scans, and the inter-observer variation in chest radiograph interpretation, in intensive care units (ICUs).

    Methods: Chest radiographs and corresponding thoracic computerised tomography (CT) scans (as reference standard) were collected from 45 ICU patients. All radiographs were analysed by 20 doctors (radiology consultants, radiology trainees, ICU consultants, ICU trainees) from 4 different centres, blinded to CT results. Sensitivity and specificity were determined for pleural effusion, lobar collapse and consolidation/atelectasis. Separately, Fleiss' kappa for multiple raters was used to determine inter-observer variation for chest radiographs.

    Results: The median sensitivity and specificity of chest radiographs for detecting abnormalities seen on CT scans were 43.2% and 85.9%, respectively. Diagnostic sensitivity for pleural effusion was significantly higher among radiology consultants, but no specialty/experience distinctions were observed for specificity. The median inter-observer kappa coefficient among assessors was 0.295 ("fair").

    Conclusions: Chest radiographs commonly miss important radiological features in critically ill patients. Inter-observer agreement in chest radiograph interpretation is only "fair". Consultant radiologists are least likely to miss thoracic radiological abnormalities. The consequences of misdiagnosis by chest radiographs remain to be determined.

    Copyright © 2024. Published by Elsevier Inc.
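    The Methods report using Fleiss' kappa to quantify agreement among the 20 readers. As a rough illustration of how that statistic is computed, here is a minimal Python sketch; the `fleiss_kappa` function and the example count matrix are illustrative assumptions, not code or data from the study.

```python
# Minimal sketch of Fleiss' kappa for multiple raters.
# Assumptions (not from the paper): ratings are arranged as an N x k matrix
# where counts[i, j] is the number of raters assigning subject i to category j,
# and every subject is rated by the same number of raters.
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts[0].sum()  # raters per subject (assumed constant)

    # Observed per-subject agreement P_i, averaged over subjects
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()

    # Chance agreement P_e from overall category proportions
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)
    p_e = np.square(p_j).sum()

    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 5 radiographs, 20 raters, categories = (abnormality present, absent)
example = np.array([
    [14, 6],
    [9, 11],
    [18, 2],
    [5, 15],
    [12, 8],
])
print(round(fleiss_kappa(example), 3))
```

    On the conventional Landis and Koch scale, kappa values between 0.21 and 0.40 are read as "fair" agreement, which is how the study's reported median coefficient of 0.295 is interpreted.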

