• Crit Care Resusc · Mar 2016

    Validation of a classification system for causes of death in critical care: an assessment of inter-rater reliability.

    • Elliott Ridgeon, Rinaldo Bellomo, John Myburgh, Manoj Saxena, Mark Weatherall, Rahi Jahan, Dilshan Arawwawala, Stephanie Bell, Warwick Butt, Julie Camsooksai, Coralie Carle, Andrew Cheng, Emanuel Cirstea, Jeremy Cohen, Julius Cranshaw, Anthony Delaney, Glenn Eastwood, Suzanne Eliott, Uwe Franke, Dashiell Gantner, Cameron Green, Richard Howard-Griffin, Deborah Inskip, Edward Litton, Christopher MacIsaac, Amanda McCairn, Tushar Mahambrey, Parvez Moondi, Lynette Newby, Stephanie O'Connor, Claire Pegg, Alan Pope, Henrik Reschreiter, Brent Richards, Megan Robertson, Helen Rodgers, Yahya Shehabi, Ian Smith, Julie Smith, Neil Smith, Anna Tilsley, Christina Whitehead, Emma Willett, Katherine Wong, Claudia Woodford, Stephen Wright, and Paul Young.
    • Medical Research Institute of New Zealand, Wellington, New Zealand. paul.young@ccdhb.org.nz.
    • Crit Care Resusc. 2016 Mar 1;18(1):50-4.

    Objective: Trials in critical care have previously used unvalidated systems to classify cause of death. We aimed to provide initial validation of a method to classify cause of death in intensive care unit patients.

    Design, Setting and Participants: One hundred case scenarios of patients who died in an ICU were presented online to raters, who were asked to select a proximate and an underlying cause of death for each, using the ICU Deaths Classification and Reason (ICU-DECLARE) system. We evaluated two methods of categorising proximate cause of death (designated Lists A and B) and one method of categorising underlying cause of death. Raters were ICU specialists and research coordinators from Australia, New Zealand and the United Kingdom.

    Main Outcome Measures: Inter-rater reliability, as measured by the Fleiss multirater kappa, and the median proportion of raters choosing the most likely diagnosis (defined as the most popular classification choice in each case).

    Results: Across all raters and cases, kappa was 0.54 (95% CI, 0.49-0.60) for proximate cause of death List A and 0.58 (95% CI, 0.53-0.63) for proximate cause of death List B. For the underlying cause of death, kappa was 0.48 (95% CI, 0.44-0.53). The median proportion of raters choosing the most likely diagnosis was 77.5% (interquartile range [IQR], 60.0%-93.8%) for proximate cause of death List A and 82.5% (IQR, 60.0%-92.5%) for List B. The median proportion choosing the most likely diagnosis for underlying cause was 65.0% (IQR, 50.0%-81.3%). Kappa and median agreement were similar between countries. ICU specialists showed higher kappa and median agreement than research coordinators.

    Conclusions: The ICU-DECLARE system allowed ICU doctors to classify the proximate cause of death of patients who died in the ICU with substantial reliability.
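
    The two reliability measures reported above (the Fleiss multirater kappa and the median proportion of raters choosing the most popular classification) can both be computed from a cases-by-raters table of category choices. The sketch below is a minimal, hypothetical Python illustration using statsmodels; the ratings matrix and variable names are assumptions for demonstration, not the study's data or analysis code.

    ```python
    # Hypothetical sketch: Fleiss' kappa and median "most popular choice" agreement
    # from a cases x raters matrix of category codes (illustrative data only).
    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # One row per case scenario, one column per rater; values are integer codes
    # for the selected cause-of-death category.
    ratings = np.array([
        [0, 0, 1, 0],
        [2, 2, 2, 2],
        [1, 0, 1, 1],
    ])

    # Convert raw ratings into a cases x categories table of rater counts.
    table, _ = aggregate_raters(ratings)

    # Fleiss' multirater kappa across all cases.
    kappa = fleiss_kappa(table, method="fleiss")

    # Proportion of raters choosing the most popular category in each case,
    # then the median and IQR across cases.
    prop_most_popular = table.max(axis=1) / ratings.shape[1]
    median_agreement = np.median(prop_most_popular)
    iqr = np.percentile(prop_most_popular, [25, 75])

    print(f"Fleiss kappa: {kappa:.2f}")
    print(f"Median agreement: {median_agreement:.1%} (IQR {iqr[0]:.1%}-{iqr[1]:.1%})")
    ```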
