• Emerg Med J · Aug 2019

    Man versus machine: comparison of naked-eye estimation and quantified capillary refill.

    • Rani Toll John, Joakim Henricson, Chris D Anderson, and Daniel Björk Wilhelms (http://orcid.org/0000-0001-6347-3970). Department of Emergency Medicine, Local Health Care Services in Central Östergötland, Region Östergötland, Sweden.
    • Division of Cell Biology, Department of Clinical and Experimental Medicine, Faculty of Health Sciences, Linköping University, Linköping, Östergötland, Sweden.
    • Emerg Med J. 2019 Aug 1; 36 (8): 465-471.

    Background: Capillary refill (CR) time is traditionally assessed by 'naked-eye' inspection of the return of a tissue to its original colour after blanching pressure. Few studies have addressed intra-observer reliability or used objective quantification techniques to assess the time to return of original colour. This study compares naked-eye assessment with quantified CR (qCR) time using polarisation spectroscopy and examines intra-observer and interobserver agreement in naked-eye assessment.

    Methods: A film of 18 CR tests (shown in a fixed, randomised order) performed in healthy adults was assessed by a convenience sample of 14 doctors, 15 nurses and 19 secretaries (Department of Emergency Medicine, Linköping University, September to November 2017), who were asked to estimate the time to return of colour and to characterise it as 'fast', 'normal' or 'slow'. The qCR times and the corresponding naked-eye estimates were compared using the Kruskal-Wallis test. Three videos were shown twice, without the observers' knowledge, to measure intra-observer repeatability. Intra-observer categorical assessments were compared using Cohen's kappa. Interobserver repeatability was measured and depicted with multiple-observer Bland-Altman plotting. Differences in naked-eye estimation between professions were analysed using ANOVA.

    Results: Naked-eye-assessed CR time and qCR time differed substantially, and agreement between the categorical assessments (naked-eye assessment vs qCR classification) was poor (Cohen's kappa 0.27). Bland-Altman intra-observer repeatability ranged from 6% to 60%. Interobserver agreement was low, as shown by Bland-Altman plotting, with a 95% limit of agreement with the mean of ±1.98 s for doctors, ±1.6 s for nurses and ±1.75 s for secretaries. The difference in CR time estimation (in seconds) between professions was not significant.

    Conclusions: Our study suggests that naked-eye-assessed CR time shows poor reproducibility, even by the same observer, and differs from an objective measure of CR time.

    © Author(s) (or their employer(s)) 2019. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.
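    For readers who want to reproduce the agreement statistics described above on their own data, the sketch below (Python, assuming numpy and scikit-learn are available) shows how a Cohen's kappa for repeated categorical ratings and Bland-Altman 95% limits of agreement could be computed. The arrays and values are hypothetical illustrations, not data from the study, and the limits-of-agreement calculation is the simple two-measurement form rather than the multiple-observer variant used in the paper.

        # Illustrative sketch only: hypothetical ratings and timings, not study data.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        # Intra-observer categorical agreement: one observer's 'fast'/'normal'/'slow'
        # ratings for the three videos shown twice (hypothetical values).
        first_viewing = ["fast", "normal", "slow"]
        second_viewing = ["fast", "slow", "slow"]
        kappa = cohen_kappa_score(first_viewing, second_viewing)

        # Bland-Altman style limits of agreement for one observer's naked-eye
        # estimates versus quantified CR (qCR) times, in seconds (hypothetical).
        naked_eye = np.array([1.2, 2.0, 3.5, 1.8, 2.6])
        qcr = np.array([1.0, 2.4, 4.1, 1.5, 3.0])
        diff = naked_eye - qcr
        bias = diff.mean()                     # mean difference between methods
        loa = 1.96 * diff.std(ddof=1)          # half-width of the 95% limits of agreement
        print(f"kappa={kappa:.2f}, bias={bias:.2f} s, limits of agreement = ±{loa:.2f} s")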

