• J Thorac Oncol · Jan 2011

    Review

    Measures of interrater agreement.

    • Jayawant N Mandrekar.
    • Department of Health Sciences Research, Mayo Clinic, Rochester, Minnesota 55905, USA. mandrekar.jay@mayo.edu
    • J Thorac Oncol. 2011 Jan 1;6(1):6-7.

    Abstract

    The kappa statistic is used to assess agreement between two or more raters when the measurement scale is categorical. In this short summary, we discuss and interpret the key features of the kappa statistic, the impact of prevalence on the kappa statistic, and its utility in clinical research. We also introduce the weighted kappa for ordinal outcomes and the intraclass correlation for assessing agreement when the data are measured on a continuous scale.
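    As a concrete illustration of the ideas in the abstract: the unweighted kappa is computed as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance, and the weighted kappa extends this to ordinal scales by penalizing larger disagreements more heavily. The following is a minimal sketch, not part of the review itself, using scikit-learn's cohen_kappa_score on made-up ratings; the rater data and category names are purely illustrative assumptions.

    ```python
    # Minimal sketch: Cohen's kappa for two raters on a categorical scale.
    # Ratings below are invented for illustration, not taken from the review.
    from sklearn.metrics import cohen_kappa_score

    rater_a = ["benign", "malignant", "benign", "benign", "malignant", "benign"]
    rater_b = ["benign", "malignant", "malignant", "benign", "malignant", "benign"]

    # Unweighted kappa: chance-corrected agreement for nominal categories.
    print(cohen_kappa_score(rater_a, rater_b))

    # Weighted kappa for an ordinal outcome (hypothetical grades 1-3):
    # quadratic weights penalize larger disagreements more heavily.
    grades_a = [1, 2, 2, 3, 1, 3]
    grades_b = [1, 2, 3, 3, 1, 2]
    print(cohen_kappa_score(grades_a, grades_b, weights="quadratic"))
    ```

    For data on a continuous scale, the review points instead to the intraclass correlation coefficient rather than kappa.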
