• Spine · Feb 2005

    Interobserver and intraobserver reliability in the load sharing classification of the assessment of thoracolumbar burst fractures.

    • Li-Yang Dai and Wen-Jie Jin.
    • Department of Orthopaedic Surgery, Xinhua Hospital, Shanghai Second Medical University, Shanghai, China. lydai@etang.com
    • Spine. 2005 Feb 1;30(3):354-8.

    Study Design: The Load Sharing Classification of spinal fractures was evaluated by 5 observers on 2 occasions.

    Objective: To evaluate the interobserver and intraobserver reliability of the Load Sharing Classification of spinal fractures in the assessment of thoracolumbar burst fractures.

    Summary of Background Data: The Load Sharing Classification of spinal fractures provides a basis for the choice of operative approaches, but the reliability of this classification system has not been established.

    Methods: The radiographic and computed tomography scan images of 45 consecutive patients with thoracolumbar burst fractures were reviewed by 5 observers on 2 different occasions 3 months apart. Interobserver reliability was assessed by comparison of the fracture classifications determined by the 5 observers. Intraobserver reliability was evaluated by comparison of the classifications determined by each observer on the first and second sessions. Ten paired interobserver and 5 intraobserver comparisons were then analyzed with use of kappa statistics.

    Results: All 5 observers agreed on the final classification for 58% and 73% of the fractures on the first and second assessments, respectively. The average kappa coefficient for the 10 paired comparisons among the 5 observers was 0.79 (range, 0.73-0.89) for the first assessment and 0.84 (range, 0.81-0.95) for the second assessment. Interobserver agreement improved when the 3 components of the classification system were analyzed separately, reaching almost perfect interobserver reliability, with average kappa values of 0.90 (range, 0.82-0.97) for the first assessment and 0.92 (range, 0.83-1.00) for the second assessment. The kappa values for the 5 intraobserver comparisons ranged from 0.73 to 0.87 (average, 0.78), expressing at least substantial agreement; 2 observers showed almost perfect intraobserver reliability. For the 3 components of the classification system, all observers reached almost perfect intraobserver agreement, with kappa values of 0.83 to 0.97 (average, 0.89).

    Conclusions: Kappa statistics showed high levels of agreement when the Load Sharing Classification was used to assess thoracolumbar burst fractures. This system can be applied with excellent reliability.
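
    A note on the statistic: the kappa coefficient reported above corrects raw percent agreement for agreement expected by chance, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e the proportion expected from each rater's marginal category frequencies. The qualitative labels in the Results ("substantial", "almost perfect") follow the conventional Landis and Koch benchmarks (0.61-0.80 and 0.81-1.00, respectively). Below is a minimal Python sketch of Cohen's kappa for one paired comparison; the observer scores are hypothetical, for illustration only, and are not the study's data.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """Cohen's kappa for two raters' categorical ratings of the same cases."""
            assert len(rater_a) == len(rater_b) and rater_a
            n = len(rater_a)
            # Observed agreement: fraction of cases on which the two raters concur.
            p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            # Chance-expected agreement from each rater's marginal category frequencies.
            freq_a, freq_b = Counter(rater_a), Counter(rater_b)
            p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
            return (p_o - p_e) / (1 - p_e)

        # Hypothetical Load Sharing scores (3-9 points) assigned to 10 fractures by
        # two observers; kappa here works out to 0.75, i.e., substantial agreement.
        obs1 = [4, 7, 6, 5, 8, 4, 6, 7, 5, 9]
        obs2 = [4, 7, 5, 5, 8, 4, 6, 7, 6, 9]
        print(f"kappa = {cohens_kappa(obs1, obs2):.2f}")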
