• Spine · Apr 2023

    Observational Study

    External Validation of SpineNet, an Open-source Deep Learning Model for Grading Lumbar Disc Degeneration MRI Features, Using the Northern Finland Birth Cohort 1966.

    • Terence P McSweeney, Aleksei Tiulpin, Simo Saarakkala, Jaakko Niinimäki, Rhydian Windsor, Amir Jamaludin, Timor Kadir, Jaro Karppinen, and Juhani Määttä.
    • Research Unit of Health Sciences and Technology, University of Oulu.
    • Spine. 2023 Apr 1; 48(7): 484-491.

    Study Design: Retrospective observational study to externally validate a deep learning image classification model.

    Objective: Deep learning models such as SpineNet offer the possibility of automating disc degeneration (DD) classification from magnetic resonance imaging (MRI). External validation is an essential step in their development. The aim of this study was to externally validate SpineNet predictions for DD using the Pfirrmann classification and Modic changes (MCs) on data from the Northern Finland Birth Cohort 1966 (NFBC1966).

    Summary of Data: We validated SpineNet using data from 1331 NFBC1966 participants for whom both lumbar spine MRI data and consensus DD gradings were available.

    Materials and Methods: SpineNet returned Pfirrmann grades and MC presence from T2-weighted sagittal lumbar MRI sequences from NFBC1966, a data set geographically and temporally separated from SpineNet's training data. A range of agreement and reliability metrics was used to compare its predictions with those of expert radiologists. Subsets of data that match the SpineNet training data more closely were also tested.

    Results: Balanced accuracy was 78% (77%-79%) for DD and 86% (85%-86%) for MC. Interrater reliability for Pfirrmann grading was Lin concordance correlation coefficient = 0.86 (0.85-0.87) and Cohen κ = 0.68 (0.67-0.69). In a low back pain subset, these reliability metrics remained largely unchanged. In total, 20.83% of disks were rated differently by SpineNet than by the human raters, but only 0.85% of disks differed by more than one grade. Interrater reliability for MC detection was κ = 0.74 (0.72-0.75); in the low back pain subset, this metric was almost unchanged at κ = 0.76 (0.73-0.79).

    Conclusions: In this study, SpineNet was benchmarked against expert human raters in a research setting. It matched human reliability and demonstrated robust performance despite the multiple challenges facing model generalizability.

    Copyright © 2023 The Author(s). Published by Wolters Kluwer Health, Inc.
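    The two reliability metrics reported above, Cohen's κ and Lin's concordance correlation coefficient (CCC), can be computed from paired ratings. The sketch below is illustrative only and is not from the paper; the rater and model grade lists are made-up example data, not NFBC1966 values.

    ```python
    # Hedged sketch: Cohen's kappa and Lin's CCC for two raters assigning
    # ordinal grades (e.g., Pfirrmann grades 1-5). Example data is invented.
    from collections import Counter

    def cohen_kappa(a, b):
        """Cohen's kappa for two equal-length lists of categorical ratings."""
        n = len(a)
        po = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
        ca, cb = Counter(a), Counter(b)
        pe = sum(ca[k] * cb.get(k, 0) for k in ca) / n**2   # chance agreement
        return (po - pe) / (1 - pe)

    def lin_ccc(a, b):
        """Lin's concordance correlation coefficient for numeric ratings."""
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        va = sum((x - ma) ** 2 for x in a) / n
        vb = sum((y - mb) ** 2 for y in b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
        return 2 * cov / (va + vb + (ma - mb) ** 2)

    rater = [1, 2, 2, 3, 4, 5, 3, 2]   # hypothetical radiologist grades
    model = [1, 2, 3, 3, 4, 4, 3, 2]   # hypothetical model grades

    print(f"kappa = {cohen_kappa(rater, model):.3f}")
    print(f"CCC   = {lin_ccc(rater, model):.3f}")
    ```

    Note that κ treats grades as unordered categories, whereas CCC treats them as numeric, which is why the paper reports both for an ordinal scale like Pfirrmann grading.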

