• The Lancet Oncology · Jan 2021

    Deep learning model for the prediction of microsatellite instability in colorectal cancer: a diagnostic study.

    • Rikiya Yamashita, Jin Long, Teri Longacre, Lan Peng, Gerald Berry, Brock Martin, John Higgins, Daniel L Rubin, and Jeanne Shen.
    • Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, CA, USA; Center for Artificial Intelligence in Medicine and Imaging, Stanford University, Stanford, CA, USA.
    • Lancet Oncol. 2021 Jan 1; 22 (1): 132-141.

    Background
    Detecting microsatellite instability (MSI) in colorectal cancer is crucial for clinical decision making, as it identifies patients with differential treatment response and prognosis. Universal MSI testing is recommended, but many patients remain untested. A critical need exists for broadly accessible, cost-efficient tools to aid patient selection for testing. Here, we investigate the potential of a deep learning-based system for automated MSI prediction directly from haematoxylin and eosin (H&E)-stained whole-slide images (WSIs).

    Methods
    Our deep learning model (MSINet) was developed using 100 H&E-stained WSIs (50 with microsatellite stability [MSS] and 50 with MSI) scanned at 40× magnification, each from a patient randomly selected in a class-balanced manner from the pool of 343 patients who underwent primary colorectal cancer resection at Stanford University Medical Center (Stanford, CA, USA; internal dataset) between Jan 1, 2015, and Dec 31, 2017. We internally validated the model on a holdout test set (15 H&E-stained WSIs from 15 patients; seven cases with MSS and eight with MSI) and externally validated the model on 484 H&E-stained WSIs (402 cases with MSS and 77 with MSI; 479 patients) from The Cancer Genome Atlas, containing WSIs scanned at 40× and 20× magnification. Performance was primarily evaluated using the sensitivity, specificity, negative predictive value (NPV), and area under the receiver operating characteristic curve (AUROC). We compared the model's performance with that of five gastrointestinal pathologists on a class-balanced, randomly selected subset of 40× magnification WSIs from the external dataset (20 with MSS and 20 with MSI).

    Findings
    The MSINet model achieved an AUROC of 0·931 (95% CI 0·771-1·000) on the holdout test set from the internal dataset and 0·779 (0·720-0·838) on the external dataset. On the external dataset, using a sensitivity-weighted operating point, the model achieved an NPV of 93·7% (95% CI 90·3-96·2), sensitivity of 76·0% (64·8-85·1), and specificity of 66·6% (61·8-71·2). In the reader experiment (40 cases), the model achieved an AUROC of 0·865 (95% CI 0·735-0·995), whereas the mean AUROC of the five pathologists was 0·605 (95% CI 0·453-0·757).

    Interpretation
    Our deep learning model exceeded the performance of experienced gastrointestinal pathologists at predicting MSI on H&E-stained WSIs. Within the current universal MSI testing paradigm, such a model might contribute value as an automated screening tool to triage patients for confirmatory testing, potentially reducing the number of tested patients and thereby resulting in substantial test-related labour and cost savings.

    Funding
    Stanford Cancer Institute and Stanford Departments of Pathology and Biomedical Data Science.

    Copyright © 2021 Elsevier Ltd. All rights reserved.
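
    To illustrate how the reported operating-point metrics relate to a model's output scores, the sketch below (plain Python with NumPy and scikit-learn, not the authors' MSINet code) picks a sensitivity-weighted threshold on a ROC curve and computes sensitivity, specificity, and NPV. The labels, scores, and 0.75 sensitivity target are placeholder assumptions for illustration only.

        # Illustrative sketch only (not the authors' MSINet implementation).
        # Shows how a sensitivity-weighted operating point can be chosen on a ROC
        # curve and how NPV, sensitivity, and specificity follow from it.
        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        # Placeholder data: y_true (1 = MSI, 0 = MSS), y_score (predicted MSI probability).
        rng = np.random.default_rng(0)
        y_true = rng.integers(0, 2, size=500)
        y_score = np.clip(0.3 * y_true + rng.normal(0.4, 0.25, size=500), 0.0, 1.0)

        print("AUROC:", roc_auc_score(y_true, y_score))

        # Sensitivity-weighted operating point: the largest threshold whose
        # true-positive rate meets an assumed sensitivity target of 0.75.
        fpr, tpr, thresholds = roc_curve(y_true, y_score)
        idx = int(np.argmax(tpr >= 0.75))   # first (largest-threshold) index reaching the target
        threshold = thresholds[idx]

        y_pred = (y_score >= threshold).astype(int)
        tp = int(np.sum((y_pred == 1) & (y_true == 1)))
        tn = int(np.sum((y_pred == 0) & (y_true == 0)))
        fp = int(np.sum((y_pred == 1) & (y_true == 0)))
        fn = int(np.sum((y_pred == 0) & (y_true == 1)))

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        npv = tn / (tn + fn)                # negative predictive value
        print(f"threshold={threshold:.3f}  sensitivity={sensitivity:.3f}  "
              f"specificity={specificity:.3f}  NPV={npv:.3f}")

    In practice, the threshold would be chosen on a validation set rather than the test data, and confidence intervals such as those reported above would typically come from bootstrapping.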
