• Chinese Medical Journal · Dec 2024

    ResNet-Vision Transformer based MRI-endoscopy fusion model for predicting treatment response to neoadjuvant chemoradiotherapy in locally advanced rectal cancer: A multicenter study.

    • Junhao Zhang, Ruiqing Liu, Di Hao, Guangye Tian, Shiwei Zhang, Sen Zhang, Yitong Zang, Kai Pang, Xuhua Hu, Keyu Ren, Mingjuan Cui, Shuhao Liu, Jinhui Wu, Quan Wang, Bo Feng, Weidong Tong, Yingchi Yang, Guiying Wang, and Yun Lu.
    • Department of Gastrointestinal Surgery, The Affiliated Hospital of Qingdao University, Qingdao, Shandong 266000, China.
    • Chin. Med. J. 2024 Dec 10.

Background: Neoadjuvant chemoradiotherapy followed by radical surgery is common practice for patients with locally advanced rectal cancer, but the response rate varies among patients. This study aimed to develop a ResNet-Vision Transformer based magnetic resonance imaging (MRI)-endoscopy fusion model to precisely predict treatment response and support personalized treatment.

Methods: In this multicenter study, 366 eligible patients who had undergone neoadjuvant chemoradiotherapy followed by radical surgery at eight Chinese tertiary hospitals between January 2017 and June 2024 were recruited, yielding 2928 pretreatment colonic endoscopic images and 366 pelvic MRI scans. An MRI-endoscopy fusion model was constructed from a ResNet backbone and a Transformer network using pretreatment MRI and endoscopic images. Treatment response was classified as good response or non-good response based on the tumor regression grade. The DeLong test and the Hanley-McNeil test were used to compare prediction performance among different models and among different subgroups, respectively. The predictive performance of the MRI-endoscopy fusion model was comprehensively validated in the test sets and further compared with that of the single-modal MRI model and the single-modal endoscopy model.

Results: The MRI-endoscopy fusion model demonstrated favorable prediction performance. In the internal validation set, the area under the curve (AUC) and accuracy were 0.852 (95% confidence interval [CI]: 0.744-0.940) and 0.737 (95% CI: 0.712-0.844), respectively. In the external test set, the AUC and accuracy reached 0.769 (95% CI: 0.678-0.861) and 0.729 (95% CI: 0.628-0.821), respectively. In addition, the MRI-endoscopy fusion model outperformed the single-modal MRI model (AUC: 0.692 [95% CI: 0.609-0.783], accuracy: 0.659 [95% CI: 0.565-0.775]) and the single-modal endoscopy model (AUC: 0.720 [95% CI: 0.617-0.823], accuracy: 0.713 [95% CI: 0.612-0.809]) in the external test set.

Conclusion: The MRI-endoscopy fusion model based on ResNet-Vision Transformer achieved favorable performance in predicting treatment response to neoadjuvant chemoradiotherapy and holds considerable potential for enabling personalized treatment regimens for patients with locally advanced rectal cancer.

Copyright © 2024 The Chinese Medical Association, produced by Wolters Kluwer, Inc. under the CC-BY-NC-ND license.
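The AUC values reported above summarize how well the model's predicted probabilities rank good responders above non-good responders, while accuracy reflects thresholded class predictions. As an illustration only (not the authors' code, and using made-up toy labels and scores), AUC can be computed directly from its probabilistic interpretation via the Mann-Whitney U statistic:

```python
def auc_mann_whitney(labels, scores):
    """AUC as the probability that a randomly chosen positive case
    receives a higher score than a randomly chosen negative case
    (ties counted as 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def accuracy(labels, scores, threshold=0.5):
    """Fraction of cases whose thresholded prediction matches the label."""
    preds = [1 if s >= threshold else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Toy example: 1 = good response, 0 = non-good response
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.6, 0.3, 0.2]
print(auc_mann_whitney(labels, scores))  # 0.888...
print(accuracy(labels, scores))          # 0.666...
```

Note that AUC is threshold-free, which is why a model can show a higher AUC than accuracy (as in the internal validation set above); the DeLong test mentioned in the Methods is the standard way to compare two such AUCs on the same cases.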
