• European radiology · Mar 2020

    Deep learning for fully automated tumor segmentation and extraction of magnetic resonance radiomics features in cervical cancer.

    • Yu-Chun Lin, Chia-Hung Lin, Hsin-Ying Lu, Hsin-Ju Chiang, Ho-Kai Wang, Yu-Ting Huang, Shu-Hang Ng, Ji-Hong Hong, Tzu-Chen Yen, Chyong-Huey Lai, and Gigin Lin.
    • Department of Medical Imaging and Intervention, Chang Gung Memorial Hospital at Linkou, 5 Fuhsing St., Guishan, Taoyuan, Taiwan, 33382.
    • Eur Radiol. 2020 Mar 1; 30 (3): 1297-1305.

Objective
    To develop and evaluate the performance of U-Net for fully automated localization and segmentation of cervical tumors in magnetic resonance (MR) images, and to assess the robustness of extracting apparent diffusion coefficient (ADC) radiomics features.

    Methods
    This retrospective study analyzed MR images from 169 patients with stage IB-IVA cervical cancer; diffusion-weighted (DW) images from 144 patients were used for training, and the remaining 25 patients were used for testing. A U-Net convolutional network was developed to perform automated tumor segmentation, with the manually delineated tumor region serving as the ground truth for comparison. Segmentation performance was assessed for various combinations of input sources for training. ADC radiomics features were extracted and assessed using Pearson correlation, and the reproducibility of training was also evaluated.

    Results
    Combining b0, b1000, and ADC images as a triple-channel input exhibited the highest learning efficacy in the training phase and the highest accuracy on the testing dataset, with a Dice coefficient of 0.82, sensitivity of 0.89, and a positive predictive value of 0.92. First-order ADC radiomics parameters were significantly correlated between the manually contoured and fully automated segmentation methods (p < 0.05). Reproducibility between the first and second training iterations was high for the first-order radiomics parameters (intraclass correlation coefficient = 0.70-0.99).

    Conclusion
    U-Net-based deep learning can perform accurate localization and segmentation of cervical cancer in DW MR images. First-order radiomics features extracted from the whole tumor volume demonstrate potential robustness for longitudinal monitoring of tumor responses in broad clinical settings.
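The reported accuracy metrics (Dice coefficient, sensitivity, positive predictive value) are standard overlap measures between a predicted and a ground-truth binary mask. The study does not publish its evaluation code; the sketch below is a generic NumPy illustration of how these three metrics are computed from two binary masks, with the toy arrays being hypothetical.

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Dice coefficient, sensitivity, and positive predictive value
    for two binary masks of the same shape."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()        # true-positive voxels
    dice = 2.0 * tp / (pred.sum() + truth.sum())  # 2|A∩B| / (|A|+|B|)
    sensitivity = tp / truth.sum()                # TP / (TP + FN)
    ppv = tp / pred.sum()                         # TP / (TP + FP)
    return dice, sensitivity, ppv

# Toy 1-D masks for illustration (real masks would be 2-D or 3-D volumes):
pred = np.array([1, 1, 0, 1])
truth = np.array([1, 1, 1, 0])
dice, sens, ppv = segmentation_metrics(pred, truth)
```

With two of three voxels overlapping in each mask, all three metrics evaluate to 2/3 here; the paper's testing-set values (0.82 / 0.89 / 0.92) were obtained the same way over whole tumor volumes.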
Key Points
    • U-Net-based deep learning can perform accurate fully automated localization and segmentation of cervical cancer in diffusion-weighted MR images.
    • Combining b0, b1000, and apparent diffusion coefficient (ADC) images exhibited the highest accuracy in fully automated localization.
    • First-order radiomics feature extraction from the whole tumor volume was robust and could thus potentially be used for longitudinal monitoring of treatment responses.
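The best-performing configuration fed the b0, b1000, and ADC images to the network as a single triple-channel input. A minimal sketch of that input construction, assuming co-registered 2-D slices and a channels-last convention (array sizes here are hypothetical; the paper's actual preprocessing pipeline is not published):

```python
import numpy as np

# Hypothetical co-registered 128x128 DW-MRI slices; real data would be
# the b0 and b1000 diffusion-weighted images and the derived ADC map.
h, w = 128, 128
rng = np.random.default_rng(0)
b0 = rng.random((h, w), dtype=np.float32)
b1000 = rng.random((h, w), dtype=np.float32)
adc = rng.random((h, w), dtype=np.float32)

# Stack along a trailing channel axis to form the triple-channel input,
# analogous to an RGB image fed to a standard U-Net.
x = np.stack([b0, b1000, adc], axis=-1)   # shape (128, 128, 3)
```

Treating the three sequences as channels lets an off-the-shelf U-Net exploit their complementary contrast without any architectural change.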
