Investigative Radiology · Jun 2021
Deep Learning-Based Automated Abdominal Organ Segmentation in the UK Biobank and German National Cohort Magnetic Resonance Imaging Studies.
- Turkay Kart, Marc Fischer, Thomas Küstner, Tobias Hepp, Fabian Bamberg, Stefan Winzeck, Ben Glocker, Daniel Rueckert, and Sergios Gatidis.
- From the Biomedical Image Analysis Group, Department of Computing, Imperial College London, London, UK.
- Invest Radiol. 2021 Jun 1;56(6):401-408.
Purpose: The aims of this study were to train and evaluate deep learning models for automated segmentation of abdominal organs in whole-body magnetic resonance (MR) images from the UK Biobank (UKBB) and German National Cohort (GNC) MR imaging studies and to make these models available to the scientific community for analysis of these data sets.

Methods: A total of 200 T1-weighted MR image data sets of healthy volunteers each from UKBB and GNC (400 data sets in total) were available in this study. Liver, spleen, left and right kidney, and pancreas were segmented manually on all 400 data sets, providing labeled ground truth data for training of a previously described U-Net-based deep learning framework for automated medical image segmentation (nnU-Net). The trained models were tested on all data sets using a 4-fold cross-validation scheme. Qualitative analysis of automated segmentation results was performed visually; performance metrics between automated and manual segmentation results were computed for quantitative analysis. In addition, interobserver segmentation variability between 2 human readers was assessed on a subset of the data.

Results: Automated abdominal organ segmentation was performed with high qualitative and quantitative accuracy on UKBB and GNC data. In more than 90% of data sets, no or only minor visually detectable qualitative segmentation errors occurred. Mean Dice scores of automated segmentations compared with manual reference segmentations were well above 0.9 for the liver, spleen, and kidneys on UKBB and GNC data and around 0.82 and 0.89 for the pancreas on UKBB and GNC data, respectively. Mean average symmetric surface distance was between 0.3 and 1.5 mm for the liver, spleen, and kidneys and between 2 and 2.2 mm for pancreas segmentation. The quantitative accuracy of automated segmentation was comparable with the agreement between 2 human readers for all organs on UKBB and GNC data.

Conclusion: Automated segmentation of abdominal organs is possible with high qualitative and quantitative accuracy on whole-body MR imaging data acquired as part of UKBB and GNC. The results obtained and deep learning models trained in this study can be used as a foundation for automated analysis of thousands of MR data sets of UKBB and GNC and thus contribute to tackling topical and original scientific questions.

Copyright © 2021 Wolters Kluwer Health, Inc. All rights reserved.
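The quantitative evaluation reported above rests on two standard segmentation metrics: the Dice coefficient and the average symmetric surface distance (ASSD). The sketch below is a minimal, illustrative implementation of both for a pair of binary 3D organ masks using NumPy and SciPy; it is not the authors' evaluation code, and the function names and the default voxel spacing are assumptions made here for illustration only.

```python
# Minimal sketch (not the study's code) of the two metrics named in the abstract:
# Dice overlap and average symmetric surface distance (ASSD) in millimeters,
# computed between a predicted and a manual binary organ mask.
import numpy as np
from scipy import ndimage


def dice_score(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    denom = pred.sum() + ref.sum()
    return 2.0 * np.logical_and(pred, ref).sum() / denom if denom else 1.0


def assd(pred: np.ndarray, ref: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """Average symmetric surface distance in mm; spacing is the voxel size."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    # Surface voxels: the mask minus its morphological erosion.
    surf_pred = pred & ~ndimage.binary_erosion(pred)
    surf_ref = ref & ~ndimage.binary_erosion(ref)
    # Distance from every voxel to the nearest surface voxel of the other mask.
    dt_ref = ndimage.distance_transform_edt(~surf_ref, sampling=spacing)
    dt_pred = ndimage.distance_transform_edt(~surf_pred, sampling=spacing)
    d_pred_to_ref = dt_ref[surf_pred]
    d_ref_to_pred = dt_pred[surf_ref]
    return (d_pred_to_ref.sum() + d_ref_to_pred.sum()) / (
        d_pred_to_ref.size + d_ref_to_pred.size
    )
```

In this reading, a per-organ comparison of an automated and a manual mask would call `dice_score` and `assd` once per organ label, with `spacing` taken from the MR image header so that ASSD is reported in millimeters, matching the 0.3 to 2.2 mm range given in the abstract.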