Zhonghua Shao Shang Za Zhi · Nov 2020
[Establishment and test results of an artificial intelligence burn depth recognition model based on convolutional neural network].
- Z Y He, Y Wang, P H Zhang, K Zuo, P F Liang, J Z Zeng, S T Zhou, L Guo, M T Huang, and X Cui.
- Department of Burns and Plastic Surgery, Xiangya Hospital, Central South University, Changsha 410008, China.
- Zhonghua Shao Shang Za Zhi. 2020 Nov 20; 36 (11): 1070-1074.
Abstract

Objective: To establish an artificial intelligence burn depth recognition model based on a convolutional neural network and to test its effectiveness.

Methods: In this diagnostic test evaluation study, 484 wound photos of 221 burn patients admitted to Xiangya Hospital of Central South University (hereinafter referred to as the authors' unit) from January 2010 to December 2019, taken within 48 hours after injury and meeting the inclusion criteria, were collected and numbered randomly. The target wounds were delineated with image viewing software, and burn depth was judged by 3 attending doctors, each with more than 5 years of professional experience in the Department of Burns and Plastic Surgery of the authors' unit. After the superficial partial-thickness, deep partial-thickness, and full-thickness burns were marked in different colors, the burn wounds were cropped into 224×224-pixel patches, yielding 5 637 complete wound images. An image data generator was used to expand the images of each burn depth to 10 000, after which the images of each burn depth were divided into training, validation, and test sets at a ratio of 7.0∶1.5∶1.5. Under Keras 2.2.4 (Python 2.8.0), the residual network ResNet-50 was used to establish the artificial intelligence burn depth recognition model. The training set was input for training, and the validation set was used to adjust and optimize the model. The accuracy of the established model in judging each burn depth was tested on the test set, and precision, recall, and F1_score were calculated. The test results were visualized as a two-dimensional tSNE cloud chart through the dimensionality reduction tool tSNE, and the distribution of the burn depths was observed. According to the sensitivity and specificity of the model in recognizing the 3 burn depths, the corresponding receiver operating characteristic (ROC) curves were drawn, and the areas under the ROC curves were calculated.

Results: (1) On the test set, the precisions of the artificial intelligence burn depth recognition model for superficial partial-thickness, deep partial-thickness, and full-thickness burns were 84% (1 095/1 301), 81% (1 215/1 499), and 82% (1 395/1 700), respectively; the recalls were 73% (1 095/1 500), 81% (1 215/1 500), and 93% (1 395/1 500), respectively; and the F1_scores were 0.78, 0.81, and 0.87, respectively. (2) The tSNE cloud chart showed little overlap among the burn depths in the model's test-set results; the overlap between superficial partial-thickness and deep partial-thickness burns and that between deep partial-thickness and full-thickness burns were relatively greater, while the overlap between superficial partial-thickness and full-thickness burns was relatively smaller. (3) The area under the ROC curve for each of the 3 burn depths recognized by the model was ≥0.94.

Conclusions: The artificial intelligence burn depth recognition model established with the ResNet-50 network can identify burn depth in early wound photos of burn patients fairly accurately, especially superficial partial-thickness and full-thickness burns. It is expected to be used clinically to assist the diagnosis of burn depth and improve diagnostic accuracy.
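For readers who want to see what the described pipeline looks like in code, below is a minimal sketch (not the authors' published code) of a 3-class burn depth classifier built on ResNet-50 with Keras, following the Methods: 224×224 wound patches, augmentation via an image data generator, and separate training and validation sets. The directory layout, augmentation transforms, optimizer, and epoch count are assumptions for illustration.

```python
# Minimal sketch, assuming a modern tf.keras install (the paper reports Keras 2.2.4);
# not the authors' code. Directory names and hyperparameters are illustrative only.
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model
from tensorflow.keras.preprocessing.image import ImageDataGenerator

NUM_CLASSES = 3          # superficial partial-thickness, deep partial-thickness, full-thickness
IMG_SIZE = (224, 224)    # patch size stated in the abstract

# Augmentation via ImageDataGenerator, analogous to the abstract's expansion of each
# burn depth class to 10 000 images (the exact transforms used are not reported).
train_gen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=20,
    horizontal_flip=True,
    vertical_flip=True,
).flow_from_directory("data/train", target_size=IMG_SIZE, class_mode="categorical")

val_gen = ImageDataGenerator(rescale=1.0 / 255).flow_from_directory(
    "data/val", target_size=IMG_SIZE, class_mode="categorical")

# ResNet-50 backbone with a new 3-way softmax classification head.
backbone = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
x = GlobalAveragePooling2D()(backbone.output)
outputs = Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs=backbone.input, outputs=outputs)

model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(train_gen, validation_data=val_gen, epochs=30)   # epoch count is an assumption
```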
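The reported evaluation (per-class precision, recall, and F1_score on the test set, a two-dimensional tSNE projection, and one-vs-rest ROC curves with their areas under the curve) can be sketched as follows. The function name and the input arrays (`y_true`, `y_prob`, `feats`) are hypothetical placeholders, not quantities defined in the paper.

```python
# Hedged evaluation sketch using scikit-learn; not the authors' code.
import numpy as np
from sklearn.manifold import TSNE
from sklearn.metrics import classification_report, roc_curve, auc
from sklearn.preprocessing import label_binarize

# y_true: integer labels (0/1/2); y_prob: softmax outputs, shape (n_samples, 3);
# feats: penultimate-layer features of the test images. All three are assumed inputs.
def evaluate(y_true, y_prob, feats, class_names):
    y_pred = y_prob.argmax(axis=1)

    # Per-class precision, recall, and F1, as in the abstract's Results.
    print(classification_report(y_true, y_pred, target_names=class_names, digits=2))

    # Two-dimensional tSNE embedding used to visualize overlap between burn depths.
    embedding = TSNE(n_components=2, random_state=0).fit_transform(feats)

    # One-vs-rest ROC curve and area under the curve for each of the 3 burn depths.
    y_bin = label_binarize(y_true, classes=[0, 1, 2])
    aucs = {}
    for i, name in enumerate(class_names):
        fpr, tpr, _ = roc_curve(y_bin[:, i], y_prob[:, i])
        aucs[name] = auc(fpr, tpr)
    return embedding, aucs
```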