- Denys Fontaine, Valentin Vielzeuf, Philippe Genestier, Pascal Limeux, Serena Santucci-Sivilotto, Emmanuel Mory, Nelly Darmon, Michel Lanteri-Minet, May Mokhtar, Melanie Laine, Damien Vistoli, and DEFI study group.
- Department of Neurosurgery, Centre Hospitalier Universitaire de Nice, Nice, France.
- Eur J Pain. 2022 Jul 1; 26 (6): 1282-1291.
Background: Pain intensity evaluation by self-report is difficult and biased in non-communicating people, which may contribute to inappropriate pain management. The use of artificial intelligence (AI) to evaluate pain intensity based on automated facial expression analysis has not been evaluated in clinical conditions.

Methods: We trained and externally validated a deep-learning system (ResNet-18 convolutional neural network) to identify and classify 2810 facial expressions of 1189 patients, captured before and after surgery, according to their self-reported pain intensity on a numeric rating scale (NRS, 0-10). AI performance was evaluated by accuracy (concordance between the AI prediction and patient-reported pain intensity) and by sensitivity and specificity for diagnosing pain ≥4/10 and ≥7/10. We then compared the AI's performance with that of 33 nurses evaluating pain intensity from facial expression in the same situation.

Results: In the external testing set (120 face images), the deep-learning system predicted the exact pain intensity among the 11 possible scores (0-10) in 53% of cases, with a mean error of 2.4 points. Its sensitivities for detecting pain ≥4/10 and ≥7/10 were 89.7% and 77.5%, respectively. Nurses estimated the correct NRS pain intensity with a mean accuracy of 14.9% and identified pain ≥4/10 and ≥7/10 with sensitivities of 44.9% and 17.0%.

Conclusions: Subject to further improvement of AI performance through additional training, these results suggest that AI based on facial expression analysis could assist physicians in evaluating pain and detecting severe pain, especially in people unable to report their pain appropriately themselves.

Significance: These original findings represent a major step in the development of a fully automated, rapid, standardized and objective method based on facial expression analysis to measure pain and detect severe pain.

© 2022 European Pain Federation - EFIC®.
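The abstract names the architecture (ResNet-18) but gives no implementation details. As an illustration only, the sketch below shows one standard way such a classifier could be set up with PyTorch/torchvision, treating the 11 NRS scores (0-10) as discrete classes; the preprocessing, hyperparameters and dummy data are assumptions, not the authors' protocol, and a recent torchvision (≥0.13) is assumed for the weights API.

```python
# Illustrative sketch only: a ResNet-18 fine-tuned to classify a face image
# into one of the 11 NRS pain scores (0-10). Hyperparameters, preprocessing
# and data are assumptions, not the protocol used in the study.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 11  # NRS scores 0-10 treated as discrete classes

def build_model() -> nn.Module:
    # Start from an ImageNet-pretrained ResNet-18 and replace the final
    # fully connected layer with an 11-way classification head.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model

# Standard ImageNet-style preprocessing (assumed, not taken from the paper).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

if __name__ == "__main__":
    model = build_model()
    criterion = nn.CrossEntropyLoss()              # classification over 11 scores
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    # One dummy training step on random tensors, just to show the shapes involved.
    images = torch.randn(8, 3, 224, 224)           # batch of 8 face crops
    labels = torch.randint(0, NUM_CLASSES, (8,))   # self-reported NRS scores
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"dummy loss: {loss.item():.3f}")
```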
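Likewise, the reported metrics (exact-match accuracy, mean error in NRS points, and sensitivity/specificity for pain ≥4/10 and ≥7/10) can be computed from paired predicted and self-reported scores as in the short sketch below; the example arrays are placeholders, not the study data.

```python
# Illustrative computation of the evaluation metrics named in the abstract:
# exact-match accuracy, mean absolute error, and sensitivity/specificity
# for detecting pain >= a given NRS threshold. Example arrays are made up.
import numpy as np

def detection_metrics(pred: np.ndarray, true: np.ndarray, threshold: int):
    """Return (sensitivity, specificity) for detecting pain >= threshold."""
    pred_pos = pred >= threshold
    true_pos = true >= threshold
    tp = np.sum(pred_pos & true_pos)
    fn = np.sum(~pred_pos & true_pos)
    tn = np.sum(~pred_pos & ~true_pos)
    fp = np.sum(pred_pos & ~true_pos)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

if __name__ == "__main__":
    # Placeholder scores (0-10); the study used 120 externally held-out images.
    true_nrs = np.array([0, 2, 5, 7, 8, 3, 6, 9, 1, 4])
    pred_nrs = np.array([0, 3, 5, 6, 8, 2, 7, 9, 2, 4])

    accuracy = np.mean(pred_nrs == true_nrs)        # exact NRS match
    mae = np.mean(np.abs(pred_nrs - true_nrs))      # mean error in points

    print(f"exact-match accuracy: {accuracy:.1%}, mean error: {mae:.1f} points")
    for thr in (4, 7):
        sens, spec = detection_metrics(pred_nrs, true_nrs, thr)
        print(f"pain >= {thr}/10: sensitivity {sens:.1%}, specificity {spec:.1%}")
```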