EL MORABIT S.
Defense: April 11, 2023 at 2:00 p.m.
PhD thesis in Electronics, Microelectronics, Nanoelectronics and Microwaves, Université Polytechnique Hauts-de-France, ED PHF, 11 April 2023
Abstract:
The ability to feel pain is crucial to life, as it serves as an early warning system for potential damage to the body. Most pain assessments rely on patient self-report. However, patients who are unable to express their pain must instead rely on third-party reports of their suffering. Because of potential observer bias, such reports may contain inaccuracies; moreover, human observers cannot monitor patients 24 hours a day. To better manage pain, especially in patients with communication difficulties, automatic pain detection techniques could be deployed to assist caregivers and complement their work. Most observation-based pain assessment systems rely on facial expressions, because they are a reliable indicator of pain and can be interpreted remotely. Since pain usually produces spontaneous facial behavior, facial expressions can be used to detect its presence. In this thesis, we analyze facial expressions of pain to address the problem of pain estimation. First, we present a thorough analysis of the problem by comparing several popular CNN (convolutional neural network) architectures, including MobileNet, GoogLeNet, ResNeXt-50, ResNet-18, and DenseNet-161. We use these networks in two distinct modes: standalone and feature extraction. In standalone mode, the models (i.e., the networks) estimate pain directly.
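As a rough illustration of the two usage modes described above (standalone estimation versus feature extraction), the following minimal PyTorch sketch uses a ResNet-18 backbone from torchvision. The number of pain levels, the frozen backbone, and the small downstream estimator are illustrative assumptions, not details taken from the thesis.

```python
import torch
import torch.nn as nn
from torchvision import models

# Hypothetical setting: number of discrete pain intensity levels (not from the thesis).
NUM_PAIN_LEVELS = 5

# --- Standalone mode: the CNN itself predicts pain from the face image ---
standalone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
standalone.fc = nn.Linear(standalone.fc.in_features, NUM_PAIN_LEVELS)  # replace ImageNet head

# --- Feature-extraction mode: the CNN provides features for a separate estimator ---
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = nn.Identity()           # drop the classifier, keep 512-d feature vectors
for p in backbone.parameters():
    p.requires_grad = False           # freeze the backbone
backbone.eval()

# Simple downstream estimator on top of the frozen features (illustrative choice).
estimator = nn.Sequential(
    nn.Linear(512, 128),
    nn.ReLU(),
    nn.Linear(128, NUM_PAIN_LEVELS),
)

# Dummy batch of face crops (N, 3, 224, 224) to show both pipelines end to end.
x = torch.randn(4, 3, 224, 224)
pain_logits_standalone = standalone(x)   # direct estimation by the network
with torch.no_grad():
    feats = backbone(x)                  # 512-d facial features
pain_logits_from_features = estimator(feats)
```

In this sketch, the standalone pipeline fine-tunes the whole network for pain estimation, while the feature-extraction pipeline keeps the pretrained CNN fixed and trains only a lightweight estimator on its features; the specific head sizes and the choice of ResNet-18 are placeholders for whichever architecture is being compared.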