{"id":57310,"date":"2023-04-07T10:18:33","date_gmt":"2023-04-07T08:18:33","guid":{"rendered":"https:\/\/www.iemn.fr\/?p=57310"},"modified":"2023-04-07T10:48:40","modified_gmt":"2023-04-07T08:48:40","slug":"these-el-morabit-s-nouvelles-techniques-de-lintelligence-artificielle-pour-le-diagnostic-medical-base-sur-la-vision-par-ordinateur","status":"publish","type":"post","link":"https:\/\/www.iemn.fr\/en\/these-2023\/these-el-morabit-s-nouvelles-techniques-de-lintelligence-artificielle-pour-le-diagnostic-medical-base-sur-la-vision-par-ordinateur.html","title":{"rendered":"THESIS EL MORABIT S.: \"New artificial intelligence techniques for medical diagnosis based on computer vision\"."},"content":{"rendered":"<div id='after_layer_slider_1'  class='main_color av_default_container_wrap container_wrap sidebar_right'  ><div class='container av-section-cont-open' ><div class='template-page content  av-content-small alpha units'><div class='post-entry post-entry-type-page post-entry-57310'><div class='entry-content-wrapper clearfix'>\n\n<style type=\"text\/css\" data-created_by=\"avia_inline_auto\" id=\"style-css-av-av_heading-ded5a7795d504fda3937af55f0ac4f74\">\n#top .av-special-heading.av-av_heading-ded5a7795d504fda3937af55f0ac4f74{\nmargin:0 0 10px 0;\npadding-bottom:4px;\n}\nbody .av-special-heading.av-av_heading-ded5a7795d504fda3937af55f0ac4f74 .av-special-heading-tag .heading-char{\nfont-size:25px;\n}\n.av-special-heading.av-av_heading-ded5a7795d504fda3937af55f0ac4f74 .av-subheading{\nfont-size:15px;\n}\n<\/style>\n<div  class='av-special-heading av-av_heading-ded5a7795d504fda3937af55f0ac4f74 av-special-heading-h2  avia-builder-el-1  el_after_av_layerslider  el_before_av_hr  avia-builder-el-first'><h2 class='av-special-heading-tag'  itemprop=\"headline\"  >THESIS: EL MORABIT S.
: \u00ab\u00a0New artificial intelligence techniques for medical diagnosis based on computer vision\u00a0\u00bb<\/h2><div class=\"special-heading-border\"><div class=\"special-heading-inner-border\"><\/div><\/div><\/div>\n\n<style type=\"text\/css\" data-created_by=\"avia_inline_auto\" id=\"style-css-av-18u73nj-dad6a947580930e400fc42ba200e80f1\">\n#top .hr.av-18u73nj-dad6a947580930e400fc42ba200e80f1{\nmargin-top:5px;\nmargin-bottom:5px;\n}\n.hr.av-18u73nj-dad6a947580930e400fc42ba200e80f1 .hr-inner{\nwidth:100%;\n}\n<\/style>\n<div  class='hr av-18u73nj-dad6a947580930e400fc42ba200e80f1 hr-custom  avia-builder-el-2  el_after_av_heading  el_before_av_textblock  hr-left hr-icon-no'><span class='hr-inner inner-border-av-border-thin'><span class=\"hr-inner-style\"><\/span><\/span><\/div>\n<section  class='av_textblock_section av-jriy64i8-2f4600354c0449b610997916bbd9b6bc'   itemscope=\"itemscope\" itemtype=\"https:\/\/schema.org\/BlogPosting\" itemprop=\"blogPost\" ><div class='avia_textblock'  itemprop=\"text\" >\n<style type=\"text\/css\" data-created_by=\"avia_inline_auto\" id=\"style-css-av-13ewzjw-68e036126b913e5028f77311dc66b825\">\n.av_font_icon.av-13ewzjw-68e036126b913e5028f77311dc66b825{\ncolor:#bfbfbf;\nborder-color:#bfbfbf;\n}\n.av_font_icon.av-13ewzjw-68e036126b913e5028f77311dc66b825 .av-icon-char{\nfont-size:60px;\nline-height:60px;\n}\n<\/style>\n<span  class='av_font_icon av-13ewzjw-68e036126b913e5028f77311dc66b825 avia_animate_when_visible av-icon-style- avia-icon-pos-left avia-icon-animate'><span class='av-icon-char' aria-hidden='true' data-av_icon='\ue8c9' data-av_iconfont='entypo-fontello' ><\/span><\/span>\n<p><strong><u>EL MORABIT S.<\/u><\/strong><\/p>\n<p>Defense: April 11, 2023 at 2:00 p.m.<br \/>\nPhD thesis in Electronics, Microelectronics, Nanoelectronics and Microwaves, Universit\u00e9 Polytechnique Hauts-de-France, ED PHF<\/p>\n<\/div><\/section>\n<section  
class='av_textblock_section av-jtefqx33-628129dba2299b2ecd65ebfc92eac29d'   itemscope=\"itemscope\" itemtype=\"https:\/\/schema.org\/BlogPosting\" itemprop=\"blogPost\" ><div class='avia_textblock'  itemprop=\"text\" ><div  class='hr av-kjh3zw-4dff888f744b728a1aca9b3a0971493a hr-default  avia-builder-el-6  avia-builder-el-no-sibling'><span class='hr-inner'><span class=\"hr-inner-style\"><\/span><\/span><\/div>\n<h5>Summary:<\/h5>\n<p>The ability to feel pain is crucial to life, as it serves as an early warning system for potential damage to the body. Most pain assessments rely on patients' self-reports; patients who cannot express their pain must instead depend on third-party observations, which are subject to observer bias and may therefore be inaccurate. Moreover, human observers cannot monitor patients around the clock. To better manage pain, particularly in patients with communication difficulties, automatic pain detection could assist caregivers and complement their care. Most observation-based pain assessment systems rely on facial expressions, as they are a reliable indicator of pain and can be interpreted remotely. Since pain usually produces spontaneous facial behavior, facial expressions can be used to detect its presence. In this thesis, we analyze facial expressions of pain to address automatic pain estimation. First, we present an in-depth analysis of the problem by comparing several popular convolutional neural network (CNN) architectures, including MobileNet, GoogleNet, ResNeXt-50, ResNet18 and DenseNet-161. We use these networks in two distinct modes: standalone and feature extraction. In standalone mode, the networks are used to estimate pain directly.<\/p>\n<\/div><\/section>","protected":false},"excerpt":{"rendered":"","protected":false},"author":20,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[318],"tags":[],"class_list":["post-57310","post","type-post","status-publish","format-standard","hentry","category-these-2023"],"_links":{"self":[{"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/posts\/57310","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/users\/20"}],"replies":[{"embeddable":true,"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/comments?post=57310"}],"version-history":[{"count":0,"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/posts\/57310\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/media?parent=57310"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/categories?post=57310"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.iemn.fr\/en\/wp-json\/wp\/v2\/tags?post=57310"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}