
Bandini A., Green J.R., Taati B., Orlandi S., Zinman L., Yunusova Y. (2018). Automatic detection of amyotrophic lateral sclerosis (ALS) from video-based analysis of facial movements: Speech and non-speech tasks. In Proceedings - 13th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2018), pp. 150-157. IEEE. doi: 10.1109/FG.2018.00031.

Automatic detection of amyotrophic lateral sclerosis (ALS) from video-based analysis of facial movements: Speech and non-speech tasks

Orlandi S.
2018

Abstract

The analysis of facial movements in patients with amyotrophic lateral sclerosis (ALS) can provide important information for early diagnosis and for tracking disease progression. However, the expense of motion tracking systems has limited the clinical utility of such assessments. In this study, we propose a marker-less, video-based approach to discriminate patients with ALS from neurotypical subjects. Facial movements were recorded using a depth sensor (Intel® RealSense™ SR300) during speech and non-speech tasks. A small set of lip kinematic features was extracted to mirror the perceptual evaluation performed by clinicians, covering the following aspects: (1) range of motion, (2) speed of motion, (3) symmetry, and (4) shape. Our results demonstrate that it is possible to distinguish patients with ALS from neurotypical subjects with high overall accuracy (up to 88.9%) during repetitions of sentences, syllables, and labial non-speech movements (e.g., lip spreading). This paper provides a strong rationale for the development of automated systems that detect neurological diseases from facial movements. This work has a high social impact, as it opens new possibilities for developing intelligent systems that support clinicians in diagnosis, introduce novel standards for assessing oro-facial impairment in ALS, and track disease progression remotely from home.
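
For illustration only, below is a minimal, hypothetical Python sketch of how the four feature classes named in the abstract (range of motion, speed of motion, symmetry, and shape) could be computed from tracked lip-landmark trajectories. The landmark layout, frame rate, and all names here are assumptions made for clarity; they are not taken from the paper's implementation.

    import numpy as np

    FPS = 30.0  # assumed camera frame rate (an assumption, not from the paper)

    def lip_features(landmarks):
        """landmarks: float array of shape (T, 4, 2): T frames, 4 lip landmarks, (x, y).
        Assumed layout: 0 = left mouth corner, 1 = right mouth corner,
        2 = upper-lip midpoint, 3 = lower-lip midpoint."""
        left, right = landmarks[:, 0], landmarks[:, 1]
        upper, lower = landmarks[:, 2], landmarks[:, 3]

        width = np.linalg.norm(right - left, axis=1)    # mouth width per frame
        height = np.linalg.norm(lower - upper, axis=1)  # mouth opening per frame

        # (1) Range of motion: excursion of mouth width and opening over the task.
        rom_width = width.max() - width.min()
        rom_height = height.max() - height.min()

        # (2) Speed of motion: frame-to-frame landmark displacement scaled by FPS.
        disp = np.linalg.norm(np.diff(landmarks, axis=0), axis=2)  # shape (T-1, 4)
        speed = disp.mean(axis=1) * FPS
        mean_speed, peak_speed = float(speed.mean()), float(speed.max())

        # (3) Symmetry: ratio of left vs. right corner path lengths (1.0 = symmetric).
        path_left = np.linalg.norm(np.diff(left, axis=0), axis=1).sum()
        path_right = np.linalg.norm(np.diff(right, axis=0), axis=1).sum()
        symmetry = float(min(path_left, path_right) / max(path_left, path_right))

        # (4) Shape: mean mouth aspect ratio (opening / width).
        aspect_ratio = float((height / width).mean())

        return {"rom_width": rom_width, "rom_height": rom_height,
                "mean_speed": mean_speed, "peak_speed": peak_speed,
                "symmetry": symmetry, "aspect_ratio": aspect_ratio}

    # Demo on synthetic trajectories (3 s at 30 fps) standing in for real tracking output.
    demo = np.cumsum(np.random.randn(90, 4, 2) * 0.1, axis=0) + np.array(
        [[-20.0, 0.0], [20.0, 0.0], [0.0, 10.0], [0.0, -10.0]])
    print(lip_features(demo))

A classifier trained on such per-task features is the kind of pipeline the abstract describes; the specific tracker, feature definitions, and classifier used in the paper may differ.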
Published in: Proceedings - 13th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2018, pp. 150-157 (2018).
Bandini A.; Green J.R.; Taati B.; Orlandi S.; Zinman L.; Yunusova Y.


Use this identifier to cite or link to this document: https://hdl.handle.net/11585/876879

Citations
  • PMC: N/A
  • Scopus: 37
  • Web of Science: 30