
Hyperdimensional Computing-based Multimodality Emotion Recognition with Physiological Signals

Chang E.-J.; Rahimi A.; Benini L.; Wu A.-Y.A.
2019

Abstract

To interact naturally and achieve mutual sympathy between humans and machines, emotion recognition is one of the most important functions for advanced human-computer interaction devices. Because emotion is highly correlated with involuntary physiological changes, physiological signals are a prime candidate for emotion analysis. However, high-quality machine learning models typically require large amounts of training data, so computational complexity becomes a major bottleneck. To overcome this issue, brain-inspired hyperdimensional (HD) computing, an energy-efficient and fast-learning computational paradigm, has high potential to balance accuracy against the amount of training data required. We propose HD Computing-based Multimodality Emotion Recognition (HDC-MER). HDC-MER maps real-valued features to binary HD vectors using a random nonlinear function, further encodes them over time, and fuses them across modalities, including GSR, ECG, and EEG. The experimental results show that, compared to the best method trained on the full training data, HDC-MER achieves higher classification accuracy for both valence (83.2% vs. 80.1%) and arousal (70.1% vs. 68.4%) using only 1/4 of the training data. HDC-MER also achieves at least 5% higher average accuracy than all the other methods at any point along the learning curve.
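The encoding pipeline described in the abstract (a random nonlinear projection of real-valued features to binary hypervectors, combination over time, and comparison against class prototypes) can be sketched roughly as follows. This is an illustrative sketch with made-up dimensions and random data, not the authors' implementation; all function names, parameters, and the toy classification at the end are hypothetical.

```python
import numpy as np

D = 10_000  # hypervector dimensionality (typical order of magnitude in HD computing)
rng = np.random.default_rng(0)

def encode_features(x, proj):
    """Map a real-valued feature vector to a binary hypervector via a
    random nonlinear projection (sign of a fixed random linear map)."""
    return (proj @ x > 0).astype(np.uint8)

def bundle(hvs):
    """Combine binary hypervectors by elementwise majority vote."""
    return (2 * np.sum(hvs, axis=0) > len(hvs)).astype(np.uint8)

def hamming(a, b):
    """Normalized Hamming distance between two binary hypervectors."""
    return float(np.mean(a != b))

# Toy setup: 8 features per modality, projected into D dimensions.
n_features = 8
proj = rng.standard_normal((D, n_features))

# Encode a short window of feature vectors and bundle them over time.
window = [rng.standard_normal(n_features) for _ in range(5)]
temporal_hv = bundle([encode_features(x, proj) for x in window])

# Classify by distance to a bundled class prototype (here built from random data).
proto_pos = bundle([encode_features(rng.standard_normal(n_features), proj)
                    for _ in range(5)])
label = "positive" if hamming(temporal_hv, proto_pos) < 0.5 else "negative"
print(label, temporal_hv.shape)
```

In a multimodal setting, per-modality hypervectors produced this way could be fused with the same bundling operation before the distance comparison.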
Proceedings 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems, AICAS 2019
pp. 137-141
Chang E.-J., Rahimi A., Benini L., Wu A.-Y.A. (2019). Hyperdimensional Computing-based Multimodality Emotion Recognition with Physiological Signals. New York : Institute of Electrical and Electronics Engineers Inc. [10.1109/AICAS.2019.8771622].
Files in this record:
File  Size  Format
Hyperdimensional Computing-based Multimodality Emotion.pdf

Restricted access

Description: Publisher's version of the article
Type: Publisher's PDF version
License: Restricted-access license
Size: 814.57 kB
Format: Adobe PDF
Hyperdimensional Computing-based Multimodality Emotion Recognition.pdf

Open Access since 25/01/2020

Type: Postprint
License: Open Access license: Creative Commons Attribution - NonCommercial (CC BY-NC)
Size: 372.83 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/729767
Citations
  • Scopus: 58
  • Web of Science: 46