

Embedding Temporal Convolutional Networks for Energy-efficient PPG-based Heart Rate Monitoring

Alessio Burrello; Pierangelo Maria Rapa; Tommaso Polonelli; Luca Benini
2022

Abstract

Photoplethysmography (PPG) sensors allow for non-invasive and comfortable heart-rate (HR) monitoring, suitable for compact wrist-worn devices. Unfortunately, Motion Artifacts (MAs) severely impact the monitoring accuracy, causing high variability in the skin-to-sensor interface. Several data fusion techniques have been introduced to cope with this problem, based on combining PPG signals with inertial sensor data. To date, both commercial and research solutions are either computationally efficient but not very robust, or strongly dependent on hand-tuned parameters, which leads to poor generalization performance. In this work, we tackle these limitations by proposing a computationally lightweight yet robust deep learning-based approach for PPG-based HR estimation. Specifically, we derive a diverse set of Temporal Convolutional Networks (TCNs) for HR estimation, leveraging Neural Architecture Search (NAS). Moreover, we introduce ActPPG, an adaptive algorithm that selects among multiple HR estimators depending on the amount of MAs, to improve energy efficiency. We validate our approaches on two benchmark datasets, achieving as low as 3.84 Beats per Minute (BPM) of Mean Absolute Error (MAE) on PPGDalia, which outperforms the previous state-of-the-art. Moreover, we deploy our models on a low-power commercial microcontroller (STM32L4), obtaining a rich set of Pareto-optimal solutions in the complexity vs. accuracy space.
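The adaptive idea behind ActPPG described above can be illustrated with a minimal sketch: estimate the motion-artifact level from the inertial sensor and switch to a heavier, more robust HR estimator only when motion is high. All names, the motion proxy, and the threshold below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def motion_level(acc_window):
    """Illustrative proxy for motion-artifact intensity: standard
    deviation of the 3-axis accelerometer magnitude over a window."""
    mag = np.linalg.norm(acc_window, axis=1)  # per-sample magnitude
    return float(np.std(mag))

def select_estimator(acc_window, threshold=0.5):
    """Pick the cheap TCN under low motion, the larger (more robust,
    more energy-hungry) one otherwise. Threshold is hypothetical."""
    return "large_tcn" if motion_level(acc_window) > threshold else "small_tcn"

# 4-second window at 32 Hz, 3-axis accelerometer (synthetic data)
still = np.zeros((128, 3))                                  # wrist at rest
shaking = np.random.default_rng(0).standard_normal((128, 3))  # vigorous motion

print(select_estimator(still))    # small_tcn
print(select_estimator(shaking))  # large_tcn
```

In a real deployment the selected model would then process the PPG window; running the small network most of the time is what yields the energy savings the abstract refers to.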
Burrello, A., Jahier Pagliari, D., Rapa, P.M., Semilia, M., Risso, M., Polonelli, T., et al. (2022). Embedding Temporal Convolutional Networks for Energy-efficient PPG-based Heart Rate Monitoring. ACM TRANSACTIONS ON COMPUTING FOR HEALTHCARE, 3(2), 1-25 [10.1145/3487910].
Burrello, Alessio; Jahier Pagliari, Daniele; Rapa, Pierangelo Maria; Semilia, Matilde; Risso, Matteo; Polonelli, Tommaso; Poncino, Massimo; Benini, Luca
Files in this item:

File: ACMHealth___PPGAdaptive.pdf
Access: open access
Type: Publisher's version (PDF)
License: Free open-access license
Size: 1.55 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/900484
Citations
  • PMC: not available
  • Scopus: 10
  • Web of Science: not available