
Giordano, M., Bonazzi, P., Benini, L., Magno, M. (2025). Sub-Millisecond Event-Based Eye Tracking on a Resource-Constrained Microcontroller. In 2025 10th International Workshop on Advances in Sensors and Interfaces (IWASI) (pp. 1–6). IEEE. https://doi.org/10.1109/iwasi66786.2025.11121976

Sub-Millisecond Event-Based Eye Tracking on a Resource-Constrained Microcontroller

Benini, Luca; Magno, Michele
2025

Abstract

This paper presents a novel event-based eye-tracking system deployed on a resource-constrained microcontroller, addressing the challenges of real-time, low-latency, and low-power performance in embedded systems. The system leverages a Dynamic Vision Sensor (DVS), specifically the DVXplorer Micro, with an average temporal resolution of 200 µs, to capture rapid eye movements with extremely low latency. The system is implemented on a novel low-power, high-performance microcontroller from STMicroelectronics, the STM32N6. The microcontroller features an 800 MHz Arm Cortex-M55 core and an AI hardware accelerator, the Neural-ART Accelerator, enabling real-time inference with milliwatt power consumption. The paper proposes a hardware-aware and sensor-aware compact Convolutional Neural Network (CNN) optimized for event-based data and deployed at the edge, achieving a mean pupil prediction error of 5.99 pixels and a median error of 5.73 pixels on the Ini-30 dataset. The system achieves an end-to-end inference latency of just 385 µs and a neural network throughput of 52 Multiply-and-Accumulate (MAC) operations per cycle while consuming just 155 µJ of energy. This approach enables a fully embedded, energy-efficient eye-tracking solution suitable for applications such as smart glasses and wearable devices.
2025 10th International Workshop on Advances in Sensors and Interfaces (IWASI), pp. 1–6
Giordano, Marco; Bonazzi, Pietro; Benini, Luca; Magno, Michele

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/1040850
