
A Dataset on Human-Cobot Collaboration for Action Recognition in Manufacturing Assembly

Romeo, Laura; Maselli, Marco Vincenzo; García Domínguez, M.; Marani, Roberto; Lavit Nicora, Matteo; Cicirelli, Grazia; Malosio, Matteo; D'Orazio, Tiziana
2024

Abstract

This paper introduces a dataset on Human-cobot collaboration for Action Recognition in Manufacturing Assembly (HARMA3). It is a collection of RGB frames, depth maps, RGB-to-depth-aligned (RGB-A) frames and skeleton data relating to actions performed by different subjects collaborating with a cobot to build an Epicyclic Gear Train (EGT). In particular, 27 subjects executed several trials of the assembly task, which consisted of 7 actions. Data were collected in a laboratory scenario using two Microsoft® Azure Kinect cameras, one frontal and one lateral. The dataset provides a solid foundation for developing and testing advanced action recognition and action segmentation systems, with implications reaching beyond human-cobot collaboration into Computer Vision, Machine Learning, and Smart Manufacturing. Preliminary experiments on action segmentation, obtained by applying a state-of-the-art method to features extracted from RGB and skeleton data, are also presented and show high segmentation performance.
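The abstract describes a multi-modal recording setup (RGB, depth, RGB-A and skeleton streams for 27 subjects performing 7 assembly actions) intended for frame-wise action segmentation. As a purely illustrative sketch of how such data could be prepared for a segmentation model, the snippet below loads a hypothetical per-trial skeleton file and pairs simple position/velocity features with per-frame labels; the actual HARMA3 directory layout, file naming and label format are not specified in this record, so all paths and parsing details are assumptions.

```python
# Purely illustrative sketch: the real HARMA3 directory layout, file naming,
# and label format are not given in this record, so everything below is an
# assumption made only to show the shape of a frame-wise segmentation input.
from pathlib import Path

import numpy as np


def load_skeleton(csv_path: Path) -> np.ndarray:
    """Load one trial's skeleton stream as a (T, J * 3) float array.

    Assumes one row per frame and three columns (x, y, z) per joint, a common
    way of exporting Azure Kinect body-tracking output to CSV.
    """
    return np.loadtxt(csv_path, delimiter=",", dtype=np.float32)


def frame_features(joints: np.ndarray) -> np.ndarray:
    """Concatenate joint positions with frame-to-frame velocities.

    A simple hand-crafted baseline feature, not the feature set used in the
    paper's experiments.
    """
    velocities = np.diff(joints, axis=0, prepend=joints[:1])
    return np.concatenate([joints, velocities], axis=1)  # shape (T, 2 * J * 3)


def load_trial(root: Path, subject: str, trial: str) -> tuple[np.ndarray, np.ndarray]:
    """Return frame-aligned (features, labels) for one trial.

    Labels are assumed to be integers in 0..6, one per frame, matching the
    seven assembly actions mentioned in the abstract.
    """
    trial_dir = root / subject / trial
    feats = frame_features(load_skeleton(trial_dir / "skeleton.csv"))
    labels = np.loadtxt(trial_dir / "labels.csv", dtype=np.int64)
    if len(feats) != len(labels):
        raise ValueError("skeleton frames and labels must be frame-aligned")
    return feats, labels


if __name__ == "__main__":
    # Hypothetical paths; adapt to the actual dataset organisation.
    X, y = load_trial(Path("HARMA3"), "subject_01", "trial_01")
    print(X.shape, y.shape)  # e.g. (T, 2 * J * 3) and (T,)
```

Per-frame feature/label pairs of this kind are the typical input to temporal action-segmentation networks; note that this record does not name the specific state-of-the-art method used in the paper's preliminary experiments.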
10th International Conference on Control, Decision and Information Technologies (CoDIT 2024), pp. 866-871
Romeo, L., Maselli, M.V., García Domínguez, M., Marani, R., Lavit Nicora, M., Cicirelli, G., et al. (2024). A Dataset on Human-Cobot Collaboration for Action Recognition in Manufacturing Assembly. In 10th International Conference on Control, Decision and Information Technologies (CoDIT 2024), pp. 866-871. New York, NY, USA: IEEE. doi:10.1109/codit62066.2024.10708143

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/1007658

Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science (ISI): 0