Comparison of CNN and Transformer Architectures for Robust Cattle Segmentation in Complex Farm Environments

Alessandra Lumini; Guilherme Botazzo Rozendo; Maichol Dadi; Annalisa Franco
2025

Abstract

In recent years, computer vision and deep learning have become increasingly important in the livestock industry, offering innovative animal monitoring and farm management solutions. This paper focuses on the critical task of cattle segmentation, a task essential for applications such as weight estimation, body condition scoring, and behavior analysis. Despite advances in segmentation techniques, accurately identifying and isolating cattle in complex farm environments remains challenging due to varying lighting conditions and overlapping objects. This study evaluates state-of-the-art segmentation models based on convolutional neural networks and transformers, which leverage self-attention mechanisms to capture long-range image dependencies. By testing these models across multiple publicly available datasets, we assess their performance and generalization capabilities, providing insights into the most effective methods for accurate cattle segmentation in real-world farm conditions. We also explore ensemble techniques, selecting pairs of segmenters with maximum diversity. The results are promising, as an ensemble of only two models improves performance over all stand-alone methods. The findings contribute to improving computer vision-based solutions for livestock management, enhancing their accuracy and reliability in practical applications.
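The abstract reports that an ensemble of just two maximally diverse segmenters outperforms every stand-alone model. A common way to fuse two segmenters (an assumption here; the paper's exact fusion rule is not stated in this record) is to average their per-pixel foreground probabilities and threshold the result, as in this minimal sketch:

```python
def ensemble_masks(prob_a, prob_b, threshold=0.5):
    """Fuse two per-pixel foreground probability maps by averaging,
    then threshold to obtain a binary cattle/background mask."""
    return [
        [1 if (pa + pb) / 2.0 >= threshold else 0
         for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(prob_a, prob_b)
    ]

# Toy 2x2 probability maps, e.g. from a CNN and a transformer segmenter
# (hypothetical values for illustration only).
cnn_probs = [[0.9, 0.2], [0.6, 0.1]]
transformer_probs = [[0.7, 0.4], [0.3, 0.2]]
print(ensemble_masks(cnn_probs, transformer_probs))  # [[1, 0], [0, 0]]
```

Averaging tends to help most when the two models make uncorrelated errors, which is why the paper selects the pair with maximum diversity.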
Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - ICPRAM
pp. 91-102
Lumini, A., Botazzo Rozendo, G., Dadi, M., Franco, A. (2025). Comparison of CNN and Transformer Architectures for Robust Cattle Segmentation in Complex Farm Environments. SciTePress [10.5220/0013176400003905].
Lumini, Alessandra; Botazzo Rozendo, Guilherme; Dadi, Maichol; Franco, Annalisa
Files in this record:
File: CattleSeg_ICPRAM2025.pdf

Open access

Type: Postprint / Author's Accepted Manuscript (AAM) - version accepted for publication after peer review
License: Open Access license. Creative Commons Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)
Size: 8.42 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/1045875
Citations
  • PMC: n/a
  • Scopus: 2
  • Web of Science: n/a