Pruning in Time (PIT): A Lightweight Network Architecture Optimizer for Temporal Convolutional Networks

Burrello A.; Conti F.; Lamberti L.; Benini L.
2021

Abstract

Temporal Convolutional Networks (TCNs) are promising Deep Learning models for time-series processing tasks. One key feature of TCNs is time-dilated convolution, whose optimization requires extensive experimentation. We propose an automatic dilation optimizer, which tackles the problem as weight pruning on the time axis and learns dilation factors together with weights in a single training. Our method reduces the model size and inference latency on a real SoC hardware target by up to 7.4× and 3×, respectively, with no accuracy drop compared to a network without dilation. It also yields a rich set of Pareto-optimal TCNs starting from a single model, outperforming hand-designed solutions in both size and accuracy.
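The abstract's core idea, learning dilation as pruning of kernel taps along the time axis, can be illustrated with a minimal PyTorch-style sketch. This is an assumption-laden illustration, not the authors' released implementation: the class name DilationMaskedConv1d, the per-tap scores gamma, and the 0.5 threshold are hypothetical choices made for brevity.

import torch
import torch.nn as nn

class DilationMaskedConv1d(nn.Module):
    """Causal 1D convolution whose effective dilation emerges from pruning
    kernel taps along the time axis (illustrative sketch, not the paper's code)."""

    def __init__(self, in_ch, out_ch, kernel_size=9, threshold=0.5):
        super().__init__()
        self.kernel_size = kernel_size
        self.threshold = threshold
        # Dense (non-dilated) weights; taps are masked out at run time.
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, kernel_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(out_ch))
        # One trainable score per time tap, shared across channels.
        self.gamma = nn.Parameter(torch.ones(kernel_size))

    def tap_mask(self):
        # Straight-through estimator: binary mask in the forward pass,
        # identity gradient so the tap scores keep receiving updates.
        hard = (self.gamma > self.threshold).float()
        return hard + self.gamma - self.gamma.detach()

    def forward(self, x):
        mask = self.tap_mask().view(1, 1, -1)      # broadcast over out/in channels
        w = self.weight * mask                     # "prune in time": zero some taps
        pad = self.kernel_size - 1                 # left-only padding -> causal conv
        return nn.functional.conv1d(nn.functional.pad(x, (pad, 0)), w, self.bias)

layer = DilationMaskedConv1d(in_ch=8, out_ch=16)
x = torch.randn(2, 8, 128)                         # (batch, channels, time)
print(layer(x).shape)                              # torch.Size([2, 16, 128])

A cost regularizer on the mask parameters, omitted from this sketch, is what would let a single training trade accuracy against model size and latency and produce the Pareto front mentioned in the abstract.
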
Proceedings - Design Automation Conference, 2021, pp. 1015-1020
Risso M.; Burrello A.; Pagliari D.J.; Conti F.; Lamberti L.; Macii E.; Benini L.; Poncino M.
Files in this record:

DAC21___Pruning_in_time_redux.pdf
Open Access since 08/11/2023
Type: Postprint
License: Free open-access license
Size: 1.66 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/847051
Citations
  • PMC: not available
  • Scopus: 5
  • Web of Science (ISI): 5