Role of Synaptic Stochasticity in Training Low-Precision Neural Networks

Gerace, F.; Tartaglione, E.
2018

Abstract

Stochasticity and limited precision of synaptic weights in neural network models are key aspects of both biological and hardware modeling of learning processes. Here we show that a neural network model with stochastic binary weights naturally gives prominence to exponentially rare dense regions of solutions with a number of desirable properties such as robustness and good generalization performance, while typical solutions are isolated and hard to find. Binary solutions of the standard perceptron problem are obtained from a simple gradient descent procedure on a set of real values parametrizing a probability distribution over the binary synapses. Both analytical and numerical results are presented. An algorithmic extension that makes it possible to train discrete deep neural networks is also investigated.
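As a minimal illustration of the idea summarized above — gradient descent on real values that parametrize a probability distribution over binary synapses — the sketch below trains a binary perceptron in a teacher-student setup. It is an assumption-laden toy version, not the authors' exact procedure: the surrogate margin loss, stability threshold, learning rate, and all variable names are illustrative choices. Each real parameter theta_i determines the mean m_i = tanh(theta_i) of a ±1 synapse, the smooth loss is minimized over theta, and a binary solution is read off as sign(theta).

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 101, 300                         # synapses, training patterns
w_teacher = rng.choice([-1, 1], size=N)  # ground-truth binary weights
X = rng.choice([-1, 1], size=(P, N))     # binary input patterns
y = np.sign(X @ w_teacher)               # teacher labels (N odd, so never 0)

theta = np.zeros(N)   # real parameters; m = tanh(theta) is the mean synapse
lr = 0.05
kappa = 0.5           # illustrative stability threshold for the surrogate loss

for epoch in range(500):
    m = np.tanh(theta)
    margins = y * (X @ m) / np.sqrt(N)
    mask = margins < kappa               # patterns violating the margin
    # gradient of sum_(violated) (kappa - margin) with respect to m
    grad_m = -(y[mask, None] * X[mask]).sum(axis=0) / np.sqrt(N)
    # chain rule through tanh: dL/dtheta = dL/dm * (1 - m^2)
    theta -= lr * grad_m * (1 - m**2)

    # binarize the current solution and stop if it classifies everything
    w_bin = np.sign(theta)
    w_bin[w_bin == 0] = 1
    if np.all(y * (X @ w_bin) > 0):
        break

train_err = np.mean(y * (X @ w_bin) <= 0)
```

In this toy setting the binarized weights `sign(theta)` typically reach low training error well above the storage capacity of a random binary perceptron, because the teacher guarantees that a binary solution exists; the continuous parameters act only as a differentiable handle on the discrete weights.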
Baldassi, C., Gerace, F., Kappen, H.J., Lucibello, C., Saglietti, L., Tartaglione, E., et al. (2018). Role of Synaptic Stochasticity in Training Low-Precision Neural Networks. PHYSICAL REVIEW LETTERS, 120(26), 1-4 [10.1103/PhysRevLett.120.268103].
Baldassi, C.; Gerace, F.; Kappen, H. J.; Lucibello, C.; Saglietti, L.; Tartaglione, E.; Zecchina, R.
Files in this record:
Attachments, if any, are not displayed

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/1028256
Note: the displayed data have not been validated by the university.

Citations
  • PMC: 3
  • Scopus: 18
  • Web of Science: 16