An efficient dilution strategy for constructing sparsely connected neural networks

Montemurro M.A.; Tamarit F.A.
2001

Abstract

In this paper we present a modification of the strongly diluted Hopfield model in which the dilution scheme, instead of being random, is inspired by biological considerations. We show that the resulting sparsely connected neural network performs remarkably well as an associative memory device, even for noisy correlated patterns. We show analytically that, under certain conditions, the model has no upper value of the parameter α = p/C (where p is the number of patterns stored in the system and C is the coordination number of each neuron) beyond which it becomes useless as a memory device, in contrast to what is found for other models. By means of numerical simulations we demonstrate that, under much weaker assumptions, the network still performs highly efficiently, in good agreement with the analytical results. © 2001 Elsevier Science B.V.
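The abstract does not describe the biologically inspired dilution rule in detail, so the sketch below is purely illustrative and is not the authors' construction: it builds a randomly diluted Hopfield network in Python, with Hebbian couplings restricted to C incoming connections per neuron, stores p = αC random patterns, and measures the retrieval overlap starting from a noisy cue. All parameter values (N, C, α, the noise level) are arbitrary choices made for this example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative parameters (not taken from the paper): N neurons,
    # C incoming connections per neuron, p = alpha * C stored patterns.
    N, C, alpha = 1000, 50, 0.2
    p = int(alpha * C)

    # Random +/-1 patterns to be stored.
    patterns = rng.choice([-1, 1], size=(p, N))

    # Random dilution: each neuron i receives input from C randomly chosen
    # neurons j != i.  The paper replaces this random scheme with a
    # biologically inspired one, whose details are not given in the abstract.
    mask = np.zeros((N, N), dtype=bool)
    for i in range(N):
        others = np.delete(np.arange(N), i)
        mask[i, rng.choice(others, size=C, replace=False)] = True

    # Hebbian couplings, kept only on the existing (diluted) connections.
    J = (patterns.T @ patterns) / C
    J = J * mask

    def recall(state, steps=10):
        """Synchronous zero-temperature dynamics: s_i <- sign(sum_j J_ij s_j)."""
        for _ in range(steps):
            state = np.sign(J @ state)
            state[state == 0] = 1
        return state

    # Start from a noisy version of pattern 0 and measure the retrieval
    # overlap m = (1/N) sum_i xi_i s_i after the dynamics.
    noisy = patterns[0] * rng.choice([1, -1], size=N, p=[0.85, 0.15])
    final = recall(noisy.astype(float))
    overlap = np.mean(patterns[0] * final)
    print(f"alpha = {alpha:.2f}, retrieval overlap m = {overlap:.3f}")

To experiment with a different dilution scheme, only the construction of the connectivity mask needs to be replaced; the Hebbian storage and the retrieval dynamics stay the same.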
Montemurro M.A., Tamarit F.A. (2001). An efficient dilution strategy for constructing sparsely connected neural networks. Physica A, 294(3-4), 340-350 [10.1016/S0378-4371(01)00123-6].

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/771099

Citations
  • Scopus: 4
  • Web of Science: 5