Ursino, M., Cuppini, C., Cappa, S.F., Catricalà, E. (2018). A feature-based neurocomputational model of semantic memory. Cognitive Neurodynamics, 12(6), 525-547. https://doi.org/10.1007/s11571-018-9494-0
A feature-based neurocomputational model of semantic memory
Ursino, Mauro; Cuppini, Cristiano
2018
Abstract
In accordance with a featural organization of semantic memory, this work investigates, through an attractor network, the role of different kinds of features in the representation of concepts, both in normal and in neurodegenerative conditions. We implemented new synaptic learning rules to account for the role of partially shared features and of distinctive features with different saliency. The model includes a semantic layer and a lexical layer, coding for object features and word-forms, respectively. Connections among nodes are strongly asymmetrical. To account for feature saliency, asymmetrical synapses were created using Hebbian rules of potentiation and depotentiation with different pre-synaptic and post-synaptic thresholds. A variable post-synaptic threshold, which changed automatically to reflect how frequently a feature occurs across concepts (i.e., how many concepts share it), was used to account for partially shared features. The trained network solved naming and word-recognition tasks very well, exploiting the different roles of salient versus marginal features in concept identification. In case of damage, superordinate concepts were better preserved than subordinate ones. Interestingly, the degradation of salient features, but not of marginal ones, prevented object identification. The model suggests that Hebbian rules with adjustable post-synaptic thresholds can provide a reliable semantic representation of objects by exploiting the statistics of input features.
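To make the learning mechanism described in the abstract concrete, the snippet below sketches one possible thresholded Hebbian rule in Python. It is a minimal, self-contained illustration under stated assumptions, not the model's actual implementation: the covariance-like form of the rule, the gating that suppresses weight changes when both activities are sub-threshold, the weight bounds, and the linear mapping from feature frequency to the post-synaptic threshold are all choices made for this example; only the general idea of separate pre- and post-synaptic thresholds, with the post-synaptic one depending on how many concepts share a feature, comes from the abstract.

import numpy as np

def hebbian_update(W, pre, post, theta_pre, theta_post, gamma=0.01):
    """Covariance-like Hebbian step with separate pre- and post-synaptic
    thresholds: a synapse is potentiated when both activities exceed their
    thresholds, depotentiated when only one does, and left unchanged when
    both are sub-threshold (this gating is an assumption of the sketch)."""
    pre_term = pre - theta_pre
    post_term = post - theta_post
    dW = gamma * np.outer(post_term, pre_term)
    dW[np.outer(post_term < 0, pre_term < 0)] = 0.0   # no change if both inactive
    return np.clip(W + dW, 0.0, 1.0)                   # weights kept in [0, 1]

# Toy training set: rows are concepts, columns are semantic features (1 = active).
concepts = np.array([
    [1, 1, 1, 0, 0],   # concept A
    [1, 1, 0, 1, 0],   # concept B (shares features 0-1 with A)
    [1, 0, 0, 0, 1],   # concept C (shares only feature 0)
], dtype=float)
n_features = concepts.shape[1]

# Hypothetical variable post-synaptic threshold: the more concepts share a
# feature, the higher its threshold (one plausible reading of the abstract).
feature_frequency = concepts.mean(axis=0)
theta_post = 0.3 + 0.5 * feature_frequency
theta_pre = 0.5                                        # fixed value, assumed

# Train feature-to-feature synapses within the semantic layer.
W = np.zeros((n_features, n_features))
for _ in range(200):
    for pattern in concepts:
        W = hebbian_update(W, pre=pattern, post=pattern,
                           theta_pre=theta_pre, theta_post=theta_post)
np.fill_diagonal(W, 0.0)                               # drop self-connections
print(np.round(W, 2))                                  # note W[i, j] != W[j, i]

In this toy run the weight matrix comes out asymmetric: synapses pointing toward features with a high sharing-dependent threshold grow differently from those pointing away from them, which is the kind of asymmetry the abstract attributes to feature sharing and saliency.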