
Decoding information for grasping from the macaque dorsomedial visual stream / Filippini, Matteo; Breveglieri, Rossella; Akhras, M Ali; Bosco, Annalisa; Chinellato, Eris; Fattori, Patrizia. - In: THE JOURNAL OF NEUROSCIENCE. - ISSN 1529-2401. - STAMPA. - 37:16(2017), pp. 4311-4322. [10.1523/JNEUROSCI.3077-16.2017]

Decoding information for grasping from the macaque dorsomedial visual stream

Filippini, Matteo; Breveglieri, Rossella; Bosco, Annalisa; Fattori, Patrizia
2017

Abstract

Neuro-decoders have been developed by researchers mostly to control neuroprosthetic devices, but also to shed new light on neural function. In this study, we show that signals representing grip configurations can be reliably decoded from neural data acquired from area V6A of the monkey medial posterior parietal cortex (PPC). Two Macaca fascicularis were trained to perform an instructed-delay reach-to-grasp task in the dark and in the light towards objects of different shapes. Population neural activity was extracted at various time intervals: during vision of the objects, during the delay before movement, and during grasp execution. This activity was used to train and validate a Bayes classifier for decoding objects and grip types. Recognition rates were well above chance level for all the epochs analyzed in this study. Furthermore, we detected slightly different decoding accuracies depending on the task's visual condition. Generalization analysis was performed by training and testing the system during different time intervals. This analysis demonstrated that a change of code occurred during the course of the task. Notably, our classifier was able to discriminate grasp types well in advance of grasping onset. This feature may be important when timing is critical for sending signals to external devices before the movement starts. Our results suggest that the neural signals from the dorsomedial visual pathway can be a good substrate to feed neural prostheses for prehensile actions.

SIGNIFICANCE STATEMENT
Recordings of neural activity from nonhuman primate frontal and parietal cortex have led to the development of methods of decoding movement information in order to restore coordinated arm actions in paralyzed human beings. Our results show that the signals measured from the monkey medial posterior parietal cortex are valid for correctly decoding information relevant for grasping. Together with previous studies on decoding reach trajectories from medial posterior parietal cortex, this highlights the medial parietal cortex as a target site for transforming neural activity into control signals to command prostheses, allowing human patients to dexterously perform grasping actions.
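The decoding approach summarized in the abstract (population firing rates in a given epoch fed to a Bayes classifier, with accuracy compared against chance) can be illustrated with a minimal sketch. Everything below is hypothetical: the data are synthetic Poisson spike counts, and the decoder is a generic Gaussian naive Bayes implemented from scratch, not the authors' actual classifier or recorded V6A data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for population activity: trials x neurons firing-rate
# matrix, one grip-type label per trial (5 grip types, as in the task).
n_neurons, n_trials_per_grip, n_grips = 50, 40, 5
tuning = rng.uniform(2, 20, size=(n_grips, n_neurons))   # per-grip mean rates
X = np.vstack([rng.poisson(m, size=(n_trials_per_grip, n_neurons))
               for m in tuning]).astype(float)
y = np.repeat(np.arange(n_grips), n_trials_per_grip)

def fit_gaussian_nb(X, y):
    """Estimate per-class mean and variance for each neuron."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    var = np.array([X[y == c].var(axis=0) + 1e-6 for c in classes])
    return classes, mu, var

def predict(X, classes, mu, var):
    """Assign each trial to the class with highest log-likelihood
    under independent Gaussians (uniform priors)."""
    ll = -0.5 * (((X[:, None, :] - mu) ** 2) / var
                 + np.log(2 * np.pi * var)).sum(axis=2)
    return classes[np.argmax(ll, axis=1)]

# Hold out half the trials for validation.
idx = rng.permutation(len(y))
train, test = idx[: len(y) // 2], idx[len(y) // 2:]
params = fit_gaussian_nb(X[train], y[train])
acc = (predict(X[test], *params) == y[test]).mean()
print(f"decoding accuracy: {acc:.2f} (chance = {1 / n_grips:.2f})")
```

The cross-temporal generalization analysis mentioned above would correspond to fitting `fit_gaussian_nb` on activity from one epoch (e.g. object vision) and calling `predict` on activity from another (e.g. grasp execution); a drop in accuracy relative to within-epoch decoding is what indicates a change of code over the course of the task.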
Files in this product:

File: FilippiniBreveglieri2017.pdf
Access: open access
Type: Publisher's (PDF) version
License: Open Access License. Creative Commons Attribution (CC BY)
Size: 2.83 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/584449
Citations
  • PubMed Central: 13
  • Scopus: 24
  • Web of Science: 23