Decoding of standard and non-standard visuomotor associations from parietal cortex

Filippini M.; Morris A.P.; Breveglieri R.; Hadjidimitrakis K.; Fattori P.
2020

Abstract

Objective. Neural signals can be decoded and used to drive neural prostheses with the aim of restoring motor function in patients with mobility impairments. Such patients typically have intact eye movement control and visual function, suggesting that cortical visuospatial signals could be used to guide external devices. Neurons in parietal cortex mediate sensory-motor transformations and encode the spatial coordinates of reaching goals, hand position and movements, and other spatial variables. We studied how spatial information is represented at the population level and the possibility of decoding not only the position of visual targets and the plans to reach them, but also conditional, non-spatial motor responses. Approach. Spiking activity of neurons in parietal area V6A was recorded in monkeys during two tasks. The animals first fixated one of nine targets in 3D space and then, after the target changed color, either reached toward it or performed a non-spatial motor response (lifting the hand from a button). We then decoded different task-related parameters. Main results. We first show that a maximum-likelihood estimation (MLE) algorithm trained separately on each task transformed neural activity into accurate metric predictions of target location. Furthermore, by combining MLE with a Naïve Bayes classifier, we decoded the monkey's motor intention (reach or hand lift) and the different phases of the tasks. These results show that, although V6A encodes the spatial location of a target during a delay period, the signals it carries are updated around movement execution in an intention/motor-specific way. Significance. These findings show the presence of multiple levels of information in parietal cortex that could be decoded and used in brain-machine interfaces to control both goal-directed movements and more cognitive visuomotor associations.
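The abstract describes decoding target location from spiking activity with maximum-likelihood estimation. A minimal sketch of the general idea, not the authors' actual pipeline: assume each neuron's spike count on a trial is Poisson-distributed with a target-dependent mean rate (the tuning estimated from training trials), and pick the target that maximizes the summed log-likelihood across neurons. All names and the tuning matrix below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 neurons, 9 reach targets (as in the task),
# independent Poisson spike counts per trial.
n_neurons, n_targets = 40, 9

# Tuning matrix: mean spike count of each neuron for each target.
# In practice this would be estimated from training trials; here it is
# drawn at random purely for illustration.
rates = rng.uniform(1.0, 20.0, size=(n_targets, n_neurons))

def decode_target(spike_counts, rates):
    """Maximum-likelihood target estimate under independent Poisson noise.

    log P(counts | target t) = sum_i [counts_i * log(rate_t,i) - rate_t,i]
    (up to a target-independent constant), maximized over t.
    """
    log_like = spike_counts @ np.log(rates).T - rates.sum(axis=1)
    return int(np.argmax(log_like))

# Simulate one trial toward target 3 and decode it from the counts.
true_target = 3
counts = rng.poisson(rates[true_target])
decoded = decode_target(counts, rates)
print(decoded)
```

With flat priors over targets this is equivalent to a (Gaussian-free) Naïve Bayes classifier over neurons, which is presumably why the paper pairs MLE for target position with a Naïve Bayes classifier for the discrete intention (reach vs. hand lift).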

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11585/789028

Citations
  • PMC: 1
  • Scopus: 2
  • Web of Science: 2