Di Rosa, E.F., Sheth, A., Yau, J.M., Astolfi, L., Cuppini, C. (2025). Unimanual vs bimanual intersensory switch in visuotactile detection: a multimodal analysis. Granarolo dell'Emilia : Patron Editore.
Unimanual vs bimanual intersensory switch in visuotactile detection: a multimodal analysis
Eleonore F. Di Rosa; Cristiano Cuppini
2025
Abstract
Redirecting attention between different sensory stimuli entails a behavioral cost: responses are typically slower than when attention remains within the same modality. This modality switch effect is well documented for attentional selection between audition and vision, but whether it extends to other modality pairs remains an open question. Moreover, its underlying neural mechanisms are largely unknown. Here, we explore whether intersensory switching also holds in visuotactile conditions and, if so, whether it interacts with the interhemispheric processes that govern bilateral visuotactile behavior. To this end, we extended a previous neurocomputational model describing visual influences on spatial touch to also account for the temporal dynamics of sequential visual and tactile stimulation, and we used the new model to generate predictions of reaction times to visual, tactile, and visuotactile targets. Upon experimentally validating the predictions in a simple reaction time task, we found faster responses to modality-repeat than to modality-switch trials, and to visuotactile than to visual-only or tactile-only targets. In addition, short temporal intervals between targets led to longer reaction times and larger switch costs. The model also recapitulates interhemispheric visuotactile interactions, correctly predicting delayed detection of modality-specific stimuli delivered contralateral to the responding hand. These results support the notion of canonical computations underlying multisensory interactions, described in a neural network that accounts for visuotactile perception in the spatiotemporal domain.

| File | Size | Format | |
|---|---|---|---|
| DiRosaShethYauAstolfiCuppini_GNB2025_revised.pdf (open access; publisher's PDF / Version of Record; free open-access license) | 548.87 kB | Adobe PDF | View/Open |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.