Donato, R., Contillo, A., Campana, G., Roccato, M., Gonçalves, Ó.F., & Pavan, A. (2024). Visual Perceptual Learning of Form–Motion Integration: Exploring the Involved Mechanisms with Transfer Effects and the Equivalent Noise Approach. Brain Sciences, 14(10), 1-18. https://doi.org/10.3390/brainsci14100997
Visual Perceptual Learning of Form–Motion Integration: Exploring the Involved Mechanisms with Transfer Effects and the Equivalent Noise Approach
Roccato, Marco; Pavan, Andrea
Supervision
2024
Abstract
Background: Visual perceptual learning plays a crucial role in shaping our understanding of how the human brain integrates visual cues to construct coherent perceptual experiences. The visual system is continually challenged to integrate a multitude of visual cues, including form and motion, to create a unified representation of the surrounding visual scene. This process involves both the processing of local signals and their integration into a coherent global percept. Over the past several decades, researchers have explored the mechanisms underlying this integration, focusing on concepts such as internal noise and sampling efficiency, which pertain to local and global processing, respectively.

Objectives and Methods: In this study, we investigated the influence of visual perceptual learning on non-directional motion processing using dynamic Glass patterns (GPs) and modified Random-Dot Kinematograms (mRDKs). We also explored the mechanisms of learning transfer to different stimuli and tasks. Specifically, we aimed to assess whether visual perceptual learning based on illusory directional motion, triggered by form and motion cues (dynamic GPs), transfers to stimuli that elicit comparable illusory motion, such as mRDKs. Additionally, we examined whether training on form and motion coherence thresholds improves internal noise filtering and sampling efficiency.

Results: Our results revealed significant learning effects on the trained task, enhancing the perception of dynamic GPs. Furthermore, there was substantial learning transfer to the non-trained stimulus (mRDKs) and partial transfer to a different task. The data also showed that dynamic GPs yielded lower coherence thresholds than mRDKs. Finally, an interaction between visual stimulus type and session for sampling efficiency revealed that the effect of the training session on participants' performance varied depending on the type of visual stimulus, with dynamic GPs being affected differently from mRDKs.

Conclusion: These findings highlight the complexity of perceptual learning and suggest that the transfer of learning effects may be influenced by the specific characteristics of both the training stimuli and tasks, providing valuable insights for future research in visual processing.
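For context, the equivalent noise approach named in the title typically characterizes performance with two parameters, internal noise and sampling efficiency, by measuring thresholds as external noise is added to the stimulus. A minimal sketch of the standard linear-amplifier formulation from the equivalent-noise literature is given below; the exact parameterization used in this study may differ.

$$
\sigma_{\mathrm{obs}}^{2} \;=\; \frac{\sigma_{\mathrm{int}}^{2} + \sigma_{\mathrm{ext}}^{2}}{n_{\mathrm{samp}}}
$$

Here $\sigma_{\mathrm{obs}}$ is the observed threshold, $\sigma_{\mathrm{int}}$ the equivalent internal noise, $\sigma_{\mathrm{ext}}$ the external noise in the stimulus, and $n_{\mathrm{samp}}$ the effective number of samples pooled (sampling efficiency). Under this model, lower internal noise reflects improved filtering of local signals, whereas higher sampling efficiency reflects improved global integration.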
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| Donato et al. 2024.pdf | Open access | Publisher's version (PDF) | Open Access license: Creative Commons Attribution (CC BY) | 3.38 MB | Adobe PDF |
| brainsci-3222695-supplementary.pdf | Open access | Supplementary file | Open Access license: Creative Commons Attribution (CC BY) | 91.06 kB | Adobe PDF |
| Supplementary files_video.zip | Open access | Supplementary file | Open Access license: Creative Commons Attribution (CC BY) | 1 MB | Zip file |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.