Research Highlight: Supramodal processing optimizes visual perceptual learning and plasticity

The ENP NeuroSpin MEG / Brain Dynamics team led by Van Wassenhove discovered that when auditory and visual sensory information shares the same temporal structure, visual discrimination is selectively enhanced (Zilber et al., 2014). Multisensory interactions are ubiquitous in cortex, and it has been suggested that some sensory cortices may be supramodal, i.e., capable of functional selectivity irrespective of the sensory modality of the inputs. In this MEG study, the team showed that supramodal processing optimizes visual perceptual learning by capitalizing on invariant representations constructed on the basis of temporal congruence across sensory modalities. Two main drivers of this effect were the prefrontal cortices and the visual motion area MT. These results support current efforts toward feature mapping across sensory modalities.

Check out the article:

Zilber, N., Ciuciu, P., Gramfort, A., Azizi, L., & Van Wassenhove, V. (2014). Supramodal processing optimizes visual perceptual learning and plasticity. NeuroImage, 93, 32–46.

Attached as: 2014_Zilber_etal_NIMG-2014

http://www.sciencedirect.com/science/article/pii/S1053811914001165

doi:10.1016/j.neuroimage.2014.02.017