Temporal attention and comodulation in multisensory causal inference
Perception relies on inferences about the causal structure of the world drawn from multiple sensory inputs. In ecological settings, multisensory events that cohere in time and space benefit such inferential processes: hearing and seeing a speaker enhances speech comprehension, and the acoustic changes of flapping wings naturally pace the visual scene of a crowded flock of birds. Using psychophysical and magnetoencephalographic (MEG) studies, I will illustrate how the human brain may synthesize and represent multisensory temporal comodulation in an abstract form, which may in turn benefit, as top-down predictions, the analysis of incoming sensory signals. For instance, one recent finding (under review) shows the emergence of large-scale synchronization in the high-gamma (60-120 Hz) and beta (15-30 Hz) bands after exposure to temporally comodulated multisensory signals. The engagement of prefrontal, parietal, and visual cortices suggests that even a brief experience of temporally comodulated stimuli conveying the same information may functionally re-route unisensory processing.