Perception often benefits from information beyond that provided by the primary, task-relevant features. Perceptual performance in a visual task can be enhanced by simultaneous multisensory information, but also by a symbolic or amodal cue inducing a specific expectation. That similar benefits can arise from multisensory information and from within-modality expectation raises the question of whether the underlying neurophysiological processes are the same or distinct. We investigated this by comparing the influence of the following three types of auxiliary probabilistic cues on visual motion discrimination in humans: (1) acoustic motion, (2) a premotion visual symbolic cue, and (3) a postmotion symbolic cue. Using multivariate analysis of the EEG data, we show that both the multisensory and the preceding visual symbolic cue enhance the encoding of visual motion direction, as reflected by cerebral activity arising from occipital regions ∼200–400 ms post-stimulus onset. This suggests a common or overlapping physiological correlate of cross-modal and intramodal auxiliary information, pointing to a neural mechanism susceptible to both multisensory and more abstract probabilistic cues. We also asked how prestimulus activity shapes the cue–stimulus combination and found a differential influence on the cross-modal and intramodal combination: while alpha power modulated the relative weight of visual motion and the acoustic cue, it did not modulate the behavioral influence of a visual symbolic cue, pointing to differences in how prestimulus activity shapes the combination of multisensory and abstract cues with task-relevant information.