Executive Summary: Speech comprehension is a complex process that involves parsing the signal at multiple linguistic levels, including phonemes, syllables, and words. In difficult listening conditions, such as noisy environments, speech comprehension can be supported by other sensory modalities, in particular the visual cues provided by the speaker's lip movements. Recent neuroimaging studies have highlighted the role of frontal and parietal regions, and of integrative sites in the superior temporal sulcus (STS), in multisensory speech perception. However, how the whole-brain network operates during multisensory processing, and how the audio-visual integration window can be modulated by external perturbations using transcranial electrical stimulation (tES), remain open questions. The neural integration of different sensory streams relies on several processes, including the neural tracking of rhythms in multisensory signals. The Individual Alpha Frequency (IAF), the peak frequency within the alpha band (8-12 Hz), is hypothesized to set the temporal window of subject-specific audio-visual integration. A faster or slower IAF could modulate the perception of multisensory illusions, but the relationship between IAF and changes in subjective illusory perception, and in its variability, has not been clearly established.
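To make the IAF hypothesis concrete: the IAF is typically estimated as the peak of the EEG power spectrum within the alpha band, and one alpha cycle (about 100 ms at 10 Hz) is often taken as a candidate duration for the audio-visual integration window. The following is a minimal Python sketch of such an estimate using SciPy's Welch method; the synthetic signal, sampling rate, and `estimate_iaf` helper are illustrative assumptions, not part of the proposed protocol.

```python
import numpy as np
from scipy.signal import welch

def estimate_iaf(eeg: np.ndarray, fs: float, band=(8.0, 12.0)) -> float:
    """Estimate the Individual Alpha Frequency (IAF) as the peak of the
    power spectral density within the alpha band (default 8-12 Hz)."""
    # Welch PSD with ~2 s segments gives ~0.5 Hz frequency resolution.
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    # Restrict the spectrum to the alpha band and locate its peak.
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(psd[mask])]

# Example: 60 s of synthetic eyes-closed EEG with a 10.2 Hz alpha rhythm.
fs = 250.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10.2 * t) + 0.5 * rng.standard_normal(t.size)
iaf = estimate_iaf(eeg, fs)
print(f"IAF ~ {iaf:.2f} Hz; one alpha cycle ~ {1000 / iaf:.0f} ms "
      f"(a candidate audio-visual integration window)")
```

In practice, IAF is computed from resting-state recordings over posterior electrodes, where the alpha rhythm is most prominent, rather than from a single raw channel as in this sketch.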
These findings will have important implications for multisensory research, particularly for aging and congenitally blind populations, and for applications in Brain-Computer Interfacing (BCI), neurostimulation, future auditory prostheses, EEG-based brain-state-dependent neuroplasticity, and the study of brain networks involved in memory and attention-guided speech processing.