Lip-reading is crucial for understanding speech in challenging conditions. Neuroimaging investigations have revealed that lip-reading activates auditory cortices in individuals covertly repeating absent (but known) speech. However, in real life, one usually has no detailed information about the content of upcoming speech. Here we show that during silent lip-reading of unknown speech, activity in auditory cortices entrains more to the absent speech than to the seen lip movements at frequencies below 1 Hz. This entrainment to absent speech was characterised by a speech-to-brain delay of 50–100 ms, as when actually listening to speech. We also observed entrainment to lip movements at the same low frequency in the right angular gyrus, an area involved in processing biological motion. These findings demonstrate that the brain can synthesise high-level features of absent, unknown speech sounds from lip-reading, which can facilitate the processing of the auditory input. Such a synthesis process may help explain well-documented bottom-up perceptual effects.

In everyday situations, seeing the speaker's articulatory mouth gestures, here referred to as lip-reading, can help us decode the speech signal.1 Human sensitivity to lip movements is so remarkable that even the auditory cortex seems to react to this visual input. In fact, seeing silent lip movements articulating simple speech sounds such as vowels or elementary words can activate the auditory cortex when participants are fully aware of what the absent auditory input should be.2,3 Thus, if prior knowledge about the absent sound is available, auditory information can be internally activated. However, in real life, we usually have no detailed prior knowledge about what a speaker is going to say. Here, we tested the hypothesis that silent lip-reading provides sufficient information for the brain to synthesise relevant features of the absent speech.

When listening to natural continuous speech sounds, oscillatory cortical activity synchronises with the speech temporal structure in a remarkably faithful way.4–8 Such "speech entrainment" originates mainly from auditory cortices at frequencies matching phrasal (below 1 Hz) and syllabic (4–8 Hz) rates, and is thought to be essential for speech comprehension.4,5,7,9 Electroencephalography data have provided initial evidence that lip-reading might entrain cortical activity at the syllable rate.10 However, it is unknown which brain areas were involved in this process, because Crosse and colleagues10 relied on the scalp distribution of the estimated entrainment. Most importantly, the reported syllable-level entrainment occurred only when detailed knowledge of the absent speech content had been acquired through multiple repetitions of the same stimulus. It is therefore unclear whether entrainment was driven by lip-reading, by covert production or repetition of the speech segment, by top-down lexical and semantic processes, or by a combination of these factors. Hence, the critical question we addressed here is how the brain leverages lip-reading to support speech processing in everyday situations where listeners do not know the content of the continuous speech signal. We investigated entrainment to absent speech sounds while human adults watched silent videos of a speaker telling an unknown story (see Fig.
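The two quantities at the heart of these findings, frequency-specific coupling between a speech envelope and brain activity, and a speech-to-brain delay, can be illustrated with a toy simulation. The sketch below is not the study's actual MEG analysis pipeline: the sampling rate, the 75 ms delay, the noise level, and the filter settings are all illustrative assumptions. It low-pass filters noise to mimic a slow (below 1 Hz) speech envelope, builds a delayed, noisy "brain" response, then estimates frequency-resolved coherence and recovers the delay by cross-correlation.

```python
import numpy as np
from scipy.signal import butter, filtfilt, coherence

rng = np.random.default_rng(0)
fs = 200                              # sampling rate in Hz (illustrative)
t = np.arange(0, 300, 1 / fs)         # 5 minutes of simulated signal

# Toy "speech envelope": white noise low-pass filtered below 1 Hz,
# mimicking the slow phrasal fluctuations of natural speech.
b, a = butter(4, 1.0, btype="low", fs=fs)
envelope = filtfilt(b, a, rng.standard_normal(t.size))

# Toy "brain" signal: the envelope delayed by 75 ms (within the reported
# 50-100 ms speech-to-brain range) plus unrelated background activity.
delay_ms = 75
shift = int(fs * delay_ms / 1000)
brain = np.roll(envelope, shift) + 0.3 * rng.standard_normal(t.size)

# Frequency-resolved entrainment: magnitude-squared coherence between
# the speech envelope and the brain signal.
freqs, coh = coherence(envelope, brain, fs=fs, nperseg=10 * fs)

# Speech-to-brain delay estimated as the lag of peak cross-correlation.
lags = np.arange(-fs, fs + 1)         # search within +/- 1 s
xcorr = [np.corrcoef(envelope, np.roll(brain, -lag))[0, 1] for lag in lags]
best_delay_ms = 1000 * lags[int(np.argmax(xcorr))] / fs
```

In this simulation, coherence is high only below 1 Hz, where the envelope carries power, and the cross-correlation peak falls near the imposed 75 ms delay. The real study derived such measures from source-reconstructed MEG signals, which this sketch does not attempt to reproduce.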