The Handbook of Speech Perception

Alsius et al., 2005). Finally, there is evidence that has been interpreted to demonstrate that integration is not complete. For example, when subjects are asked to shadow a McGurk‐effect stimulus (e.g. responding ada when presented with audio /aba/ and visual /aga/), their shadowed ada response will show articulatory remnants of the individual audio (/aba/) and video (/aga/) components (Gentilucci & Cattaneo, 2005).

      In principle, all of these findings are inconsistent with a supramodal account. While we have provided alternative interpretations of these findings both in the current paper and elsewhere (e.g. Rosenblum, 2019), it is clear that more research is needed to test the viability of the supramodal account.

      1 Alsius, A., Navarra, J., Campbell, R., & Soto‐Faraco, S. (2005). Audiovisual integration of speech falters under high attention demands. Current Biology, 15(9), 839–843.

      2 Alsius, A., Navarra, J., & Soto‐Faraco, S. (2007). Attention to touch weakens audiovisual speech integration. Experimental Brain Research, 183(3), 399–404.

      3 Alsius, A., Paré, M., & Munhall, K. G. (2017). Forty years after Hearing lips and seeing voices: The McGurk effect revisited. Multisensory Research, 31(1–2), 111–144.

      4 Altieri, N., Pisoni, D. B., & Townsend, J. T. (2011). Some behavioral and neurobiological constraints on theories of audiovisual speech integration: A review and suggestions for new directions. Seeing and Perceiving, 24(6), 513–539.

      5 Andersen, T. S., Tiippana, K., Laarni, J., et al. (2009). The role of visual spatial attention in audiovisual speech perception. Speech Communication, 51(2), 184–193.

      6 Arnal, L. H., Morillon, B., Kell, C. A., & Giraud, A. L. (2009). Dual neural routing of visual facilitation in speech processing. Journal of Neuroscience, 29(43), 13445–13453.

      7 Arnold, P., & Hill, F. (2001). Bisensory augmentation: A speechreading advantage when speech is clearly audible and intact. British Journal of Psychology, 92(2), 339–355.

      8 Auer, E. T., Bernstein, L. E., Sungkarat, W., & Singh, M. (2007). Vibrotactile activation of the auditory cortices in deaf versus hearing adults. Neuroreport, 18(7), 645–648.

      9 Barker, J. P., & Berthommier, F. (1999). Evidence of correlation between acoustic and visual features of speech. In J. J. Ohala, Y. Hasegawa, M. Ohala, et al. (Eds), Proceedings of the XIVth International Congress of Phonetic Sciences (pp. 5–9). Berkeley: University of California.

      10 Barutchu, A., Crewther, S. G., Kiely, P., et al. (2008). When /b/ill with /g/ill becomes /d/ill: Evidence for a lexical effect in audiovisual speech perception. European Journal of Cognitive Psychology, 20(1), 1–11.

      11 Baum, S., Martin, R. C., Hamilton, A. C., & Beauchamp, M. S. (2012). Multisensory speech perception without the left superior temporal sulcus. Neuroimage, 62(3), 1825–1832.

      12 Baum, S. H., & Beauchamp, M. S. (2014). Greater BOLD variability in older compared with younger adults during audiovisual speech perception. PLOS ONE, 9(10), 1–10.

      13 Beauchamp, M. S., Nath, A. R., & Pasalar, S. (2010). fMRI‐guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect. Journal of Neuroscience, 30(7), 2414–2417.

      14 Bernstein, L. E., Auer, E. T., Jr., Eberhardt, S. P., & Jiang, J. (2013). Auditory perceptual learning for speech perception can be enhanced by audiovisual training. Frontiers in Neuroscience, 7, 1–16.

      15 Bernstein, L. E., Auer, E. T., Jr., & Moore, J. K. (2004). Convergence or association? In G. A. Calvert, C. Spence, & B. E. Stein (Eds), Handbook of multisensory processes (pp. 203–220). Cambridge, MA: MIT Press.

      16 Bernstein, L. E., Auer, E. T., Jr., & Takayanagi, S. (2004). Auditory speech detection in noise enhanced by lipreading. Speech Communication, 44(1–4), 5–18.

      17 Bernstein, L. E., Eberhardt, S. P., & Auer, E. T. (2014). Audiovisual spoken word training can promote or impede auditory‐only perceptual learning: Prelingually deafened adults with late‐acquired cochlear implants versus normal hearing adults. Frontiers in Psychology, 5, 1–20.

      18 Bernstein, L. E., Jiang, J., Pantazis, D., et al. (2011). Visual phonetic processing localized using speech and nonspeech face gestures in video and point‐light displays. Human Brain Mapping, 32(10), 1660–1676.

      19 Bertelson, P., & de Gelder, B. (2004). The psychology of multi‐sensory perception. In C. Spence & J. Driver (Eds), Crossmodal space and crossmodal attention (pp. 141–177). Oxford: Oxford University Press.

      20 Bertelson, P., Vroomen, J., Wiegeraad, G., & de Gelder, B. (1994). Exploring the relation between McGurk interference and ventriloquism. In Proceedings of the Third International Congress on Spoken Language Processing (pp. 559–562). Yokohama: Acoustical Society of Japan.

      21 Besle, J., Fort, A., Delpuech, C., & Giard, M. H. (2004). Bimodal speech: Early suppressive visual effects in human auditory cortex. European Journal of Neuroscience, 20(8), 2225–2234.

      22 Besle, J., Fischer, C., Bidet‐Caulet, A., et al. (2008). Visual activation and audiovisual interactions in the auditory cortex during speech perception: Intracranial recordings in humans. Journal of Neuroscience, 28(52), 14301–14310.

      23 Bishop, C. W., & Miller, L. M. (2011). Speech cues contribute to audiovisual spatial integration. PLOS ONE, 6(8), e24016.

      24 Borrie, S. A., McAuliffe, M. J., Liss, J. M., et al. (2013). The role of linguistic and indexical information in improved recognition of dysarthric speech. Journal of the Acoustical Society of America, 133(1), 474–482.

      25 Brancazio, L. (2004). Lexical influences in audiovisual speech perception. Journal of Experimental Psychology: Human Perception and Performance, 30(3), 445–463.

      26 Brancazio, L., Best, C. T., & Fowler, C. A. (2006). Visual influences on perception of speech and nonspeech vocal‐tract events. Language and Speech, 49(1), 21–53.

      27 Brancazio, L., & Miller, J. L. (2005). Use of visual information in speech perception: Evidence for a visual rate effect both with and without a McGurk effect. Attention, Perception, & Psychophysics, 67(5), 759–769.

      28 Brancazio, L., Miller, J. L., & Paré, M. A. (2003). Visual influences on the internal structure of phonetic categories. Perception & Psychophysics, 65(4), 591–601.

      29 Brown, V., Hedayati, M., Zanger, A., et al. (2018). What accounts for individual differences in susceptibility to the McGurk effect? PLOS ONE, 13(11), e0207160.

      30 Burnham, D., Ciocca, V., Lauw, C., et al. (2000). Perception of visual information for Cantonese tones. In M. Barlow & P. Rose (Eds), Proceedings of the Eighth Australian International Conference on Speech Science and Technology (pp. 86–91). Canberra: Australian Speech Science and Technology Association.

      31 Burnham, D. K., & Dodd, B. (2004). Auditory–visual speech integration by prelinguistic infants: Perception of an emergent consonant in the McGurk effect. Developmental Psychobiology, 45(4), 204–220.

      32 Callan, D. E., Callan, A. M., Kroos, C., & Vatikiotis‐Bateson, E. (2001). Multimodal contribution to speech perception revealed by independent component analysis: A single sweep EEG case study. Cognitive Brain Research, 10(3), 349–353.

      33 Callan, D. E., Jones, J. A., & Callan, A. (2014).
