The Handbook of Speech Perception


      34 Callan, D. E., Jones, J. A., Callan, A. M., & Akahane‐Yamada, R. (2004). Phonetic perceptual identification by native‐ and second‐language speakers differentially activates brain regions involved with acoustic phonetic processing and those involved with articulatory–auditory/orosensory internal models. NeuroImage, 22(3), 1182–1194.

      35 Callan, D. E., Jones, J. A., Munhall, K., et al. (2003). Neural processes underlying perceptual enhancement by visual speech gestures. Neuroreport, 14(17), 2213–2218.

      36 Calvert, G. A., Bullmore, E. T., Brammer, M. J., et al. (1997). Activation of auditory cortex during silent lipreading. Science, 276(5312), 593–596.

      37 Campbell, R. (2011). Speechreading: What’s missing? In A. Calder (Ed.), Oxford handbook of face perception (pp. 605–630). Oxford: Oxford University Press.

      38 Chandrasekaran, C., Trubanova, A., Stillittano, S., et al. (2009). The natural statistics of audiovisual speech. PLOS Computational Biology, 5(7), 1–18.

      39 Cienkowski, K. M., & Carney, A. E. (2002). Auditory–visual speech perception and aging. Ear and Hearing, 23, 439–449.

      40 Colin, C., Radeau, M., Deltenre, P., et al. (2002). The role of sound intensity and stop‐consonant voicing on McGurk fusions and combinations. European Journal of Cognitive Psychology, 14, 475–491.

      41 Connine, C. M., & Clifton, C., Jr. (1987). Interactive use of lexical information in speech perception. Journal of Experimental Psychology: Human Perception and Performance, 13(2), 291–299.

      42 Danielson, D. K., Bruderer, A. G., Kandhadai, P., et al. (2017). The organization and reorganization of audiovisual speech perception in the first year of life. Cognitive Development, 42, 37–48.

      43 D’Ausilio, A., Bartoli, E., Maffongelli, L., et al. (2014). Vision of tongue movements bias auditory speech perception. Neuropsychologia, 63, 85–91.

      44 Delvaux, V., Huet, K., Piccaluga, M., & Harmegnies, B. (2018). The perception of anticipatory labial coarticulation by blind listeners in noise: A comparison with sighted listeners in audio‐only, visual‐only and audiovisual conditions. Journal of Phonetics, 67, 65–77.

      45 Derrick, D., & Gick, B. (2013). Aerotactile integration from distal skin stimuli. Multisensory Research, 26(5), 405–416.

      46 Desjardins, R. N., & Werker, J. F. (2004). Is the integration of heard and seen speech mandatory for infants? Developmental Psychobiology, 45(4), 187–203.

      47 Dias, J. W., & Rosenblum, L. D. (2011). Visual influences on interactive speech alignment. Perception, 40, 1457–1466.

      48 Dias, J. W., & Rosenblum, L. D. (2016). Visibility of speech articulation enhances auditory phonetic convergence. Attention, Perception, & Psychophysics, 78, 317–333.

      49 Diehl, R. L., & Kluender, K. R. (1989). On the objects of speech perception. Ecological Psychology, 1, 121–144.

      50 Dorsi, J., Rosenblum, L. D., Dias, J. W., & Ashkar, D. (2016). Can audio‐haptic speech be used to train better auditory speech perception? Journal of the Acoustical Society of America, 139(4), 2016–2017.

      51 Dorsi, J., Rosenblum, L. D., & Ostrand, R. (2017). What you see isn’t always what you get, or is it? Reexamining semantic priming from McGurk stimuli. Poster presented at the 58th Meeting of the Psychonomic Society, Vancouver, Canada, November 10.

      52 Eberhardt, S. P., Auer, E. T., & Bernstein, L. E. (2014). Multisensory training can promote or impede visual perceptual learning of speech stimuli: Visual‐tactile vs. visual‐auditory training. Frontiers in Human Neuroscience, 8, 1–23.

      53 Eskelund, K., MacDonald, E. N., & Andersen, T. S. (2015). Face configuration affects speech perception: Evidence from a McGurk mismatch negativity study. Neuropsychologia, 66, 48–54.

      54 Eskelund, K., Tuomainen, J., & Andersen, T. S. (2011). Multistage audiovisual integration of speech: Dissociating identification and detection. Experimental Brain Research, 208(3), 447–457.

      55 Fingelkurts, A. A., Fingelkurts, A. A., Krause, C. M., et al. (2003). Cortical operational synchrony during audio–visual speech integration. Brain and Language, 85(2), 297–312.

      56 Fowler, C. A. (1986). An event approach to the study of speech perception from a direct‐realist perspective. Journal of Phonetics, 14, 3–28.

      57 Fowler, C. A. (2004). Speech as a supramodal or amodal phenomenon. In G. Calvert, C. Spence, & B. E. Stein (Eds), Handbook of multisensory processes (pp. 189–201). Cambridge, MA: MIT Press.

      58 Fowler, C. A. (2010). Embodied, embedded language use. Ecological Psychology, 22(4), 286–303.

      59 Fowler, C. A., Brown, J. M., & Mann, V. A. (2000). Contrast effects do not underlie effects of preceding liquids on stop‐consonant identification by humans. Journal of Experimental Psychology: Human Perception and Performance, 26(3), 877–888.

      60 Fowler, C. A., & Dekle, D. J. (1991). Listening with eye and hand: Cross‐modal contributions to speech perception. Journal of Experimental Psychology: Human Perception and Performance, 17(3), 816–828.

      61 Fuster‐Duran, A. (1996). Perception of conflicting audio‐visual speech: An examination across Spanish and German. In D. G. Stork & M. E. Hennecke (Eds), Speechreading by humans and machines (pp. 135–143). Berlin: Springer.

      62 Ganong, W. F. (1980). Phonetic categorization in auditory word perception. Journal of Experimental Psychology: Human Perception and Performance, 6(1), 110–125.

      63 Gentilucci, M., & Cattaneo, L. (2005). Automatic audiovisual integration in speech perception. Experimental Brain Research, 167(1), 66–75.

      64 Ghazanfar, A. A., Maier, J. X., Hoffman, K. L., & Logothetis, N. K. (2005). Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. Journal of Neuroscience, 25(20), 5004–5012.

      65 Gibson, J. J. (1966). The senses considered as perceptual systems. Boston: Houghton Mifflin.

      66 Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.

      67 Gick, B., & Derrick, D. (2009). Aero‐tactile integration in speech perception. Nature, 462(7272), 502–504.

      68 Gick, B., Jóhannsdóttir, K. M., Gibraiel, D., & Mühlbauer, J. (2008). Tactile enhancement of auditory and visual speech perception in untrained perceivers. Journal of the Acoustical Society of America, 123(4), 72–76.

      69 Gordon, P. C. (1997). Coherence masking protection in speech sounds: The role of formant synchrony. Perception & Psychophysics, 59, 232–242.

      70 Grant, K. W. (2001). The effect of speechreading on masked detection thresholds for filtered speech. Journal of the Acoustical Society of America, 109(5), 2272–2275.

      71 Grant, K. W., & Seitz, P. F. (1998). Measures of auditory‐visual integration in nonsense syllables and sentences. Journal of the Acoustical Society of America, 104, 2438–2450.

      72 Grant, K. W., & Seitz, P. F. P. (2000). The use of visible speech cues for improving auditory detection of spoken sentences. Journal of the Acoustical Society of America, 108(3), 1197–1208.

      73 Green, K. P., & Gerdeman, A. (1995). Cross‐modal discrepancies
