Early Integration Processing between Faces and Vowel Sounds in Human Brain: An MEG Investigation

Itta Nakamura, Yoji Hirano, Naotoshi Ohara, Shogo Hirano, Takefumi Ueno, Rikako Tsuchimoto, Shigenobu Kanba, Toshiaki Onitsuka

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


Objective: Fast, unconscious integration of face and voice information is a crucial brain function for communicating effectively with others. Here, we investigated evidence for rapid face-voice integration in the auditory cortex. Methods: Magnetic fields (P50m and N100m) evoked by visual stimuli (V), auditory stimuli (A) and audiovisual stimuli (AV), i.e. face, vowel and simultaneous face-vowel stimuli, were recorded in 22 healthy subjects. Magnetoencephalographic data from 28 channels around the bilateral auditory cortices were analyzed. Results: In both hemispheres, AV - V showed significantly larger P50m amplitudes than A. Additionally, compared with A, the N100m amplitudes and dipole moments of AV - V were significantly smaller in the left hemisphere, but not in the right hemisphere. Conclusions: The differential changes in P50m (bilateral) and N100m (left hemisphere) that occur when V (faces) is paired with A (vowel sounds) indicate that audiovisual (face-voice) integration occurs at an early processing stage, likely underpinning effective everyday communication.

Original language: English
Pages (from-to): 187-195
Number of pages: 9
Issue number: 4
Publication status: Published - Sept 1 2015

All Science Journal Classification (ASJC) codes

  • Neuropsychology and Physiological Psychology
  • Psychiatry and Mental health
  • Biological Psychiatry

