Journal of Cognitive Neuroscience
-
During speech communication, visual information may interact with the auditory system at various processing stages. Most notably, recent magnetoencephalography (MEG) data provided the first evidence for early and preattentive phonetic/phonological encoding of the visual data stream--prior to its fusion with auditory phonological features [Hertrich, I., Mathiak, K., Lutzenberger, W., & Ackermann, H. Time course of early audiovisual interactions during speech and non-speech central-auditory processing: An MEG study. ⋯ Using functional magnetic resonance imaging, the present follow-up study aims to further elucidate the topographic distribution of visual-phonological operations and audiovisual (AV) interactions during speech perception. Ambiguous acoustic syllables--disambiguated to /pa/ or /ta/ by the visual channel (a speaking face)--served as test materials, together with various control conditions (nonspeech AV signals, visual-only and acoustic-only speech, and nonspeech stimuli). (i) Visual speech yielded an AV-subadditive activation of the primary auditory cortex and the anterior superior temporal gyrus (STG), whereas the posterior STG responded to both speech and nonspeech motion. (ii) The inferior frontal and fusiform gyri of the right hemisphere showed a strong phonetic/phonological impact (differential effects of visual /pa/ vs. /ta/) on hemodynamic activation during the presentation of speaking faces. Taken together with the previous MEG data, these results point to a dual-pathway model of visual speech information processing: On the one hand, access to the auditory system via the anterior supratemporal "what" path may give rise to direct activation of "auditory objects." On the other hand, visual speech information seems to be represented in a right-hemisphere visual working memory, providing a potential basis for later interactions with auditory information such as the McGurk effect.
-
Meta-Analysis
Neuroimaging support for discrete neural correlates of basic emotions: a voxel-based meta-analysis.
What is the basic structure of emotional experience and how is it represented in the human brain? One highly influential theory, discrete basic emotions, proposes a limited set of basic emotions such as happiness and fear, which are characterized by unique physiological and neural profiles. Although many studies using diverse methods have linked particular brain structures with specific basic emotions, evidence from individual neuroimaging studies and from neuroimaging meta-analyses has been inconclusive regarding whether basic emotions are associated with both consistent and discriminable regional brain activations. We revisited this question, using activation likelihood estimation (ALE), which allows spatially sensitive, voxelwise statistical comparison of results from multiple studies. ⋯ Each of the emotions examined (fear, anger, disgust, sadness, and happiness) was characterized by consistent neural correlates across studies, as defined by reliable correlations with regional brain activations. In addition, the activation patterns associated with each emotion were discrete (discriminable from the other emotions in pairwise contrasts) and overlapped substantially with structure-function correspondences identified using other approaches, providing converging evidence that discrete basic emotions have consistent and discriminable neural correlates. Complementing prior studies that have demonstrated neural correlates for the affective dimensions of arousal and valence, the current meta-analysis results indicate that the key elements of basic emotion views are reflected in neural correlates identified by neuroimaging studies.
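The abstract above relies on activation likelihood estimation (ALE), in which each study's reported activation foci are modeled as 3D Gaussian probability distributions and the per-study maps are combined voxelwise into a probabilistic union. The following is a minimal Python sketch of that core computation only; the grid size, the `sigma` smoothing parameter, and the function names are illustrative assumptions, not the published ALE implementation (which additionally uses sample-size-dependent kernels and permutation-based inference).

```python
# Minimal ALE sketch (illustrative assumptions throughout; this is not
# the published ALE algorithm, which also scales kernels by sample size
# and assesses significance by permutation testing).
import numpy as np


def modeled_activation_map(foci, shape=(20, 20, 20), sigma=2.0):
    """Modeled-activation (MA) map for one study: at each voxel, the
    maximum over 3D Gaussian kernels centered on the study's foci."""
    grid = np.stack(
        np.meshgrid(*[np.arange(s) for s in shape], indexing="ij"),
        axis=-1,
    ).astype(float)                      # (x, y, z, 3) voxel coordinates
    ma = np.zeros(shape)
    for focus in foci:
        d2 = np.sum((grid - np.asarray(focus, float)) ** 2, axis=-1)
        ma = np.maximum(ma, np.exp(-d2 / (2.0 * sigma**2)))
    return ma


def ale_map(per_study_foci, shape=(20, 20, 20), sigma=2.0):
    """Voxelwise ALE values: the probabilistic union of per-study MA
    maps, ALE = 1 - prod_i (1 - MA_i)."""
    not_active = np.ones(shape)
    for foci in per_study_foci:
        not_active *= 1.0 - modeled_activation_map(foci, shape, sigma)
    return 1.0 - not_active


# Two hypothetical studies reporting foci in voxel coordinates:
ale = ale_map([[(10, 10, 10)], [(10, 10, 10), (5, 5, 5)]])
```

Voxels near foci reported consistently across studies accumulate high ALE values; in the full procedure these values are then thresholded against a null distribution to identify reliably activated regions.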
-
Faces expressing fear may attract attention in an automatic, bottom-up fashion. Here we address this issue with magnetoencephalographic (MEG) recordings in subjects performing a demanding visual search combined with the presentation of irrelevant neutral or fearful faces. The impact of the faces' emotional expression on attentional selection was assessed by analyzing the N2pc component--a modulation of the event-related magnetic field response known to reflect attentional focusing in visual search. ⋯ Behavioral performance was, however, significantly influenced, suggesting that for behavioral effects to appear, sufficient attentional resources must be left unoccupied by the search task--a notion put forward by perceptual load theory. We take our observations to indicate that irrelevant fearful faces influence attentional processing in extrastriate visual cortex automatically and independently of other task-relevant attentional operations. However, this may not necessarily be echoed at the behavioral level as long as task-relevant selection operations exhaust attentional resources.
-
When a single flash of light is presented interposed between two brief auditory stimuli separated by 60-100 msec, subjects typically report perceiving two flashes [Shams, L., Kamitani, Y., & Shimojo, S. Visual illusion induced by sound. Brain Research, Cognitive Brain Research, 14, 147-152, 2002; Shams, L., Kamitani, Y., & Shimojo, S. ⋯ The polarity of the early PD110/PD120 component did not differ as a function of the visual field (upper vs. lower) of stimulus presentation. This, together with the source localization of the component, suggests that its principal generator lies in extrastriate visual cortex. These results indicate that the neural processes previously shown to be associated with the extra-flash illusion can be modulated by attention, and thus are not the product of a wholly automatic cross-modal integration process.
-
Hippocampal activity is modulated during episodic memory retrieval. Most consistently, a relative increase in activity during confident retrieval is observed. Dorsolateral prefrontal cortex (DLPFC) is also activated during retrieval, but may be more generally activated during cognitive-control processes. ⋯ During the "recall-classify" task, anterior hippocampal activity was selectively reduced relative to "classify" and baseline tasks, and this activity was inversely correlated with DLPFC. Reaction time was positively correlated with DLPFC activation and default-network/hippocampal suppression. The findings demonstrate that frontal and hippocampal activity are dissociated during difficult episodic retrieval tasks and reveal important considerations for interpreting hippocampal activity associated with successful episodic retrieval.