Journal of Cognitive Neuroscience
-
Humans commonly understand the unobservable mental states of others by observing their actions. Embodied simulation theories suggest that this ability may be based in areas of the fronto-parietal mirror neuron system, yet neuroimaging studies that explicitly investigate the human ability to draw mental state inferences point to the involvement of a “mentalizing” system consisting of regions that do not overlap with the mirror neuron system. For the present study, we developed a novel action identification paradigm that allowed us to explicitly investigate the neural bases of mentalizing observed actions. ⋯ Although areas of the mirror neuron system did show an enhanced response during action identification, their activity was not significantly modulated by the extent to which observers identified mental states. Instead, several regions of the mentalizing system, including dorsal and ventral aspects of medial pFC, posterior cingulate cortex, and the temporal poles, were associated with mentalizing actions, whereas a single region in left lateral occipito-temporal cortex was associated with mechanizing actions. These data suggest that embodied simulation is insufficient to account for the sophisticated mentalizing that human beings are capable of while observing another person and that a distinct system along the cortical midline and in anterior temporal cortex is involved in mentalizing an observed action.
-
Meta-Analysis
Neuroimaging support for discrete neural correlates of basic emotions: a voxel-based meta-analysis.
What is the basic structure of emotional experience and how is it represented in the human brain? One highly influential theory, discrete basic emotions, proposes a limited set of basic emotions such as happiness and fear, which are characterized by unique physiological and neural profiles. Although many studies using diverse methods have linked particular brain structures with specific basic emotions, evidence from individual neuroimaging studies and from neuroimaging meta-analyses has been inconclusive regarding whether basic emotions are associated with both consistent and discriminable regional brain activations. We revisited this question, using activation likelihood estimation (ALE), which allows spatially sensitive, voxelwise statistical comparison of results from multiple studies. ⋯ Each of the emotions examined (fear, anger, disgust, sadness, and happiness) was characterized by consistent neural correlates across studies, as defined by reliable correlations with regional brain activations. In addition, the activation patterns associated with each emotion were discrete (discriminable from the other emotions in pairwise contrasts) and overlapped substantially with structure-function correspondences identified using other approaches, providing converging evidence that discrete basic emotions have consistent and discriminable neural correlates. Complementing prior studies that have demonstrated neural correlates for the affective dimensions of arousal and valence, the current meta-analysis results indicate that the key elements of basic emotion views are reflected in neural correlates identified by neuroimaging studies.
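To make the ALE method mentioned above more concrete, the following is a minimal, illustrative sketch of the core idea only, not the authors' pipeline or the implementation of any specific toolbox (e.g., GingerALE): each reported activation focus is blurred with a 3-D Gaussian, each study contributes a "modeled activation" map, and the maps are combined voxelwise as a probabilistic union. The function name, grid size, and kernel width below are illustrative assumptions.

    import numpy as np

    def ale_map(foci_per_study, grid_shape, voxel_size_mm=2.0, fwhm_mm=10.0):
        # Sketch of activation likelihood estimation (ALE); parameters are assumed.
        sigma_vox = fwhm_mm / (2.355 * voxel_size_mm)      # FWHM -> sigma, in voxels
        grid = np.indices(grid_shape).reshape(3, -1).T     # (n_voxels, 3) voxel coordinates

        union_complement = np.ones(grid.shape[0])          # running product of (1 - MA_study)
        for foci in foci_per_study:                        # foci: (n_foci, 3) voxel coordinates
            foci = np.asarray(foci, dtype=float)
            d2 = ((grid[:, None, :] - foci[None, :, :]) ** 2).sum(axis=-1)
            per_focus = np.exp(-d2 / (2 * sigma_vox ** 2)) # Gaussian "blob" around each focus
            ma = per_focus.max(axis=1)                     # modeled activation map for this study
            union_complement *= (1.0 - ma)
        # ALE value = probability that at least one study activates the voxel
        return (1.0 - union_complement).reshape(grid_shape)

    # Toy usage with two hypothetical studies on a small grid
    ale = ale_map(
        foci_per_study=[[[10, 12, 8]], [[11, 12, 9], [4, 4, 4]]],
        grid_shape=(20, 20, 16),
    )

In practice, the resulting ALE map is then tested voxelwise against a null distribution of randomly located foci to identify regions of above-chance convergence; that inferential step is omitted from this sketch.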
-
Faces expressing fear may attract attention in an automatic, bottom-up fashion. Here we address this issue with magnetoencephalographic (MEG) recordings in subjects performing a demanding visual search combined with the presentation of irrelevant neutral or fearful faces. The impact of the faces' emotional expression on attentional selection was assessed by analyzing the N2pc component, a modulation of the event-related magnetic field response known to reflect attentional focusing in visual search. ⋯ Behavioral performance was, however, not influenced in a significant manner, suggesting that for behavioral effects to appear, sufficient attentional resources need to be left unoccupied by the search task, a notion put forward by perceptual load theory. Our observations are taken to indicate that irrelevant fearful faces influence attentional processing in extrastriate visual cortex in an automatic fashion, independently of other task-relevant attentional operations. However, this may not necessarily be echoed at the behavioral level as long as task-relevant selection operations exhaust attentional resources.
-
When a single flash of light is interposed between two brief auditory stimuli separated by 60-100 msec, subjects typically report perceiving two flashes [Shams, L., Kamitani, Y., & Shimojo, S. Visual illusion induced by sound. Brain Research, Cognitive Brain Research, 14, 147-152, 2002; Shams, L., Kamitani, Y., & Shimojo, S. ⋯ The polarity of the early PD110/PD120 component did not differ as a function of the visual field (upper vs. lower) of stimulus presentation. This, along with the source localization of the component, suggested that its principal generator lies in extrastriate visual cortex. These results indicate that neural processes previously shown to be associated with the extra-flash illusion can be modulated by attention, and thus are not the result of a wholly automatic cross-modal integration process.
-
Hippocampal activity is modulated during episodic memory retrieval. Most consistently, a relative increase in activity is observed during confident retrieval. Dorsolateral prefrontal cortex (DLPFC) is also activated during retrieval but may be more generally engaged by cognitive-control processes. ⋯ During the "recall-classify" task, anterior hippocampal activity was selectively reduced relative to the "classify" and baseline tasks, and this activity was inversely correlated with DLPFC activity. Reaction time was positively correlated with DLPFC activation and with default-network/hippocampal suppression. These findings demonstrate that frontal and hippocampal activity are dissociated during difficult episodic retrieval tasks and reveal important considerations for interpreting hippocampal activity associated with successful episodic retrieval.