Journal of Cognitive Neuroscience
-
To understand the meanings of words and objects, we need both knowledge about the items themselves and executive mechanisms that compute and manipulate semantic information in a task-appropriate way. The neural basis of semantic control remains controversial. Neuroimaging studies have focused on the role of the left inferior frontal gyrus (LIFG), whereas neuropsychological research suggests that damage to a widely distributed network elicits impairments of semantic control. ⋯ The results revealed that LIFG and pMTG jointly support both the controlled retrieval and the selection of semantic knowledge. IPS participates specifically in semantic selection and also responds to manipulations of nonsemantic control demands. These observations are consistent with a large-scale semantic control network, as predicted by lesion data, that draws on both semantic-specific (LIFG and pMTG) and domain-independent executive (IPS) components.
-
Empathy is a critical aspect of human emotion that influences the behavior of individuals as well as the functioning of society. Although empathy is fundamentally a subjective experience, no studies have yet examined the neural correlates of the self-reported experience of empathy. Furthermore, although behavioral research has linked empathy to prosocial behavior, no work has yet connected empathy-related neural activity to everyday, real-world helping behavior. ⋯ Self-report of empathic experience and activity in empathy-related areas, notably MPFC, were higher in the empathize condition than in the load condition, suggesting that empathy is not a fully automatic experience. Additionally, high trait empathy participants displayed greater experienced empathy and stronger MPFC responses than low trait empathy individuals under cognitive load, suggesting that empathy is more automatic for individuals high in trait empathy. These results underline the critical role that MPFC plays in the instantiation of empathic experience and consequent behavior.
-
The vast majority of word meanings are learned simply by extracting them from context rather than by rote memorization or explicit instruction. Although this skill is remarkable, little is known about the brain mechanisms involved. In the present study, ERPs were recorded as participants read stories in which pseudowords were presented multiple times, embedded in consistent, meaningful contexts (referred to as meaning condition, M+) or inconsistent, meaningless contexts (M-). ⋯ In contrast, during the explicit recognition task, M+ words showed a robust N400 effect. The N400 effect was dependent upon recognition performance, such that only correctly recognized M+ words elicited an N400. This pattern of results provides evidence that the explicit representations of word meanings can develop rapidly, whereas implicit representations may require more extensive exposure or more time to emerge.
-
This study assessed the impact of serotonin transporter genotype (5-HTTLPR) on regional responses to emotional faces in the amygdala and subgenual cingulate cortex (sgACC) while subjects performed a gender discrimination task. Although we found no evidence for greater amygdala reactivity or reduced amygdala-sgACC coupling in short variant 5-HTTLPR homozygotes (s/s), we observed an interaction between genotype and emotion in sgACC. Only long variant homozygotes (la/la) exhibited subgenual deactivation to fearful versus neutral faces, whereas the effect in s/s subjects was in the opposite direction. ⋯ A., et al. Reciprocal limbic-cortical function and negative mood: Converging PET findings in depression and normal sadness. Am J Psychiatry, 156, 675-682, 1999].
-
The repetition of nociceptive stimuli of identical modality, intensity, and location at short and constant interstimulus intervals (ISIs) induces strong habituation of the corresponding EEG responses without affecting the subjective perception of pain. To understand what drives this response habituation, we (i) examined the effect of introducing a change in the modality of the repeated stimulus, and (ii) dissected the relative contributions of bottom-up, stimulus-driven changes in modality and of top-down, cognitive expectations of such a change to both laser-evoked and auditory-evoked EEG responses. Multichannel EEG was recorded while participants received trains of three stimuli (S1-S2-S3, a triplet) delivered to the hand dorsum at a 1-sec ISI. ⋯ We found that introducing a change in stimulus modality produced a significant dishabituation of the laser-evoked N1, N2, and P2 waves; the auditory N1 and P2 waves; and the laser- and auditory-induced event-related synchronization and desynchronization. In contrast, the lack of explicit knowledge of a possible change in the sensory modality of the stimulus (i.e., uncertainty) increased only the ascending portion of the laser-evoked and auditory-evoked P2 wave. Altogether, these results indicate that bottom-up novelty resulting from the change of stimulus modality, rather than top-down cognitive expectation, plays the major role in determining the habituation of these brain responses.