NeuroImage, 2001
Considerable evidence indicates that processing facial expression involves both subcortical (amygdala and basal ganglia) and cortical (occipito-temporal, orbitofrontal, and prefrontal cortex) structures. However, the specificity of these regions for single types of emotion, and for the cognitive demands of expression processing, is still unclear. This functional magnetic resonance imaging (fMRI) study investigated the neural correlates of incidental and explicit processing of the emotional content of faces expressing either disgust or happiness. Subjects were examined while viewing neutral, disgusted, or happy faces. The incidental task required subjects to decide about face gender, the explicit task to decide about face expression. In the control task, subjects were requested to detect a white square in a greyscale mosaic stimulus. Results showed that the left inferior frontal cortex and the bilateral occipito-temporal junction responded equally to all face conditions. Several cortical and subcortical regions were modulated by task type and by facial expression. The right neostriatum and left amygdala were activated when subjects made explicit judgements of disgust, the bilateral orbitofrontal cortex when they made judgements of happiness, and the right frontal and insular cortex when they made judgements about any emotion.
Cognitive Brain Research, 2001
We identified human brain regions involved in the perception of sad, frightened, happy, angry, and neutral facial expressions using functional magnetic resonance imaging (fMRI). Twenty-one healthy right-handed adult volunteers (11 men, 10 women; aged 18-45; mean age 21.6 years) participated in four separate runs, one for each of the four emotions. Participants viewed blocks of emotionally expressive faces alternating with blocks of neutral faces and scrambled images. In comparison with scrambled images, neutral faces activated the fusiform gyri, the right lateral occipital gyrus, the right superior temporal sulcus, the inferior frontal gyri, and the amygdala / entorhinal cortex. In comparisons of emotional and neutral faces, we found that (1) emotional faces elicit increased activation in a subset of cortical regions involved in neutral face processing and in areas not activated by neutral faces; (2) differences in activation as a function of emotion category were most evident in the frontal lobes; (3) men showed a differential neural response depending upon the emotion expressed but women did not.
Brain Research, 2011
Recognition and processing of emotional facial expressions are crucial for social behavior and employ higher-order cognitive and visual working processes. In neuropsychiatric disorders, impaired emotion recognition most frequently concerns three specific emotions, i.e., anger, fear, and disgust. As incorrect processing of (neutral) facial stimuli per se might also underlie…
2012
Degrees of emotional valence and arousal have been shown to covary with blood oxygen level dependent (BOLD) signal levels in several brain structures. Here we studied brain activity in 17 healthy subjects during perception of facial expressions varying in valence and arousal using functional magnetic resonance imaging (fMRI). Our results revealed correlations with the perceived valence in dorsolateral and ventrolateral prefrontal cortex, dorsomedial prefrontal cortex, and anterior insula. These findings corroborate results of our previous study, in which we used pictures of varying valence taken from the International Affective Picture System (IAPS). Together, the results of these two studies suggest the existence of common brain areas processing the valence of both emotional pictures and facial expressions. Additionally, the BOLD signal exhibited a distinctive dependency on perceived valence in the intraparietal sulcus and supramarginal gyrus in the present study. BOLD activity correlated with negative and positive valence in separate cortical areas, and some areas demonstrated either a U-shaped or an inverted U-shaped relationship with valence (i.e., either minimal or maximal activation was observed for neutral expressions). This nonlinear dependency suggests that the brain mechanisms underlying perception of negative and positive valence are at least to some extent independent. Perceived arousal correlated positively with the strength of the BOLD signal only in the left inferior frontal gyrus, which is an important node of the mirror neuron system.
Human Brain Mapping, 2000
The processing of changing nonverbal social signals such as facial expressions is poorly understood, and it is unknown whether different pathways are activated during effortful (explicit), compared to implicit, processing of facial expressions. Thus we used fMRI to determine which brain areas subserve processing of high-valence expressions and whether distinct brain areas are activated when facial expressions are processed explicitly or implicitly. Nine healthy volunteers were scanned (1.5T GE Signa with ANMR, TE/TR 40/3,000 ms) during two similar experiments in which blocks of mixed happy and angry facial expressions ("on" condition) were alternated with blocks of neutral faces (control "off" condition). Experiment 1 examined explicit processing of expressions by requiring subjects to attend to, and judge, facial expression. Experiment 2 examined implicit processing of expressions by requiring subjects to attend to, and judge, facial gender, which was counterbalanced in both experimental conditions. Processing of facial expressions significantly increased regional blood oxygenation level-dependent (BOLD) activity in fusiform and middle temporal gyri, hippocampus, amygdalohippocampal junction, and pulvinar nucleus. Explicit processing evoked significantly more activity in temporal lobe cortex than implicit processing, whereas implicit processing evoked significantly greater activity in the amygdala region. Mixed high-valence facial expressions are processed within temporal lobe visual cortex, thalamus, and amygdalohippocampal complex. Also, the neural substrates for explicit and implicit processing of facial expressions are dissociable: explicit processing activates temporal lobe cortex, whereas implicit processing activates the amygdala region. Our findings confirm a neuroanatomical dissociation between conscious and unconscious processing of emotional information. Hum. Brain Mapping 9: 93-105, 2000.
Journal of Cognitive Neuroscience, 2001
The involvement of the human amygdala in the processing of facial expressions has been investigated in neuroimaging studies, although the neural mechanisms underlying motivated or emotional behavior in response to facial stimuli are not yet fully understood. We investigated, using functional magnetic resonance imaging (fMRI) and healthy volunteers, how the amygdala interacts with other cortical regions while subjects are judging the sex of faces with negative, positive, or neutral emotion. The data were analyzed by a subtractive method; then, to clarify possible interactions among regions within the brain, several kinds of analysis (i.e., a correlation analysis, a psychophysiological interaction analysis, and structural equation modeling) were performed. Overall, significant activation was observed in the bilateral fusiform gyrus, medial temporal lobe, prefrontal cortex, and the right parietal lobe during the task. The results of subtraction between the conditions showed that the left amygdala, right orbitofrontal cortex, and temporal cortices were predominantly involved in the processing of the negative expressions. The right angular gyrus was involved in the processing of the positive expressions when the negative condition was subtracted from the positive condition. The correlation analysis showed that activity in the left amygdala positively correlated with activity in the left prefrontal cortex under the negative minus neutral subtraction condition. The psychophysiological interaction analysis revealed that the neural responses in the left amygdala and the right prefrontal cortex underwent condition-specific changes between the negative and positive face conditions. The right amygdaloid activity also had an interactive effect with activity in the right hippocampus and middle temporal gyrus.
These results suggest that the left and right amygdalae play differential roles in the effective processing of facial expressions in collaboration with other cortical or subcortical regions, with the left being related to the bilateral prefrontal cortex, and the right to the right temporal lobe.
Psychiatry Research-neuroimaging, 1998
We investigated facial recognition memory for previously unfamiliar faces and facial expression perception with functional magnetic resonance imaging (fMRI). Eight healthy, right-handed volunteers participated. For the facial recognition task, subjects made a decision as to the familiarity of each of 50 faces (25 previously viewed; 25 novel). We detected signal increase in the right middle temporal gyrus and left prefrontal cortex during presentation of familiar faces, and in several brain regions, including bilateral posterior cingulate gyri, bilateral insulae, and right middle occipital cortex, during presentation of unfamiliar faces. Standard facial expressions of emotion were used as stimuli in two further tasks of facial expression perception. In the first task, subjects were presented with alternating happy and neutral faces; in the second task, subjects were presented with alternating sad and neutral faces. During presentation of happy facial expressions, we detected a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, medial frontal cortex, and right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. No brain regions showed increased signal intensity during presentation of sad facial expressions. These results provide evidence for a distinction between the neural correlates of facial recognition memory and perception of facial expression but, whilst highlighting the role of limbic structures in perception of happy facial expressions, do not allow the mapping of a distinct neural substrate for perception of sad facial expressions. © 1998 Elsevier Science Ireland Ltd. All rights reserved.
Frontiers in Human Neuroscience, 2013
It is widely assumed that the fusiform face area (FFA), a brain region specialized for face perception, is not involved in processing emotional expressions. This assumption is based on the proposition that the FFA is involved in face identification and only processes features that are invariant across changes due to head movements, speaking, and expressing emotions. The present study tested this proposition by examining whether the response in the human FFA varies across emotional expressions, with functional magnetic resonance imaging and brain decoding analysis techniques (n = 11). A one-vs.-all classification analysis showed that most emotional expressions that participants perceived could be reliably predicted from the neural pattern of activity in the left and right FFA, suggesting that the perception of different emotional expressions recruits partially non-overlapping neural mechanisms. In addition, emotional expressions could also be decoded from the pattern of activity in the early visual cortex (EVC), indicating that retinotopic cortex also shows a differential response to emotional expressions. These results cast doubt on the idea that the FFA is involved in expression-invariant face processing, and instead indicate that emotional expressions evoke partially de-correlated signals throughout occipital and posterior temporal cortex.
Social Cognitive and Affective Neuroscience, 2013
Facial expression perception can be influenced by the natural visual context in which the face is perceived. We performed an fMRI experiment presenting participants with fearful or neutral faces against threatening or neutral background scenes. Triangles and scrambled scenes served as control stimuli. The results showed that the valence of the background influences face-selective activity in the right anterior parahippocampal place area (PPA) and subgenual anterior cingulate cortex (sgACC), with higher activation for neutral backgrounds compared to threatening backgrounds (controlled for isolated background effects), and that this effect correlated with trait empathy in the sgACC. In addition, the left fusiform gyrus (FG) responds to the affective congruence between face and background scene. The results show that the valence of the background modulates face processing and support the hypothesis that empathic processing in the sgACC is inhibited when affective information is present in the background. In addition, the findings reveal a pattern of complex scene perception showing a gradient of functional specialization along the posterior-anterior axis: from sensitivity to the affective content of scenes (extrastriate body area: EBA and posterior PPA), over scene emotion-face emotion interaction (left FG), via category-scene interaction (anterior PPA), to scene-category-personality interaction (sgACC).
Brain, 1999
Using functional neuroimaging, we tested two hypotheses. First, we tested whether the amygdala has a neural response to sad and/or angry facial expressions. Secondly, we tested whether the orbitofrontal cortex has a specific neural response to angry facial expressions. Volunteer subjects were scanned, using PET, while they performed a sex discrimination task involving static grey-scale images…
Neuroscience, 2008
Reading the facial expression of other people is a fundamental skill for social interaction. Human facial expressions of emotions are readily recognized but may also evoke the same experiential emotional state in the observer. We used event-related functional magnetic resonance imaging and multi-channel electroencephalography to determine in 14 right-handed healthy volunteers (29 ± 6 years) which brain structures mediate the perception of such a shared experiential emotional state. Statistical parametric mapping showed that an area in the dorsal medial frontal cortex was specifically activated during the perception of emotions that reflected the seen happy and sad emotional face expressions. This area mapped to the pre-supplementary motor area, which plays a central role in control of behavior. Low resolution brain electromagnetic tomography-based analysis of the encephalographic data revealed that the activation was detected 100 ms after face presentation onset, lasting until 740 ms. Our observation substantiates recently emerging evidence suggesting that the subjective perception of an experiential emotional state (empathy) is mediated by the involvement of the dorsal medial frontal cortex.
Neuroscience …, 2002
To examine the effect of gender on the volume and pattern of brain activation during the viewing of alternating sets of faces depicting happy or sad expressions, 24 volunteers, 12 men and 12 women, participated in this functional magnetic resonance imaging study. The experimental stimuli were 12 photographs of Japanese adults selected from Matsumoto and Ekman's Pictures of Facial Affect. Four of these pictures depicted happy facial emotions, four sad, and four neutral. Half of the photographs were of men and the other half were of women. Consistent with previous findings, distinct sets of neural correlates for processing happy and sad facial emotions were noted. Furthermore, it was observed that male and female subjects used a rather different set of neural correlates when processing faces showing either happy or sad expressions. This was more noticeable when they were processing faces portraying sad emotions than happy emotions. Our findings provide some preliminary support for the speculation that the two genders may be associated with different areas of brain activation during emotion recognition of happy or sad facial expressions. This suggests that the generalizability of findings in regard to neural correlates of facial emotion recognition should consider the gender of the subjects.
European Journal of Neuroscience, 2006
Some authors consider contempt to be a basic emotion, while others consider it a variant of disgust. The neural correlates of contempt have not so far been specifically contrasted with those of disgust. Using functional magnetic resonance imaging (fMRI), we investigated the neural networks involved in the processing of facial contempt and disgust in 24 healthy subjects. Facial recognition of contempt was lower than that of disgust and of neutral faces. The imaging data indicated significant activity in the amygdala and in the globus pallidus and putamen during processing of contemptuous faces. The bilateral insula and caudate nuclei, as well as the left and right inferior frontal gyri, were engaged during processing of disgusted faces. Moreover, direct comparisons of contempt vs. disgust yielded significantly different activations in the amygdala. On the other hand, disgusted faces elicited greater activation than contemptuous faces in the right insula and caudate. Our findings suggest preferential involvement of different neural substrates in the processing of facial emotional expressions of contempt and disgust.
Neuropsychologia, 2007
Brain imaging studies in humans have shown that face processing in several areas is modulated by the affective significance of faces, particularly with fearful expressions, but also with other social signals such as gaze direction. Here we review haemodynamic and electrical neuroimaging results indicating that activity in the face-selective fusiform cortex may be enhanced by emotional (fearful) expressions, without explicit voluntary control, and presumably through direct feedback connections from the amygdala. fMRI studies show that these increased responses in fusiform cortex to fearful faces are abolished by amygdala damage in the ipsilateral hemisphere, despite preserved effects of voluntary attention on the fusiform; whereas emotional increases can still arise despite deficits in attention or awareness following parietal damage, and appear relatively unaffected by pharmacological increases in cholinergic stimulation. Fear-related modulations of face processing driven by amygdala signals may implicate not only the fusiform cortex, but also earlier visual areas in occipital cortex (e.g., V1) and other distant regions involved in social, cognitive, or somatic responses (e.g., superior temporal sulcus, cingulate, or parietal areas). In the temporal domain, evoked potentials show a widespread time-course of emotional face perception, with some increases in the amplitude of responses recorded over both occipital and frontal regions for fearful relative to neutral faces (as well as in the amygdala and orbitofrontal cortex, when using intracranial recordings), but with different latencies post-stimulus onset. Early emotional responses may arise around 120 ms, prior to a full visual categorization stage indexed by the face-selective N170 component, possibly reflecting rapid emotion processing based on crude visual cues in faces.
Other electrical components arise at later latencies and involve more sustained activities, probably generated in associative or supramodal brain areas, and resulting in part from the modulatory signals received from the amygdala. Altogether, these fMRI and ERP results demonstrate that emotional face perception is a complex process that cannot be related to a single neural event taking place in a single brain region, but rather implicates an interactive network with distributed activity in time and space. Moreover, although traditional models in cognitive neuropsychology have often considered that facial expression and facial identity are processed along two separate pathways, evidence from fMRI and ERPs suggests instead that emotional processing can strongly affect the brain systems responsible for face recognition and memory. The functional implications of these interactions remain to be fully explored, but might play an important role in the normal development of face processing skills and in some neuropsychiatric disorders.
Journal of Neurophysiology
We measured regional cerebral blood flow (rCBF) using positron emission tomography (PET) to determine which brain regions are involved in the assessment of facial emotion. We asked right-handed normal subjects to assess the signalers’ emotional state based on facial gestures and to assess the facial attractiveness, as well as to discriminate the background color of the facial stimuli, and compared the activity produced by each condition. The right inferior frontal cortex showed significant activation during the assessment of facial emotion in comparison with the other two tests. The activated area was located within a triangular area of the inferior frontal cortex in the right cerebral hemisphere. These results, together with those of previous imaging and clinical studies, suggest that the right inferior frontal cortex processes emotional communicative signals that could be visual or auditory and that there is a hemispheric asymmetry in the inferior frontal cortex in relation to the...
Proceedings of the National Academy of Sciences, 2008
The ability to perceive and differentiate facial expressions is vital for social communication. Numerous functional MRI (fMRI) studies in humans have shown enhanced responses to faces with different emotional valence, in both the amygdala and the visual cortex. However, relatively few studies have examined how valence influences neural responses in monkeys, thereby limiting the ability to draw comparisons across species and thus understand the underlying neural mechanisms. Here we tested the effects of macaque facial expressions on neural activation within these two regions using fMRI in three awake, behaving monkeys. Monkeys maintained central fixation while blocks of different monkey facial expressions were presented. Four different facial expressions were tested: (i) neutral, (ii) aggressive (open-mouthed threat), (iii) fearful (fear grin), and (iv) submissive (lip smack). Our results confirmed that both the amygdala and the inferior temporal cortex in monkeys are modulated by facial expressions. As in human fMRI, fearful expressions evoked the greatest response in monkeys, even though fearful expressions are physically dissimilar in humans and macaques. Furthermore, we found that valence effects were not uniformly distributed over the inferior temporal cortex. Surprisingly, these valence maps were independent of two related functional maps: (i) the map of "face-selective" regions (faces versus non-face objects) and (ii) the map of "face-responsive" regions (faces versus scrambled images). Thus, the neural mechanisms underlying face perception and valence perception appear to be distinct.
NeuroImage, 2000
There is debate in cognitive neuroscience whether conscious versus unconscious processing represents a categorical or a quantitative distinction. The purpose of the study was to explore this matter using functional magnetic resonance imaging (fMRI). We first established objective thresholds of the critical temporal parameters for overt and covert presentations of fear and disgust. Next we applied these stimulus parameters in an fMRI experiment to determine whether nonconsciously perceived (covert) facial expressions of fear and disgust show the same double dissociation (amygdala response to fear, insula to disgust) observed with consciously perceived (overt) stimuli. A backward masking paradigm was used. In the psychophysics experiment, the following parameters were established: 30-ms target duration for the covert condition, and 170-ms target duration for the overt condition. Results of the block-design fMRI study indicated substantial differences underlying the perception of fearful and disgusted facial expressions, with significant effects of both emotion and target duration. Findings for the overt condition (170 ms) confirm previous evidence of amygdala activation to fearful faces, and insula activation to disgusted faces, and a double dissociation between these two emotions. In the covert condition (30 ms), the amygdala was not activated to fear, nor was the insula activated to disgust. Overall, findings demonstrate significant differences between the neural responses to fear and to disgust, and between the covert presentations of these two emotions. These results therefore suggest distinct neural correlates of conscious and unconscious emotion perception. © 2004 Elsevier Inc. All rights reserved.
Neuroimage, 2004
Previous functional neuroimaging studies have demonstrated that the amygdala activates in response to fearful faces presented below the threshold of conscious visual perception. Using a backward masking procedure similar to that of previous studies, we used functional magnetic resonance imaging (fMRI) to study the amygdala and anterior cingulate gyrus during preattentive presentations of sad and happy facial affect. Twelve healthy adult females underwent blood oxygen level dependent (BOLD) fMRI while viewing sad and happy faces, each presented for 20 ms and "masked" immediately by a neutral face for 100 ms. Masked happy faces were associated with significant bilateral activation within the anterior cingulate gyrus and amygdala, whereas masked sadness yielded only limited activation within the left anterior cingulate gyrus. In a direct comparison, masked happy faces yielded significantly greater activation in the anterior cingulate and amygdala relative to identically masked sad faces. Conjunction analysis showed that masked affect perception, regardless of emotional valence, was associated with greater activation within the left amygdala and left anterior cingulate. Findings suggest that the amygdala and anterior cingulate are important components of a network involved in detecting and discriminating affective information presented below the normal threshold of conscious visual perception.
Cognitive Brain Research, 2001
A parallel neural network has been proposed for processing the various types of information conveyed by faces, including emotion. Using functional magnetic resonance imaging (fMRI), we tested the effect of explicit attention to the emotional expression of faces on the neuronal activity of face-responsive regions. A delayed match-to-sample procedure was adopted. Subjects were required to match visually presented pictures with regard to the contour of the face pictures, facial identity, and emotional expressions by valence (happy and fearful expressions) and arousal (fearful and sad expressions). Contour matching of non-face scrambled pictures was used as a control condition. The face-responsive regions that responded more to faces than to non-face stimuli were the bilateral lateral fusiform gyrus (LFG), the right superior temporal sulcus (STS), and the bilateral intraparietal sulcus (IPS). In these regions, general attention to the face enhanced the activities of the bilateral LFG, the right STS, and the left IPS compared with attention to the contour of the facial image. Selective attention to facial emotion specifically enhanced the activity of the right STS compared with attention to the face per se. The results suggest that the right STS region plays a special role in facial emotion recognition within distributed face-processing systems. This finding may support the notion that the STS is involved in social perception.
Neuroreport, 2002
Human facial emotional expressions are complex. This may confound studies examining brain responses to these stimuli in control and clinical populations. However, several lines of evidence suggest that a few elementary facial features convey the gist of emotional expressions. Using fMRI, we assessed brain responses to line drawings of emotionally valenced (i.e., angry and happy) and neutral faces in healthy human subjects. Significantly increased fMRI signal was found in the amygdala, hippocampus, and prefrontal cortex in response to emotional vs. neutral schematic faces. Although direct comparisons of schematic and human faces will be needed, these initial results suggest that schematic faces may be useful for studying brain responses to emotional stimuli because of their simplicity relative to human faces. NeuroReport 13:785-790 © 2002 Lippincott Williams & Wilkins.