2010, Social Cognitive and Affective Neuroscience
Facial expressions can trigger emotions: when we smile we feel happy, when we frown we feel sad. However, the mimicry literature also shows that we feel happy when our interaction partner behaves the way we do. Thus, what happens if we express sadness and perceive somebody imitating us? In the current study, participants were presented with either happy or sad faces, while expressing one of these emotions themselves. Functional magnetic resonance imaging was used to measure neural responses on trials where the observed emotion was either congruent or incongruent with the expressed emotion. Our results indicate that being in a congruent emotional state, irrespective of the emotion, activates the medial orbitofrontal cortex and ventromedial prefrontal cortex, brain areas that have been associated with positive feelings and reward processing. Incongruent emotional states, in contrast, activated the dorsolateral prefrontal cortex as well as the posterior superior temporal gyrus/sulcus, both of which play a role in conflict processing.
Neuroscience, 2008
Reading the facial expressions of other people is a fundamental skill for social interaction. Human facial expressions of emotions are readily recognized but may also evoke the same experiential emotional state in the observer. We used event-related functional magnetic resonance imaging and multi-channel electroencephalography to determine in 14 right-handed healthy volunteers (29 ± 6 years) which brain structures mediate the perception of such a shared experiential emotional state. Statistical parametric mapping showed that an area in the dorsal medial frontal cortex was specifically activated during the perception of emotions that reflected the seen happy and sad emotional facial expressions. This area mapped to the pre-supplementary motor area, which plays a central role in the control of behavior. Low-resolution brain electromagnetic tomography-based analysis of the electroencephalographic data revealed that the activation was detected 100 ms after face presentation onset and lasted until 740 ms. Our observation substantiates recently emerging evidence suggesting that the subjective perception of a shared experiential emotional state (empathy) is mediated by the involvement of the dorsal medial frontal cortex.
Psychiatry Research: Neuroimaging, 2003
In human communication there is often a close relationship between the perception of an emotionally expressive face and the facial response of the viewer himself. Whereas perception and generation of facial expressions have been studied separately with functional imaging methods, no studies exist on their interaction. We combined the presentation of emotionally expressive faces with the instruction to react with predetermined, assigned facial movements. fMRI was used in an event-related design to examine healthy subjects while they regarded happy, sad, or neutral faces and were instructed to simultaneously move the corners of their mouths either (a) upwards or (b) downwards, or (c) to refrain from movement. The subjects' facial movements were recorded with an MR-compatible video camera. Movement latencies were shortened in congruent situations (e.g. the presentation of a happy face combined with upward movements) and delayed in non-congruent situations. Dissonant stimuli activated the inferior prefrontal cortex and the somatomotor cortex bilaterally more than congruent stimuli did. The congruent condition, in particular when seeing a happy face, activated the medial basotemporal lobes (hippocampus, amygdala, parahippocampal region). We hypothesize that this region facilitates congruent facial movements when an emotionally expressive face is perceived and that it is part of a system for non-volitional emotional facial movements.
2012
Numerous studies have shown that humans automatically react with congruent facial reactions, i.e., facial mimicry, when seeing a vis-à-vis' facial expressions. The current experiment is the first to investigate the neuronal structures responsible for differences in the occurrence of such facial mimicry reactions by simultaneously measuring BOLD and facial EMG in an MRI scanner. Twenty female students viewed emotional facial expressions (happy, sad, and angry) of male and female avatar characters. During picture presentation, the BOLD signal as well as M. zygomaticus major and M. corrugator supercilii activity were recorded simultaneously. Results show prototypical patterns of facial mimicry after correction for MR-related artifacts: enhanced M. zygomaticus major activity in response to happy expressions and enhanced M. corrugator supercilii activity in response to sad and angry expressions. Regression analyses show that these congruent facial reactions correlate significantly with activations in the IFG, SMA, and cerebellum. Stronger zygomaticus reactions to happy faces were further associated with increased activity in the caudate, MTG, and PCC. Corrugator reactions to angry expressions were further correlated with activity in the hippocampus, insula, and STS. Results are discussed in relation to core and extended models of the mirror neuron system (MNS).
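As a concrete illustration of the kind of regression linking trial-wise facial mimicry to brain activity described in this abstract, here is a minimal sketch. It assumes trial-wise EMG change scores and per-trial BOLD response estimates as inputs; all names are hypothetical, and the published analysis was a voxelwise GLM computed after MR-artifact correction, not this simplified correlation.

```python
import numpy as np

def emg_bold_correlation(emg, bold):
    """Relate trial-wise facial-EMG mimicry strength to BOLD responses.

    emg  : (n_trials,) EMG change scores, e.g. mean rectified
           M. zygomaticus major activity during picture presentation
           minus a pre-stimulus baseline
    bold : (n_trials,) per-trial BOLD response estimates for one region

    Returns the Pearson correlation between mimicry strength and BOLD.
    """
    emg_z = (emg - emg.mean()) / emg.std()
    bold_z = (bold - bold.mean()) / bold.std()
    # mean of products of z-scores equals the Pearson correlation
    return float((emg_z * bold_z).mean())

# toy usage with synthetic trials
rng = np.random.default_rng(0)
emg = rng.normal(size=60)
bold = 0.5 * emg + rng.normal(scale=0.8, size=60)
print(emg_bold_correlation(emg, bold))
```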
2012
Emotional valence and arousal have been shown to covary with blood oxygen level dependent (BOLD) signal levels in several brain structures. Here we studied brain activity in 17 healthy subjects during perception of facial expressions varying in valence and arousal using functional magnetic resonance imaging (fMRI). Our results revealed correlations with the perceived valence in dorsolateral and ventrolateral prefrontal cortex, dorsomedial prefrontal cortex, and anterior insula. These findings corroborate results of our previous study, where we used pictures of varying valence taken from the International Affective Picture System (IAPS). Together, the results of these two studies suggest the existence of common brain areas processing the valence of both emotional pictures and facial expressions. Additionally, the BOLD signal exhibited a distinctive dependency on perceived valence in the intraparietal sulcus and supramarginal gyrus in the present study. BOLD activity correlated with negative and positive valence in separate cortical areas, and some areas demonstrated either a U-shaped or an inverted U-shaped relationship with valence (i.e., either minimal or maximal activation was observed for neutral expressions). This nonlinear dependency suggests that the brain mechanisms underlying perception of negative and positive valence are at least to some extent independent. Perceived arousal correlated positively with the strength of the BOLD signal only in the left inferior frontal gyrus, which is an important node of the mirror neuron system.
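The U-shaped versus inverted U-shaped dependencies reported here amount to testing a quadratic valence term alongside the linear one in the voxelwise regression. Below is a minimal sketch of that test, assuming per-trial valence ratings and per-trial BOLD estimates as inputs; the names and rating scale are illustrative, not the authors' pipeline.

```python
import numpy as np

def valence_fit(valence, bold):
    """Fit BOLD = b0 + b1*valence + b2*valence**2 by least squares.

    valence : (n_trials,) perceived-valence ratings (e.g. -4..+4)
    bold    : (n_trials,) per-trial BOLD response estimates for one voxel

    A significant positive b2 indicates a U-shaped relation (minimum near
    neutral); a negative b2 an inverted U (maximum near neutral).
    """
    X = np.column_stack([np.ones_like(valence), valence, valence ** 2])
    beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
    return beta  # (b0, b1, b2)

# toy usage: an inverted-U response peaking at neutral expressions
rng = np.random.default_rng(0)
val = rng.uniform(-4, 4, 200)
sig = 1.0 - 0.1 * val ** 2 + rng.normal(0, 0.2, 200)
print(valence_fit(val, sig))  # b2 should come out near -0.1
```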
Neuroscience Letters, 2012
Proceedings of the National Academy of Sciences, 2012
Sharing others’ emotional states may facilitate understanding their intentions and actions. Here we show that networks of brain areas “tick together” in participants who are viewing similar emotional events in a movie. Participants’ brain activity was measured with functional MRI while they watched movies depicting unpleasant, neutral, and pleasant emotions. After scanning, participants watched the movies again and continuously rated their experience of pleasantness–unpleasantness (i.e., valence) and of arousal–calmness. Pearson’s correlation coefficient was used to derive multisubject voxelwise similarity measures [intersubject correlations (ISCs)] of functional MRI data. Valence and arousal time series were used to predict the moment-to-moment ISCs computed using a 17-s moving average. During movie viewing, participants' brain activity was synchronized in lower- and higher-order sensory areas and in corticolimbic emotion circuits. Negative valence was associated with increased...
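The moving-window ISC measure described in this abstract is straightforward to sketch. The following is a minimal illustration, assuming a BOLD array of shape (subjects, voxels, timepoints); the 17-s window must be converted to TRs, and since the TR is not given here, the window length below is an assumed parameter, not the authors' value.

```python
import numpy as np

def moving_window_isc(data, win):
    """Moment-to-moment intersubject correlation (ISC).

    data : (n_subjects, n_voxels, n_timepoints) BOLD time series
    win  : window length in TRs (stands in for the 17-s moving average;
           the TR needed for the conversion is assumed)

    Returns (n_voxels, n_windows): mean pairwise Pearson r across
    subjects for each voxel at each window position.
    """
    n_subj, n_vox, n_t = data.shape
    n_win = n_t - win + 1
    isc = np.zeros((n_vox, n_win))
    for w in range(n_win):
        seg = data[:, :, w:w + win]
        seg = seg - seg.mean(axis=2, keepdims=True)            # demean per window
        seg = seg / (seg.std(axis=2, keepdims=True) + 1e-12)   # z-score per window
        r_sum = np.zeros(n_vox)
        pairs = 0
        for i in range(n_subj):
            for j in range(i + 1, n_subj):
                # mean of products of z-scores = Pearson correlation
                r_sum += (seg[i] * seg[j]).mean(axis=1)
                pairs += 1
        isc[:, w] = r_sum / pairs
    return isc

# toy usage: 5 subjects, 3 voxels, 200 TRs, 10-TR window
data = np.random.default_rng(1).normal(size=(5, 3, 200))
print(moving_window_isc(data, win=10).shape)  # (3, 191)
```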
Brain, 1999
Using functional neuroimaging, we tested two hypotheses. First, we tested whether the amygdala has a neural response to sad and/or angry facial expressions. Secondly, we tested whether the orbitofrontal cortex has a specific neural response to angry facial expressions. Volunteer subjects were scanned, using PET, while they performed a sex discrimination task involving static grey-scale images of faces.
Philosophical Transactions of The Royal Society B: Biological Sciences, 2009
Why do we feel tears well up when we see a loved one cry? Why do we wince when we see other people hurt themselves? This review addresses these questions from the perspective of embodied simulation: observing the actions and tactile sensations of others activates premotor, posterior parietal and somatosensory regions in the brain of the observer which are also active when performing similar movements and feeling similar sensations. We will show that seeing the emotions of others also recruits regions involved in experiencing similar emotions, although there does not seem to be a reliable mapping of particular emotions onto particular brain regions. Instead, emotion simulation seems to involve a mosaic of affective, motor and somatosensory components. The relative contributions of these components to a particular emotion and their interrelationship are largely unknown, although recent experimental evidence suggests that motor simulation may be a trigger for the simulation of associated feeling states. This mosaic of simulations may be necessary for generating the compelling insights we have into the feelings of others. Through their integration with, and modulation by, higher cognitive functions, they could be at the core of important social functions, including empathy, mind reading and social learning.
Social Cognitive and Affective Neuroscience, 2011
Static pictures of emotional facial expressions have been found to activate brain structures involved in the processing of emotional stimuli. However, in everyday life, emotional expressions change rapidly, and the processing of the onset vs the offset of the very same emotional expression might rely on different brain networks, presumably leading to different behavioral and physiological reactions (e.g. approach or avoidance). Using functional magnetic resonance imaging, this was examined by presenting video clips depicting onsets and offsets of happy and angry facial expressions. Subjective valence and threat ratings clearly depended on the direction of change. Blood oxygen level dependent responses indicate both reward- and threat-related activations for the offset of angry expressions. Comparing onsets and offsets, angry offsets were associated with stronger ventral striatum activation than angry onsets. Additionally, the offset of happy and the onset of angry expressions showed strong common activity in the lateral orbitofrontal cortex bilaterally, the left amygdala, and the left insula, whereas the onset of happy and the offset of angry expressions induced significant activation in the left dorsal striatum. In sum, the results confirm different activity in motivation-related brain areas in response to the onset and offset of the same emotional expression and highlight the importance of the temporal characteristics of facial expressions for social communication.
The emotional matching paradigm, introduced by Hariri and colleagues in 2000, is a widely used neuroimaging experiment that reliably activates the amygdala. In the classic version of the experiment, faces with negative emotional expressions and scenes depicting distressing events are compared with geometric shapes instead of neutral stimuli of the same category (i.e. faces or scenes). This makes it difficult to clearly attribute amygdala activation to the emotional valence and not to the social content. To improve this paradigm, we conducted a functional magnetic resonance imaging study in which emotionally neutral and, additionally, positive stimuli within each stimulus category (i.e. faces, social and non-social scenes) were included. These categories enabled us to differentiate the exact nature of observed effects in the amygdala. First, the main findings of the original paradigm were replicated. Second, we observed amygdala activation when comparing negative to neutral stimuli of the same category. However, for negative faces, the amygdala response habituated rapidly. Third, positive stimuli were associated with widespread activation including the insula and the caudate. This validated adaptation study enables more precise statements on the neural activation underlying emotional processing. These advances may benefit future studies on identifying selective impairments in emotional and social stimulus processing.

Amygdala functioning is of high interest for clinical psychology, psychiatry, and neuroscience, as heightened amygdala activation has been reported in various patient groups [1-5]. The emotional matching paradigm by Hariri et al. [6] and its extended version [7] are widely used as emotional reactivity measures, which reliably activate the amygdala [8-10]. Despite its current use in psychiatry, this paradigm has a potential drawback, since faces with negative emotional expressions and negative social scenes are compared with simple geometric shapes. Thus, it compares pictures that differ in more than one domain: social content and emotional valence. It is therefore difficult to draw conclusions about which of the two domains causes the increase in amygdala activation. This differentiation may arguably not be relevant for all purposes, but to study specific populations, such as patients with deficits in one or the other domain (e.g. those with autism spectrum disorder (ASD)) [11,12], it is crucial to distinguish the two. A second issue is that negative emotions have been studied more widely than positive emotions, as exemplified by the original emotional matching paradigm, putatively due to their high functional value for action. For example, previous research suggests that threatening scenes, whether or not they contained human features, elicited activation in the extrastriate body area, suggesting that this activity represents the capacity of the brain to associate certain situations with threat in order to prepare for fast reactions [13]. Positive emotions are, however, the other side of the coin, as they allow psychological growth and well-being [14]. Positive stimuli are most commonly used in the context of reward experiments, for example in performance-based feedback tasks [15,16]. A brain region that has been related to the processing of various reward types, ranging from primary reinforcers to more abstract social rewards [17,18], is the ventral striatum [19]. Also, meta-analytically, the ventral striatum elicits the strongest activation across different reward types such as monetary, food, and erotic rewards [20]. The amygdala has also been found to be activated in response to positive stimuli. In direct comparisons of positive and negative faces, not all studies found amygdala activation differences [21], but a meta-analysis of …
Social cognitive and affective neuroscience, 2006
Intentionally adopting a discrete emotional facial expression can modulate the subjective feelings corresponding to that emotion; however, the underlying neural mechanism is poorly understood. We therefore used functional brain imaging (functional magnetic resonance imaging) to examine brain activity during intentional mimicry of emotional and non-emotional facial expressions and relate regional responses to the magnitude of expression-induced facial movement. Eighteen healthy subjects were scanned while imitating video clips depicting three emotional (sad, angry, happy) and two 'ingestive' (chewing and licking) facial expressions. Simultaneously, facial movement was monitored from the displacement of fiducial markers (highly reflective dots) on each subject's face. Imitating emotional expressions enhanced activity within right inferior prefrontal cortex. This pattern was absent during passive viewing conditions. Moreover, the magnitude of facial movement during emotion-imi...
Frontiers in Psychology
Real-life faces are dynamic by nature, particularly when expressing emotion. Increasing evidence suggests that the perception of dynamic displays enhances facial mimicry and induces activation in widespread brain structures considered to be part of the mirror neuron system, a neuronal network linked to empathy. The present study is the first to investigate the relations among facial muscle responses, brain activity, and empathy traits while participants observed static and dynamic (video) facial expressions of fear and disgust. During display presentation, the blood-oxygen level-dependent (BOLD) signal as well as muscle reactions of the corrugator supercilii and levator labii were recorded simultaneously from 46 healthy individuals (21 females). Both fear and disgust faces caused activity in the corrugator supercilii muscle, while perception of disgust additionally produced facial activity in the levator labii muscle, supporting a specific pattern of facial mimicry for these emotions. Moreover, individuals with higher, compared to individuals with lower, empathy traits showed greater activity in the corrugator supercilii and levator labii muscles; however, these responses did not differ between static and dynamic modes. Conversely, neuroimaging data revealed activation of motion- and emotion-related brain structures in response to dynamic rather than static stimuli among high-empathy individuals. In line with this, there was a correlation between electromyography (EMG) responses and brain activity, suggesting that the mirror neuron system, the anterior insula, and the amygdala might constitute the neural correlates of automatic facial mimicry for fear and disgust. These results reveal that the dynamic property of (emotional) stimuli facilitates the emotion-related processing of facial expressions, especially among those with high trait empathy.
Social cognitive and affective neuroscience, 2015
Talking about emotion and sharing emotional experiences is a key component of human interaction. Specifically, individuals often consider the reactions of other people when evaluating the meaning and impact of an emotional stimulus. It has not yet been investigated, however, how emotional arousal ratings and physiological responses elicited by affective stimuli are influenced by the rating of an interaction partner. In the present study, pairs of participants were asked to rate and communicate the degree of their emotional arousal while viewing affective pictures. Strikingly, participants adjusted their arousal ratings to match up with their interaction partner. In anticipation of the affective picture, the interaction partner's arousal ratings correlated positively with activity in anterior insula and prefrontal cortex. During picture presentation, social influence was reflected in the ventral striatum, that is, activity in the ventral striatum correlated negatively with the in...
Cerebral Cortex, 2007
Emotional facial expressions can engender similar expressions in others. However, adaptive social and motivational behavior can require individuals to suppress, conceal, or override prepotent imitative responses. We predicted, in line with a theory of "emotion contagion," that expressing a different emotion while viewing a facial expression would manifest as behavioral conflict and interference. We employed facial electromyography (EMG) and functional magnetic resonance imaging (fMRI) to investigate brain activity related to this emotion expression interference (EEI) effect, where the expressed response was either concordant or discordant with the observed emotion. The Simon task was included as a nonemotional comparison for the fMRI study. Facilitation and interference effects were observed in the latency of facial EMG responses. Neuroimaging revealed activation of distributed brain regions including anterior right inferior frontal gyrus (Brodmann area (BA) 47), supplementary motor area (facial area), posterior superior temporal sulcus (STS), and right anterior insula during emotion expression-associated interference. In contrast, nonemotional response conflict (Simon task) engaged a distinct frontostriatal network. Individual differences in empathy and emotion regulatory tendency predicted the magnitude of EEI-evoked regional activity within BA 47 and STS. Our findings point to these regions as providing a putative neural substrate underpinning a crucial adaptive aspect of social/emotional behavior.
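The facilitation and interference effects in EMG latency reported in this abstract reduce to a simple congruency contrast. A minimal sketch, assuming per-trial EMG onset latencies and a congruency flag as inputs (names are illustrative, not the authors' code):

```python
import numpy as np

def eei_latency_effect(latencies, congruent):
    """Facilitation/interference score from facial-EMG onset latencies.

    latencies : (n_trials,) EMG response-onset latencies in ms
    congruent : (n_trials,) bool, True where the expression to be
                produced matches the observed emotion

    Returns mean incongruent latency minus mean congruent latency; a
    positive value indicates interference from a discordant observed
    expression (equivalently, facilitation by a concordant one).
    """
    latencies = np.asarray(latencies, dtype=float)
    congruent = np.asarray(congruent, dtype=bool)
    return latencies[~congruent].mean() - latencies[congruent].mean()
```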
People commonly communicate emotional states through facial expressions. However, existing neuroimaging research focuses almost entirely on brain systems involved in perceiving expressions, leaving unclear whether similar systems are recruited when people generate expressions. Pairs of friends took turns viewing positive and neutral images while undergoing simultaneous fMRI scanning and EMG recording of zygomaticus major, a facial muscle associated with smiling. Participants were instructed that they were either visible to their friend or not visible during image-viewing. When participants viewed positive images, their EMG responses parametrically tracked activity in brain structures associated with experiencing emotion, including ventral striatum, caudate, insula, and anterior cingulate cortex. When further instructed that they were visible to their friend, participants’ EMG responses also tracked activity in structures associated with mentalizing, including temporoparietal junctio...
Frontiers in Psychology - Emotion Science, 2013
Emotion regulation is crucial for successfully engaging in social interactions. Yet, little is known about the neural mechanisms controlling behavioral responses to emotional expressions perceived in the face of other people, which constitute a key element of interpersonal communication. Here, we investigated brain systems involved in social emotion perception and regulation, using functional magnetic resonance imaging (fMRI) in 20 healthy participants who saw dynamic facial expressions of either happiness or sadness, and were asked to either imitate the expression or to suppress any expression on their own face (in addition to a gender judgment control task). fMRI results revealed higher activity in regions associated with emotion (e.g., the insula), motor function (e.g., motor cortex), and theory of mind during imitation. Activity in dorsal cingulate cortex was also increased during imitation, possibly reflecting greater action monitoring or conflict with own feeling states. In addition, premotor regions were more strongly activated during both imitation and suppression, suggesting a recruitment of motor control for both the production and inhibition of emotion expressions. Expressive suppression produced increases in dorsolateral and lateral prefrontal cortex typically related to cognitive control. These results suggest that voluntary imitation and expressive suppression modulate brain responses to emotional signals perceived from faces, by up- and down-regulating activity in distributed subcortical and cortical networks that are particularly involved in emotion, action monitoring, and cognitive control.
NeuroImage, 2014
This study investigated the emotional effects and neural correlates of being empathized with while speaking about a currently experienced real-life social conflict during fMRI. Specifically, we focused on the effects of cognitive empathy in the form of paraphrasing, a technique regularly used in conflict resolution. 22 participants underwent fMRI while being interviewed on their social conflict and receiving empathic or unempathic responses from the interviewer. Skin conductance response (SCR) and self-report ratings of feeling understood and emotional valence were used to assess emotional responses. Results confirm previous findings indicating that cognitive empathy exerts a positive short-term effect on emotions in social conflict, while at the same time increasing autonomic arousal reflected by SCR. Effects of paraphrasing and unempathic interventions as indicated by self-report ratings varied depending on self-esteem, pre-interview negative affect, and participants' empathy quotient. Empathic responses engaged a fronto-parietal network with activity in the right precentral gyrus (PrG), left middle frontal gyrus (MFG), left inferior parietal gyrus (IPG), and right postcentral gyrus (PoG). Processing unempathic responses involved a fronto-temporal network with clusters peaking in the left inferior frontal gyrus, pars triangularis (IFGTr), and right temporal pole (TP). A specific modeling of feeling misunderstood activated a network consisting of the IFG, left TP, left Heschl gyrus, IFGTr, and right precuneus, extending to several limbic regions, such as the insula, amygdala, putamen, and anterior cingulate cortex/right middle cingulum (ACC/MCC). The results support the effectiveness of a widely used conflict resolution technique, which may also be useful for professionals who regularly deal with and have to de-escalate situations highly charged with negative emotion, e.g. physicians or judges.
Human Brain …, 2000
The processing of changing nonverbal social signals such as facial expressions is poorly understood, and it is unknown if different pathways are activated during effortful (explicit), compared to implicit, processing of facial expressions. Thus we used fMRI to determine which brain areas subserve processing of high-valence expressions and if distinct brain areas are activated when facial expressions are processed explicitly or implicitly. Nine healthy volunteers were scanned (1.5T GE Signa with ANMR, TE/TR 40/3,000 ms) during two similar experiments in which blocks of mixed happy and angry facial expressions ("on" condition) were alternated with blocks of neutral faces (control "off" condition). Experiment 1 examined explicit processing of expressions by requiring subjects to attend to, and judge, facial expression. Experiment 2 examined implicit processing of expressions by requiring subjects to attend to, and judge, facial gender, which was counterbalanced in both experimental conditions. Processing of facial expressions significantly increased regional blood oxygenation level-dependent (BOLD) activity in fusiform and middle temporal gyri, hippocampus, amygdalohippocampal junction, and pulvinar nucleus. Explicit processing evoked significantly more activity in temporal lobe cortex than implicit processing, whereas implicit processing evoked significantly greater activity in amygdala region. Mixed high-valence facial expressions are processed within temporal lobe visual cortex, thalamus, and amygdalohippocampal complex. Also, neural substrates for explicit and implicit processing of facial expressions are dissociable: explicit processing activates temporal lobe cortex, whereas implicit processing activates amygdala region. Our findings confirm a neuroanatomical dissociation between conscious and unconscious processing of emotional information. Hum. Brain Mapping 9: 93-105, 2000.
Emotion (Washington, D.C.), 2012
We aimed at verifying the hypothesis that facial mimicry is causally and selectively involved in emotion recognition. For this purpose, in Experiment 1, we explored the effect of tonic contraction of muscles in the upper or lower half of participants' faces on their ability to recognize emotional facial expressions. We found that the "lower" manipulation specifically impaired recognition of happiness and disgust, the "upper" manipulation impaired recognition of anger, while both manipulations affected recognition of fear; recognition of surprise and sadness was not affected by either blocking manipulation. In Experiment 2, we verified whether emotion recognition is hampered by stimuli in which an upper or lower half-face showing an emotional expression is combined with a neutral half-face. We found that the neutral lower half-face interfered with recognition of happiness and disgust, whereas the neutral upper half impaired recognition of anger; recognition of fear and sadness was impaired by both manipulations, whereas recognition of surprise was not affected by either manipulation. Taken together, the present findings support simulation models of emotion recognition and provide insight into the role of mimicry in the comprehension of others' emotional facial expressions.
NeuroImage, 2008
Empathy allows us to internally simulate others' affective and cognitive mental states, and it has been proposed that the mirroring or motor representation systems play a key role in such simulation. As emotions are related to important adaptive events linked with benefit or danger, simulating others' emotional states might constitute a special case of empathy. In this functional magnetic resonance imaging (fMRI) study we tested whether emotional versus cognitive empathy would facilitate the recruitment of brain networks involved in motor representation and imitation in healthy volunteers. Participants were presented with photographs depicting people in neutral everyday situations (cognitive empathy blocks) or suffering serious threat or harm (emotional empathy blocks). Participants were instructed to empathize with specified persons depicted in the scenes. Emotional versus cognitive empathy resulted in increased activity in limbic areas involved in emotion processing (thalamus), and also in cortical areas involved in face (fusiform gyrus) and body perception, as well as in networks associated with mirroring of others' actions (inferior parietal lobule). When brain activation resulting from viewing the scenes was controlled for, emotional empathy still engaged the mirror neuron system (premotor cortex) more than cognitive empathy. Further, the thalamus and primary somatosensory and motor cortices showed increased functional coupling during emotional versus cognitive empathy. The results suggest that emotional empathy is special. Emotional empathy facilitates somatic, sensory, and motor representation of other people's mental states, and results in more vigorous mirroring of the observed mental and bodily states than cognitive empathy.