Social Cognitive and Affective Neuroscience, 2015
Talking about emotion and sharing emotional experiences is a key component of human interaction. Specifically, individuals often consider the reactions of other people when evaluating the meaning and impact of an emotional stimulus. It has not yet been investigated, however, how emotional arousal ratings and physiological responses elicited by affective stimuli are influenced by the rating of an interaction partner. In the present study, pairs of participants were asked to rate and communicate the degree of their emotional arousal while viewing affective pictures. Strikingly, participants adjusted their arousal ratings to match up with their interaction partner. In anticipation of the affective picture, the interaction partner's arousal ratings correlated positively with activity in anterior insula and prefrontal cortex. During picture presentation, social influence was reflected in the ventral striatum, that is, activity in the ventral striatum correlated negatively with the in...
Proceedings of the National Academy of Sciences, 2012
Sharing others’ emotional states may facilitate understanding their intentions and actions. Here we show that networks of brain areas “tick together” in participants who are viewing similar emotional events in a movie. Participants’ brain activity was measured with functional MRI while they watched movies depicting unpleasant, neutral, and pleasant emotions. After scanning, participants watched the movies again and continuously rated their experience of pleasantness–unpleasantness (i.e., valence) and of arousal–calmness. Pearson’s correlation coefficient was used to derive multisubject voxelwise similarity measures [intersubject correlations (ISCs)] of functional MRI data. Valence and arousal time series were used to predict the moment-to-moment ISCs computed using a 17-s moving average. During movie viewing, participants' brain activity was synchronized in lower- and higher-order sensory areas and in corticolimbic emotion circuits. Negative valence was associated with increased...
Social Cognitive and Affective Neuroscience, 2017
Emotional experiences are frequently shaped by the emotional responses of co-present others. Research has shown that people constantly monitor and adapt to the incoming social-emotional signals, even without face-to-face interaction. And yet, the neural processes underlying such emotional transmissions have not been directly studied. Here, we investigated how the human brain processes emotional cues which arrive from another, co-attending individual. We presented continuous emotional feedback to participants who viewed a movie in the scanner. Participants in the social group (but not in the control group) believed that the feedback was coming from another person who was co-viewing the same movie. We found that social-emotional feedback significantly affected the neural dynamics both in the core affect and in the medial prefrontal regions. Specifically, the response time-courses in those regions exhibited increased similarity across recipients and increased neural alignment with the timeline of the feedback in the social compared with control group. Taken in conjunction with previous research, this study suggests that emotional cues from others shape the neural dynamics across the whole neural continuum of emotional processing in the brain. Moreover, it demonstrates that interpersonal neural alignment can serve as a neural mechanism through which affective information is conveyed between individuals.
Journal of Cognitive Neuroscience, 2004
Social Cognitive and Affective Neuroscience, 2010
Facial expressions can trigger emotions: when we smile we feel happy, when we frown we feel sad. However, the mimicry literature also shows that we feel happy when our interaction partner behaves the way we do. Thus what happens if we express our sadness and we perceive somebody who is imitating us? In the current study, participants were presented with either happy or sad faces, while expressing one of these emotions themselves. Functional magnetic resonance imaging was used to measure neural responses on trials where the observed emotion was either congruent or incongruent with the expressed emotion. Our results indicate that being in a congruent emotional state, irrespective of the emotion, activates the medial orbitofrontal cortex and ventromedial prefrontal cortex, brain areas that have been associated with positive feelings and reward processing. However, incongruent emotional states activated the dorsolateral prefrontal cortex as well as posterior superior temporal gyrus/sulcus, both playing a role in conflict processing.
Human Brain Mapping, 2019
Individuals often align their emotional states during conversation. Here, we reveal how such emotional alignment is reflected in synchronization of brain activity across speakers and listeners. Two "speaker" subjects told emotional and neutral autobiographical stories while their hemodynamic brain activity was measured with functional magnetic resonance imaging (fMRI). The stories were recorded and played back to 16 "listener" subjects during fMRI. After scanning, both speakers and listeners rated the moment-to-moment valence and arousal of the stories. Time-varying similarity of the blood-oxygenation-level-dependent (BOLD) time series was quantified by intersubject phase synchronization (ISPS) between speaker-listener pairs. Telling and listening to the stories elicited similar emotions across speaker-listener pairs. Arousal was associated with increased speaker-listener neural synchronization in brain regions supporting attentional, auditory, somatosensory, and motor processing. Valence was associated with increased speaker-listener neural synchronization in brain regions involved in emotional processing, including amygdala, hippocampus, and temporal pole. Speaker-listener synchronization of subjective feelings of arousal was associated with increased neural synchronization in somatosensory and subcortical brain regions; synchronization of valence was associated with neural synchronization in parietal cortices and midline structures. We propose that emotion-dependent speaker-listener neural synchronization is associated with emotional contagion, thereby implying that listeners reproduce some aspects of the speaker's emotional state at the neural level.
Revista Latinoamericana De Estudios Sobre Cuerpos Emociones Y Sociedad, 2014
Neurosociology is a new approach aimed at integrating social and biological sciences. In this paper, first we used Alan Fiske's theory (1992) of elementary forms of social relationships as a nexus between sociological studies of groups and group-based emotions and relevant neuroscientific findings. Then, we identified types of social situations that generate basic emotions (happiness, anger, sadness, and fear) within particular relationships. Individuals participate differently in these situations. Therefore, they are expected to differ in their emotions and cognitions, as well as in their underlying neural activity. Finally, we considered social affiliation and social hierarchy corresponding to communal sharing and authority ranking social relationships to demonstrate the logic of neurosociological research.
Cerebral Cortex, 2007
Emotional facial expressions can engender similar expressions in others. However, adaptive social and motivational behavior can require individuals to suppress, conceal, or override prepotent imitative responses. We predicted, in line with a theory of "emotion contagion," that when viewing a facial expression, expressing a different emotion would manifest as behavioral conflict and interference. We employed facial electromyography (EMG) and functional magnetic resonance imaging (fMRI) to investigate brain activity related to this emotion expression interference (EEI) effect, where the expressed response was either concordant or discordant with the observed emotion. The Simon task was included as a nonemotional comparison for the fMRI study. Facilitation and interference effects were observed in the latency of facial EMG responses. Neuroimaging revealed activation of distributed brain regions including anterior right inferior frontal gyrus (brain area [BA] 47), supplementary motor area (facial area), posterior superior temporal sulcus (STS), and right anterior insula during emotion expression-associated interference. In contrast, nonemotional response conflict (Simon task) engaged a distinct frontostriatal network. Individual differences in empathy and emotion regulatory tendency predicted the magnitude of EEI-evoked regional activity within BA 47 and STS. Our findings point to these regions as providing a putative neural substrate underpinning a crucial adaptive aspect of social/emotional behavior.
PLoS ONE, 2011
The timing and neural processing of the understanding of social interactions was investigated by presenting scenes in which 2 people performed cooperative or affective actions. While the role of the human mirror neuron system (MNS) in understanding actions and intentions is widely accepted, little is known about the time course within which these aspects of visual information are automatically extracted. Event-Related Potentials were recorded in 35 university students perceiving 260 pictures of cooperative (e.g., 2 people dragging a box) or affective (e.g., 2 people smiling and holding hands) interactions. The action's goal was automatically discriminated at about 150-170 ms, as reflected by the occipito/temporal N170 response. The swLORETA inverse solution revealed the strongest sources in the right posterior cingulate cortex (CC) for affective actions and in the right pSTS for cooperative actions. A right hemispheric asymmetry was found that involved the fusiform gyrus (BA37), the posterior CC, and the medial frontal gyrus (BA10/11) for the processing of affective interactions, particularly in the 155-175 ms time window. In a later time window (200-250 ms) the processing of cooperative interactions activated the left post-central gyrus (BA3), the left parahippocampal gyrus, the left superior frontal gyrus (BA10), as well as the right premotor cortex (BA6). Women showed a greater response discriminative of the action's goal than men at the P300 and anterior negativity level (220-500 ms). These findings might be related to a greater responsiveness of the female vs. male MNS. In addition, the discriminative effect was bilateral in women and was smaller and left-sided in men. Evidence was provided that perceptually similar social interactions are discriminated on the basis of the agents' intentions quite early in neural processing, differentially activating regions devoted to face/body/action coding, the limbic system, and the MNS.
Citation: Proverbio AM, Riva F, Paganelli L, Cappa SF, Canessa N, et al. (2011) Neural Coding of Cooperative vs. Affective Human Interactions: 150 ms to Code the Action's Purpose. PLoS ONE 6(7): e22026.
There is ample evidence that human primates strive for social contact and experience interactions with conspecifics as intrinsically rewarding. Focusing on gaze behavior as a crucial means of human interaction, this study employed a unique combination of neuroimaging, eye-tracking, and computer-animated virtual agents to assess the neural mechanisms underlying this component of behavior. In the interaction task, participants believed that during each interaction the agent’s gaze behavior could either be controlled by another participant or by a computer program. Their task was to indicate whether they experienced a given interaction as an interaction with another human participant or with the computer program, based on the agent’s reaction. Unbeknownst to them, the agent was always controlled by a computer to enable a systematic manipulation of gaze reactions by varying the degree to which the agent engaged in joint attention. This made it possible to distinguish neural activity underlying the subjective experience of being engaged in social versus non-social interaction. In contrast to previous research, this allows measuring neural activity while participants experience active engagement in real-time social interactions. Results demonstrate that gaze-based interactions with a perceived human partner are associated with activity in the ventral striatum, a core component of reward-related neurocircuitry. In contrast, interactions with a computer-driven agent activate attention networks. Comparisons of neural activity during interaction with behaviorally naïve and explicitly cooperative partners demonstrate different temporal dynamics of the reward system and indicate that the mere experience of engagement in social interaction is sufficient to recruit this system.
Human Brain Mapping, 2021
Sharing emotional experiences impacts how we perceive and interact with the world, but the neural mechanisms that support this sharing are not well characterized. In this study, participants (N = 52) watched videos in an MRI scanner in the presence of an unfamiliar peer. Videos varied in valence and social context (i.e., participants believed their partner was viewing the same (joint condition) or a different (solo condition) video). Reported togetherness increased during positive videos regardless of social condition, indicating that positive contexts may lessen the experience of being alone. Two analysis approaches were used to examine both sustained neural activity averaged over time and dynamic synchrony throughout the videos. Both approaches revealed clusters in the medial prefrontal cortex that were more responsive to the joint condition. We observed a time-averaged social-emotion interaction in the ventromedial prefrontal cortex, although this region did not demonstrate synchrony effects. Alternatively, social-emotion interactions in the amygdala and superior temporal sulcus showed greater neural synchrony in the joint compared to solo conditions during positive videos, but the opposite pattern for negative videos. These findings suggest that positive stimuli may be more salient when experienced together, suggesting a mechanism for forming social bonds.
Brain and Cognition, 2009
Valence and arousal are thought to be the primary dimensions of human emotion. However, the degree to which valence and arousal interact in determining brain responses to emotional pictures is still elusive. This functional MRI study aimed to delineate neural systems responding to valence and arousal, and their interaction. We measured neural activation in healthy females (N = 23) to affective pictures using a 2 (Valence) × 2 (Arousal) design. Results show that arousal was preferentially processed by middle temporal gyrus, hippocampus and ventrolateral prefrontal cortex. Regions responding to negative valence included visual and lateral prefrontal regions; positive valence activated middle temporal and orbitofrontal areas. Importantly, distinct arousal-by-valence interactions were present in anterior insula (negative pictures), and in occipital cortex, parahippocampal gyrus and posterior cingulate (positive pictures). These data demonstrate that the brain not only differentiates between valence and arousal but also responds to specific combinations of these two, thereby highlighting the sophisticated nature of emotion processing in (female) human subjects.
Frontiers in Human Neuroscience, 2010
Several studies have investigated the neural responses triggered by emotional pictures, but the specificity of the involved structures such as the amygdala or the ventral striatum is still under debate. Furthermore, only a few studies have examined the association between stimuli's valence and arousal and the underlying brain responses. Therefore, we investigated brain responses with functional magnetic resonance imaging in 17 healthy participants to pleasant and unpleasant affective pictures and afterwards assessed ratings of valence and arousal. As expected, unpleasant pictures strongly activated the right and left amygdala, the right hippocampus, and the medial occipital lobe, whereas pleasant pictures elicited significant activations in left occipital regions and in parts of the medial temporal lobe. The direct comparison of unpleasant and pleasant pictures, which were comparable in arousal, clearly indicated stronger amygdala activation in response to the unpleasant pictures. Most important, correlational analyses revealed on the one hand that the arousal of unpleasant pictures was significantly associated with activations in the right amygdala and the left caudate body. On the other hand, valence of pleasant pictures was significantly correlated with activations in the right caudate head, extending to the nucleus accumbens (NAcc), and the left dorsolateral prefrontal cortex. These findings support the notion that the amygdala is primarily involved in processing of unpleasant stimuli, particularly more arousing unpleasant stimuli. Reward-related structures like the caudate and NAcc primarily respond to pleasant stimuli, and the more positive the valence of these stimuli, the stronger the response.
This Research Topic features several papers tapping the situated nature of emotion and social cognition processes. The volume covers a broad scope of methodologies [behavioral assessment, functional magnetic resonance imaging (fMRI), structural neuroimaging, event-related potentials (ERPs), brain connectivity, and peripheral measures], populations (non-human animals, neurotypical participants, developmental studies, and neuropsychiatric and pathological conditions), and article types (original research, review papers, and opinion articles). Through this wide-ranging proposal, we introduce a fresh approach to the study of contextual effects in emotion and social cognition domains.
Previous studies have reported the effect of emotion regulation (ER) strategies on both individual and social decision-making; however, the effect of regulation on socially driven emotions independent of decisions is still unclear. In the present study, we investigated the neural effects of using reappraisal to both up- and down-regulate socially driven emotions. Participants played the Dictator Game (DG) in the role of recipient while undergoing fMRI, and concurrently applied the strategy of either up-regulation (reappraising the proposer's intentions as more negative), down-regulation (reappraising the proposer's intentions as less negative), or a baseline "look" condition. Results showed that the regions responding to the implementation of reappraisal (effect of strategy, that is, "regulating regions") were the inferior and middle frontal gyrus, temporoparietal junction, and insula bilaterally. Importantly, the middle frontal gyrus activation correlated with the frequency of regulatory strategies in daily life, and the insula activation correlated with the perceived ability to reappraise the emotions elicited by the social situation. Regions regulated by reappraisal (effect of regulation, that is, "regulated regions") were the striatum, the posterior cingulate, and the insula, showing increased activation for up-regulation and reduced activation for down-regulation, both compared to the baseline condition. When analyzing the separate effects of partners' behavior, selfish behavior produced an activation of the insula that was not observed when subjects were treated altruistically. Here we show for the first time that interpersonal ER strategies can strongly affect neural responses when experiencing socially driven emotions. Clinical implications of these findings are also discussed to understand how the way we interpret others' intentions may affect the way we emotionally react.
The emotional matching paradigm, introduced by Hariri and colleagues in 2000, is a widely used neuroimaging experiment that reliably activates the amygdala. In the classic version of the experiment, faces with negative emotional expression and scenes depicting distressing events are compared with geometric shapes instead of neutral stimuli of the same category (i.e. faces or scenes). This makes it difficult to clearly attribute amygdala activation to the emotional valence and not to the social content. To improve this paradigm, we conducted a functional magnetic resonance imaging study in which emotionally neutral and, additionally, positive stimuli within each stimulus category (i.e. faces, social and non-social scenes) were included. These categories enabled us to differentiate the exact nature of observed effects in the amygdala. First, the main findings of the original paradigm were replicated. Second, we observed amygdala activation when comparing negative to neutral stimuli of the same category. However, for negative faces, the amygdala response habituated rapidly. Third, positive stimuli were associated with widespread activation including the insula and the caudate. This validated adaptation study enables more precise statements on the neural activation underlying emotional processing. These advances may benefit future studies on identifying selective impairments in emotional and social stimulus processing. Amygdala functioning is of high interest for clinical psychology, psychiatry and neuroscience, as heightened amygdala activation has been reported in various patient groups [1-5]. The emotional matching paradigm by Hariri et al. [6] and its extended version [7] are widely used as emotional reactivity measures, which reliably activate the amygdala [8-10]. Despite its current use in psychiatry, this paradigm has a potential drawback, since faces with negative emotional expressions and negative social scenes are compared with simple geometric shapes.
Thus, it compares pictures that differ in more than one domain: social content and emotional valence. It is therefore difficult to draw conclusions about which of the two domains causes the increase in amygdala activation. This differentiation may arguably not be relevant for all purposes, but to study specific populations, such as patients with deficits in one or the other domain (e.g. those with autism spectrum disorder (ASD)) [11,12], it is crucial to distinguish the two. A second issue is that negative emotions have been studied more widely than positive emotions, as exemplified by the original emotional matching paradigm, putatively due to their high functional value for action. For example, previous research suggests that threatening scenes, whether or not they contained faces or other human features, elicited activation in the extrastriate body area, suggesting that this activity to threatening scenes represents the capacity of the brain to associate certain situations with threat, in order to prepare for fast reactions [13]. Positive emotions are, however, the other side of the coin, as they allow psychological growth and well-being [14]. Positive stimuli are most commonly used in the context of reward experiments, for example in performance-based feedback tasks [15,16]. A brain region that has been related to the processing of various reward types, ranging from primary reinforcers to more abstract social rewards [17,18], is the ventral striatum [19]. Meta-analytically, too, the ventral striatum elicits the strongest activation across the different reward types, such as monetary, food and erotic rewards [20]. The amygdala has also been found to be activated in response to positive stimuli. For direct comparisons of positive and negative faces, not all studies found amygdala activation differences [21], but a meta-analysis of 1
Brain Structure and Function, 2010
Functional neuroimaging investigations in the fields of social neuroscience and neuroeconomics indicate that the anterior insular cortex (AI) is consistently involved in empathy, compassion, and interpersonal phenomena such as fairness and cooperation. These findings suggest that AI plays an important role in social emotions, hereby defined as affective states that arise when we interact with other people and that depend on the social context. After we link the role of AI in social emotions to interoceptive awareness and the representation of current global emotional states, we will present a model suggesting that AI is not only involved in representing current states, but also in predicting emotional states relevant to the self and others. This model also proposes that AI enables us to learn about emotional states as well as about the uncertainty attached to events, and implies that AI plays a dominant role in decision making in complex and uncertain environments. Our review further highlights that dorsal and ventro-central, as well as anterior and posterior subdivisions of AI potentially subserve different functions and guide different aspects of behavioral regulation. We conclude with a section summarizing different routes to understanding other people's actions, feelings and thoughts, emphasizing the notion that the predominant role of AI involves understanding others' feeling and bodily states rather than their action intentions or abstract beliefs.
2012
Degree of emotional valence and arousal have been shown to covary with blood oxygen level dependent (BOLD) signal levels in several brain structures. Here we studied brain activity in 17 healthy subjects during perception of facial expressions varying in valence and arousal using functional magnetic resonance imaging (fMRI). Our results revealed correlations with the perceived valence in dorsolateral and ventrolateral prefrontal cortex, dorsomedial prefrontal cortex, and anterior insula. These findings corroborate the results of our previous study, where we used pictures of varying valence taken from the International Affective Picture System (IAPS). Together, the results of these two studies suggest the existence of common brain areas processing the valence of both emotional pictures and facial expressions. Additionally, in the present study, BOLD signal exhibited a distinctive dependency on perceived valence in the intraparietal sulcus and supramarginal gyrus. BOLD activity correlated with negative and positive valence in separate cortical areas, and some areas demonstrated either a U-shaped or an inverted U-shaped relationship with valence (i.e., either minimal or maximal activation was observed for neutral expressions). This nonlinear dependency suggests that the brain mechanisms underlying perception of negative and positive valence are at least to some extent independent. Perceived arousal correlated positively with the strength of the BOLD signal only in the left inferior frontal gyrus, which is an important node of the mirror neuron system.
Neuroscience, 2008
Reading the facial expressions of other people is a fundamental skill for social interaction. Human facial expressions of emotions are readily recognized but may also evoke the same experiential emotional state in the observer. We used event-related functional magnetic resonance imaging and multi-channel electroencephalography to determine in 14 right-handed healthy volunteers (29 ± 6 years) which brain structures mediate the perception of such a shared experiential emotional state. Statistical parametric mapping showed that an area in the dorsal medial frontal cortex was specifically activated during the perception of emotions that reflected the seen happy and sad emotional face expressions. This area mapped to the pre-supplementary motor area, which plays a central role in the control of behavior. Low-resolution brain electromagnetic tomography-based analysis of the encephalographic data revealed that the activation was detected 100 ms after face presentation onset, lasting until 740 ms. Our observation substantiates recently emerging evidence suggesting that the subjective perception of an experiential emotional state (empathy) is mediated by the involvement of the dorsal medial frontal cortex.