2014, Cognitive, Affective, & Behavioral Neuroscience
Recent evidence suggests a relative right-hemispheric specialization for emotional prosody perception, whereas linguistic prosody perception is under bilateral control. It is still unknown, however, how the hemispheric specialization for prosody perception might arise. Two main hypotheses have been put forward. Cue-dependent hypotheses, on the one hand, propose that hemispheric specialization is driven by specialization for the non-prosody-specific processing of acoustic cues. The functional lateralization hypothesis, on the other hand, proposes that hemispheric specialization depends on the communicative function of prosody, with emotional and linguistic prosody processing lateralized to the right and left hemispheres, respectively. In the present study, the functional lateralization hypothesis of prosody perception was tested systematically by instructing one group of participants to evaluate the emotional prosody dimension, and another group the linguistic prosody dimension, of bidimensional prosodic stimuli in a dichotic-listening paradigm while event-related potentials were recorded. The results showed that the right-ear advantage was associated with decreased latencies for an early negativity in the contralateral hemisphere. No evidence was found for functional lateralization. These findings suggest that functional lateralization effects for prosody perception are small and support the structural model of dichotic listening.
Neuroimage, 2004
Speech prosody is processed in neither a single region nor a specific hemisphere, but engages multiple areas comprising a large-scale spatially distributed network in both hemispheres. It remains to be elucidated whether hemispheric lateralization is based on higher-level prosodic representations or lower-level encoding of acoustic cues, or both. A cross-language (Chinese; English) fMRI study was conducted to examine brain activity elicited by selective attention to Chinese intonation (I) and tone (T) presented in three-syllable (I3, T3) and one-syllable (I1, T1) utterance pairs in a speeded-response discrimination paradigm. The Chinese group exhibited greater activity than the English group in a left inferior parietal region across tasks (I1, I3, T1, T3). Only the Chinese group exhibited a leftward asymmetry in inferior parietal and posterior superior temporal (I1, I3, T1, T3), anterior temporal (I1, I3, T1, T3), and frontopolar (I1, I3) regions. Both language groups shared a rightward asymmetry in the mid portions of the superior temporal sulcus and middle frontal gyrus, irrespective of prosodic unit or temporal interval. Hemispheric laterality effects enable us to distinguish brain activity associated with higher-order prosodic representations in the Chinese group from that associated with lower-level acoustic/auditory processes shared among listeners regardless of language experience. Lateralization is influenced by language experience, which shapes the internal prosodic representation of an external auditory signal. We propose that speech prosody perception is mediated primarily by the right hemisphere, but is left-lateralized to task-dependent regions when language processing is required beyond the auditory analysis of the complex sound.
Neuropsychologia
Cognitive functions, for example speech processing, are distributed asymmetrically across the two hemispheres, which have largely homologous anatomical structures. Dichotic listening is a well-established paradigm for investigating hemispheric lateralization of speech. However, the mixed results of dichotic listening, especially when tonal languages are used as stimuli, complicate the investigation of functional lateralization. We hypothesized that the inconsistent results in dichotic listening are due to an interaction in processing a mixture of acoustic and linguistic attributes that are differentially processed over the two hemispheres. In this study, a within-subject dichotic listening paradigm was designed in which different levels of speech and linguistic information were incrementally included in different conditions that required the same tone identification task. A left ear advantage (LEA), in contrast with the commonly found right ear advantage (REA) in dichotic listening, was observed in the hummed-tones condition, in which only the slow frequency modulation of tones was included. However, when phonemic and lexical information was added in the simple-vowel tone conditions, the LEA became unstable. Furthermore, ear preference became balanced when phonological and lexical-semantic attributes were included in the consonant-vowel (CV), pseudo-word, and word conditions. Compared with existing REA results that used complex-vowel word tones, a complete pattern emerged, gradually shifting from LEA to REA. These results support the hypothesis that the acoustic analysis of suprasegmental information in tones is preferentially processed in the right hemisphere but is influenced by phonological and lexical-semantic processes residing in the left hemisphere. The ear preference in dichotic listening thus depends on the level of speech and linguistic analysis and lateralizes preferentially across the two hemispheres.
That is, the manifestation of functional lateralization depends on the integration of information across the two hemispheres.
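The ear-advantage measures discussed in these abstracts (LEA, REA) are conventionally summarized with a laterality index over per-ear correct reports. As a hedged illustration only — the counts below are hypothetical, not data from any of the studies — a minimal sketch:

```python
# Laterality index for dichotic listening: LI = (R - L) / (R + L),
# where R and L are correct-report counts for the right and left ear.
def laterality_index(right_correct, left_correct):
    """Positive values indicate a right ear advantage (REA),
    negative values a left ear advantage (LEA)."""
    total = right_correct + left_correct
    if total == 0:
        return 0.0  # no responses: no measurable advantage
    return (right_correct - left_correct) / total

# Hypothetical hummed-tones condition favouring the left ear (LEA)
print(laterality_index(18, 26))  # -> -0.18181818181818182
```

Some labs normalize instead by total trials or use (R − L)/(R + L) × 100; the sign convention above (positive = REA) matches the usage in the abstracts.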
This study examined the effect of sad prosody on hemispheric specialization for word processing using behavioral and electrophysiological measures. A dichotic listening task combining focused attention and signal-detection methods was conducted to evaluate the detection of a word spoken in neutral or sad prosody. An overall right ear advantage together with leftward lateralization in early (150–170 ms) and late (240–260 ms) processing stages was found for word detection, regardless of prosody. Furthermore, the early stage was most pronounced for words spoken in neutral prosody, showing greater negative activation over the left than the right hemisphere. In contrast, the later stage was most pronounced for words spoken with sad prosody, showing greater positive activation over the left than the right hemisphere. The findings suggest that sad prosody alone was not sufficient to modulate hemispheric asymmetry in word-level processing. We posit that lateralized effects of sad prosody on word processing are largely dependent on the psychoacoustic features of the stimuli as well as on task demands.
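The combination of focused attention and signal-detection methods described above yields hit and false-alarm counts per ear, from which a sensitivity index d′ is typically derived. A minimal sketch of the standard computation — not the study's own analysis code, and the counts are hypothetical:

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate).
    A log-linear correction (add 0.5 to counts, 1 to totals) avoids
    infinite z-scores when a rate would be exactly 0 or 1."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical word-detection counts for one ear
print(round(d_prime(hits=45, misses=5, false_alarms=8, correct_rejections=42), 2))
```

Comparing d′ between ears (rather than raw accuracy) separates detection sensitivity from response bias, which is the usual motivation for adding signal detection to a dichotic task.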
Brain and Cognition, 2009
The present study investigated the influence of within- and between-ear congruency on interference and laterality effects in an auditory semantic/prosodic conflict task. Participants were presented dichotically with words (e.g., mad, sad, glad) pronounced in either congruent or incongruent emotional tones (e.g., angry, happy, or sad) and identified a target word or emotion under one of two conditions. In the within-ear condition, the congruent or incongruent dimensions were bound within a single stimulus and therefore presented to the same ear. In the between-ear condition, the two dimensions were split between two stimuli and therefore presented to separate ears. Findings indicated interference in both conditions. However, the expected right ear advantage for words and left ear advantage for emotions were obtained only in the between-ear condition. Factors involved in producing interference and laterality effects in dichotic listening tasks are discussed.
Lateralization of cognitive functioning is a well-established principle of cerebral organization. The left and right hemispheres are known to play distinct and complementary roles in the processing of information. What is still unclear is whether these asymmetrically lateralized functions have a common or distinct developmental origin: are left and right processes lateralized through common causal influences, or is the laterality of each function independently determined? Left- and right-lateralized functions are commonly assessed in isolation, with little attention to the relationship in the degree and direction of lateralization within individuals. Here, the relationship between left-hemisphere processing of speech sounds and right-hemisphere processing of emotional vocalizations was examined using dichotic listening tasks. An overall complementary pattern of lateralization was observed across participants, but no significant relationship was found for the degree of lateralization of speech and emotional vocalization processing within individuals. These results support the view that functions in the left and right hemispheres are independently lateralized.
Neuropsychologia, 2003
In dichotic listening, a right ear advantage for linguistic tasks reflects left hemisphere specialization, and a left ear advantage for prosodic tasks reflects right hemisphere specialization. Three experiments used a response hand manipulation with a dichotic listening task to distinguish between direct access (relative specialization) and callosal relay (absolute specialization) explanations of perceptual asymmetries for linguistic and prosodic processing. Experiment 1 found evidence for direct access in linguistic processing and callosal relay in prosodic processing. Direct access for linguistic processing was found to depend on lexical status (Experiment 2) and affective prosody (Experiment 3). Results are interpreted in terms of a dynamic model of hemispheric specialization in which right hemisphere contributions to linguistic processing emerge when stimuli are words, and when they are spoken with affective prosody.
Progress in Brain Research, 2006
Recently, research on the lateralization of linguistic and nonlinguistic (emotional) prosody has experienced a revival. However, neither neuroimaging nor patient evidence draws a coherent picture substantiating right-hemispheric lateralization of prosody, and of emotional prosody in particular. The current overview summarizes positions and data on the lateralization of emotion and emotional prosodic processing in the brain and proposes that (1) emotional prosodic processing in the brain is realized through differentially lateralized subprocesses and (2) methodological factors can influence the lateralization of emotional prosody in neuroimaging investigations. The latter evidence reveals that emotional valence effects are strongly right-lateralized in studies using compact blocked presentation of emotional stimuli. In contrast, data obtained from event-related studies are indicative of bilateral or left-accented lateralization of emotional prosodic valence. These findings suggest a strong interaction between language and emotional prosodic processing.
Neuropsychologia, 2011
It is unclear whether there is hemispheric specialization for prosodic perception and, if so, what the nature of this hemispheric asymmetry is. Using the lesion approach, many studies have attempted to test whether there is hemispheric specialization for emotional and linguistic prosodic perception by examining the impact of left vs. right hemispheric damage on prosodic perception task performance. However, so far no consensus has been reached. In an attempt to find a consistent pattern of lateralization for prosodic perception, a meta-analysis was performed on 38 lesion studies of prosodic perception (including 450 left-hemisphere-damaged patients, 534 right-hemisphere-damaged patients, and 491 controls). It was found that both left and right hemispheric damage compromise emotional and linguistic prosodic perception task performance. Furthermore, right hemispheric damage degraded emotional prosodic perception more than left hemispheric damage (trimmed g = −0.37, 95% CI [−0.66; −0.09], N = 620 patients). It is concluded that prosodic perception is under bihemispheric control, with relative specialization of the right hemisphere for emotional prosodic perception.
► Both cerebral hemispheres are necessary for adequate emotional and linguistic prosodic perception.
► The right hemisphere is more important than the left hemisphere for emotional prosodic perception.
► Prosodic processing is under bihemispheric control with relative right-hemispheric lateralization for emotional perception.
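The trimmed g reported above is a robust variant of Hedges' g. As a hedged illustration of how a plain (untrimmed) Hedges' g and its 95% CI are obtained from two groups' summary statistics — every number below is hypothetical, not the meta-analysis data — a minimal sketch:

```python
from math import sqrt

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference (Cohen's d) with Hedges'
    small-sample correction J = 1 - 3 / (4*df - 1), plus an
    approximate 95% confidence interval."""
    df = n1 + n2 - 2
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (mean1 - mean2) / pooled_sd
    j = 1 - 3 / (4 * df - 1)   # small-sample correction factor
    g = j * d
    # Approximate sampling variance of g, then a normal-theory 95% CI
    var_g = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * df))
    half = 1.96 * sqrt(var_g)
    return g, (g - half, g + half)

# Hypothetical example: right-hemisphere-damaged patients scoring lower
# than left-hemisphere-damaged patients on an emotional prosody task
g, ci = hedges_g(mean1=70.0, mean2=78.0, sd1=12.0, sd2=11.0, n1=30, n2=30)
print(round(g, 2), tuple(round(x, 2) for x in ci))
```

A negative g with a CI excluding zero, as in the meta-analysis, indicates the first group performs reliably worse; trimmed estimators additionally down-weight outlying study effects.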
Neuroscience Letters, 1998
Magnetic brain responses to deviant [ba] syllables presented among repetitive standard [da] syllables were recorded in subjects who either attended to these stimuli in order to discriminate the [ba] syllables or ignored them while watching a silent movie. In both conditions, the deviant syllables elicited a mismatch response (MMNm, the magnetic counterpart of mismatch negativity) that was stronger in the left than in the right auditory cortex, indicating left-hemispheric dominance in speech processing already at a preattentive processing level.