2020
Does language change what we perceive? Does speaking different languages cause us to perceive things differently? We review the behavioral and electrophysiological evidence for the influence of language on perception, with an emphasis on the visual modality. Effects of language on perception can be observed both in higher-level processes such as recognition, and in lower-level processes such as discrimination and detection. A consistent finding is that language causes us to perceive in a more categorical way. Rather than being fringe or exotic, as they are sometimes portrayed, we discuss how effects of language on perception naturally arise from the interactive and predictive nature of perception.
Journal of Cultural Cognitive Science
Language and perception are two central cognitive systems. Until relatively recently, however, the interaction between them has been examined only partially and not from an over-arching theoretical perspective. Yet it has become clear that linguistic and perceptual interactions are essential to understanding both typical and atypical human behaviour. In this editorial, we examine the link between language and perception across three domains. First, we present a brief review of work investigating the importance of perceptual features, particularly shape bias, when learning names for novel objects—a critical skill acquired during language development. Second, we describe the Visual World Paradigm, an experimental method uniquely suited to investigate the language-perception relationship. Studies using the Visual World Paradigm demonstrate that the relationship between linguistic and perceptual information during processing is both intricate and bi-directional: linguistic cues guide in...
Journal of Vision, 2010
Different languages divide the color spectrum in different ways. Can such linguistic codes affect color discrimination? Results of three experiments suggest that color language can influence people's color judgments even in conditions when all color stimuli are present at the same time and need not be stored in memory. Two experiments showed that color-discrimination performance within a language group is affected by verbal interference (and not spatial interference). The third experiment showed that color-discrimination performance across a boundary that exists in one language but not another can be altered by linguistic interference only for the language group that codes that linguistic distinction.
Journal of Neuroscience, 2010
We examined the effect of linguistic comprehension on early perceptual encoding in a series of electrophysiological and behavioral studies on humans. Using the fact that pictures of faces elicit a robust and reliable evoked response that peaks at ~170 ms after stimulus onset (N170), we measured the N170 to faces that were preceded by primes that referred to either faces or scenes. When the primes were auditory sentences, the magnitude of the N170 was larger when the face stimuli were preceded by sentences describing faces compared to sentences describing scenes. In contrast, when the primes were visual, the N170 was smaller after visual primes of faces compared to visual primes of scenes. Similar opposing effects of linguistic and visual primes were also observed in a reaction time experiment in which participants judged the gender of faces. These results provide novel evidence of the influence of language on early perceptual processes and suggest a surprising mechanistic description of this interaction: linguistic primes produce content-specific interference on subsequent visual processing. This interference may be a consequence of the natural statistics of language and vision given that linguistic content is generally uncorrelated with the contents of perception.
Journal of Experimental Psychology: General, 2015
Language and vision are highly interactive. Here we show that people activate language when they perceive the visual world, and that this language information impacts how speakers of different languages focus their attention. For example, when searching for an item (e.g., clock) in the same visual display, English and Spanish speakers look at different objects. Whereas English speakers searching for the clock also look at a cloud, Spanish speakers searching for the clock also look at a gift, because the Spanish names for gift (regalo) and clock (reloj) overlap phonologically. These different looking patterns emerge despite an absence of direct language input, showing that linguistic information is automatically activated by visual scene processing. We conclude that the varying linguistic information available to speakers of different languages affects visual perception, leading to differences in how the visual world is processed.
Previous studies have shown that language can modulate visual perception, by biasing and/or enhancing perceptual performance. However, it is still debated where in the brain visual and linguistic information are integrated, and whether the effects of language on perception are automatic and persist even in the absence of awareness of the linguistic material. Here, we aimed to explore the automaticity of language-perception interactions and the neural loci of these interactions in an fMRI study. Participants engaged in a visual motion discrimination task (upward or downward moving dots). Before each trial, a word prime was briefly presented that implied upward or downward motion (e.g., "rise", "fall"). These word primes strongly influenced behavior: congruent motion words sped up reaction times and improved performance relative to incongruent motion words. Neural congruency effects were only observed in the left middle temporal gyrus, showing higher activity for congruent compared to incongruent conditions. This suggests that higher-level conceptual areas rather than sensory areas are the locus of language-perception interactions. When motion words were rendered unaware by means of masking, they still affected visual motion perception, suggesting that language-perception interactions may rely on automatic feed-forward integration of perceptual and semantic material in language areas of the brain.
Brain Sciences, 2018
Can experience change perception? Here, we examine whether language experience shapes the way individuals process auditory and visual information. We used the McGurk effect—the discovery that when people hear a speech sound (e.g., “ba”) and see a conflicting lip movement (e.g., “ga”), they recognize it as a completely new sound (e.g., “da”). This finding that the brain fuses input across auditory and visual modalities demonstrates that what we hear is profoundly influenced by what we see. We find that cross-modal integration is affected by language background, with bilinguals experiencing the McGurk effect more than monolinguals. This increased reliance on the visual channel is not due to decreased language proficiency, as the effect was observed even among highly proficient bilinguals. Instead, we propose that the challenges of learning and monitoring multiple languages have lasting consequences for how individuals process auditory and visual information.
2019
As the world experiences increased international mobility, we encounter those from different racial, ethnic, and linguistic backgrounds. Therefore it is increasingly crucial to examine the ways we categorize and perceive other people. The main objective of this dissertation is to examine whether language is a dimension of social categorization, and whether this affects face perception. We also examined whether language categorization interacts with race categories, and whether this interaction affects the perception of other race faces. These issues were investigated in three studies. Firstly, we used behavioral and event-related potential techniques in an oddball paradigm to test whether language categorization affects visual face perception. We demonstrated that indeed, language is used as a social category, and this categorization affects the early stages of visual perception of faces. Secondly, we examined how language interacts with race in creating social categories. By using ...
Cognitive Neuropsychology, 2020
The role that language plays in shaping non-linguistic cognitive and perceptual systems has been the subject of much theoretical and experimental attention over the past half-century. Understanding how language interacts with non-linguistic systems can provide insight into broader constraints on cognitive and brain organization. The papers that form this volume investigate various ways in which linguistic structure can interact with and influence how speakers think about and perceive the world, and the related issue of the constraints that in turn shape linguistic representations. These theoretical and empirical contributions support deeper understanding of the interactions between language, thought, and perception, and motivate new approaches for developing directional predictions at both the neural and cognitive levels.
BRIGHT: A Journal of English Language Teaching, Linguistics and Literature, 2017
There are several factors that underlie language choice. A factor often discussed by linguists is the sociological factor. This article assumes that there is another factor that is also very influential in the choice of language. Perception is one of the reasons why people choose a language or change the language they use. This article presents a concept in which language and mind collaborate. This work provides a short explanation of a fundamental cognitive process of the brain and the effect it creates. This work demonstrates that the choice of human language is an effect of human perception. This perception becomes a consideration for someone when deciding whether to utter low or high intonation, which language level is appropriate, or which diction to use. Understanding how language can change as an effect of human perception is important in linguistics. Therefore, this article explores how a person interprets and responds through language on the basis of his or her perception.
Psychological Science, 2018
Can our native language influence what we consciously perceive? Although evidence that language modulates visual discrimination has been accumulating, little is known about the relation between language structure and consciousness. We employed electroencephalography and the attentional-blink paradigm, in which targets are often unnoticed. Native Greek speakers (N = 28), who distinguish categorically between light and dark shades of blue, showed boosted perception for this contrast compared with a verbally unmarked green contrast. Electrophysiological signatures of early visual processing predicted this behavioral advantage. German speakers (N = 29), who have only one category for light and dark shades of blue, showed no differences in perception between blue and green targets. The behavioral consequence of categorical perception was replicated with Russian speakers (N = 46), reproducing this novel finding. We conclude that linguistic enhancement of color contrasts provides target...
It is now established that native language affects one’s perception of the world. However, it is unknown whether this effect is merely driven by conscious, language-based evaluation of the environment or whether it reflects fundamental differences in perceptual processing between individuals speaking different languages. Using brain potentials, we demonstrate that the existence in Greek of 2 color terms—ghalazio and ble—distinguishing light and dark blue leads to greater and faster perceptual discrimination of these colors in native speakers of Greek than in native speakers of English. The visual mismatch negativity, an index of automatic and preattentive change detection, was similar for blue and green deviant stimuli during a color oddball detection task in English participants, but it was significantly larger for blue than green deviant stimuli in native speakers of Greek. These findings establish an implicit effect of language-specific terminology on human color perception.
Конференции, 2021
Language and perception have always been two contested topics in the cognitive system. The interaction between them has been studied by linguists over the years only partially, not completely. From what we were able to discover, linguistic and perceptual interactions in human beings can explain a great deal when it comes to the comprehension of typical and atypical human behaviours. In this editorial, we thoroughly examine the language-perception relationship. In conclusion, while there is compelling evidence to support a fundamental relationship between the linguistic and perceptual systems, the exact levels at which the two systems interact, the timing of the interaction, and what drives the interaction remain largely open questions for future work to illuminate.
Scientific Reports
Seeing an object is a natural source for learning about the object’s configuration. We show that language can also shape our knowledge about visual objects. We investigated sign language that enables deaf individuals to communicate through hand movements with as much expressive power as any other natural language. A few signs represent objects in a specific orientation. Sign-language users (signers) recognized visual objects faster when oriented as in the sign, and this match in orientation elicited specific brain responses in signers, as measured by event-related potentials (ERPs). Further analyses suggested that signers’ responsiveness to object orientation derived from changes in the visual object representations induced by the signs. Our results also show that language facilitates discrimination between objects of the same kind (e.g., different cars), an effect never reported before with spoken languages. By focusing on sign language we could better characterize the impact of la...
Linguistic labels (e.g., "chair") seem to activate visual properties of the objects to which they refer. Here we investigated whether language-based activation of visual representations can affect the ability to simply detect the presence of an object. We used continuous flash suppression to suppress visual awareness of familiar objects while they were continuously presented to one eye. Participants made simple detection decisions, indicating whether they saw any image. Hearing a verbal label before the simple detection task changed performance relative to an uninformative cue baseline. Valid labels improved performance relative to no-label baseline trials. Invalid labels decreased performance. Labels affected both sensitivity (d′) and response times. In addition, we found that the effectiveness of labels varied predictably as a function of the match between the shape of the stimulus and the shape denoted by the label. Together, the findings suggest that facilitated detection of invisible objects due to language occurs at a perceptual rather than semantic locus. We hypothesize that when information associated with verbal labels matches stimulus-driven activity, language can provide a boost to perception, propelling an otherwise invisible image into awareness.
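As a reminder of what the sensitivity measure mentioned above means (d′ is the standard signal-detection quantity; this is not code or data from the study itself), here is a minimal sketch of how d′ is typically computed from hit and false-alarm rates. The numeric values are purely illustrative.

```python
from scipy.stats import norm

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

# Hypothetical values: a valid label boosting detection relative to an
# uninformative cue would show up as a higher d'.
print(round(d_prime(0.85, 0.20), 2))  # ~1.88 (valid label)
print(round(d_prime(0.70, 0.20), 2))  # ~1.37 (no-label baseline)
```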
Color perception has been a traditional test-case of the idea that the language we speak affects our perception of the world [1]. It is now established that categorical perception of color is verbally mediated and varies with culture and language [2]. However, it is unknown whether the well-demonstrated language effects on color discrimination really reach down to the level of visual perception, or whether they only reflect post-perceptual cognitive processes. Using brain potentials in a color oddball detection task with Greek and English speakers, we demonstrate that language effects may exist at a level that is literally perceptual, suggesting that speakers of different languages have differently structured minds. Categorical perception is a term used to describe people's tendency to perceive perceptual continua such as color as discontinuous discrete categories, resulting in finer discriminations across category boundaries than within category boundaries [3,4]. It is now widely accepted that categorical perception of color is constrained by language. Whether comparing populations from traditional remote cultures [5-7] or populations matched for technological sophistication and education [8,9], the findings unequivocally show a discrimination advantage for cross-category over within-category stimuli consistent with the individual's linguistic partition of the color spectrum. This has been shown in both offline similarity judgment tasks [10] as well as in online perceptual matching tasks [11]. However, although these studies suggest that we perceive color categorically, it has been argued that the term Categorical 'Perception' is a misnomer, as it is not clear whether response patterns reflect low-level perceptual processes rather than higher-level post-perceptual memory or language processes [12-14]. Thus we cannot dismiss the assumption of a set of universal color categories, which are hard-wired in the human visual system [15-17].

In Thierry, Athanasopoulos, Wiggett, Dering and Kuipers [18], we measured brain potentials in Greek and English speakers to test the extent to which pre-attentive and unconscious aspects of perception are affected by an individual's native language. Greek differentiates between a light (ghalazio) and a dark (ble) shade of blue [19]. In two experimental blocks, all stimuli were light or dark blue, and in the other two blocks the stimuli were light or dark green. We instructed participants to press a button when and only when they saw a square shape (target, probability 20%) within a regularly paced stream of circles (probability 80%). Within one block the most frequent stimulus was a light or dark circle (standard, probability 70%) and the remaining stimuli were circles of the same hue with a contrasting luminance (deviant, probability 10%), i.e., dark if the standard was light or vice versa. Crucially, in this study we analyzed brain wave patterns only for deviance in the color of the circles, not the shape of the stimulus, which was the focus of attention. We expected luminance deviants to elicit a visual mismatch negativity (vMMN) in all blocks, indexing pre-attentive change detection, which requires no active response on the part of the participants [20-22]. The vMMN is elicited by deviant (rare) stimuli in visual oddball paradigms, independently of the direction of focused attention [22], and is therefore considered automatic and pre-attentive [21,22]. Consistent with our predictions, we found a vMMN effect of similar magnitude for blue and green contrasts in native speakers of English, but Greek participants perceived luminance deviants as more different in the blue than in the green blocks, which led to a greater vMMN effect for blues. We subsequently explored differences at earlier latencies, focusing on the so-called P1, that is, the first positive peak elicited by visual stimuli over parieto-occipital regions of the scalp, to test for potential differences between participant groups in a time frame associated with activity in the primary and secondary visual cortices [23]. To our surprise, analysis of mean peak latencies and mean signal amplitudes between 100 and 130 ms revealed that the P1 peak followed a pattern of differences ...
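To make the stimulus proportions of the oddball design described above concrete (20% square targets, 70% standard circles, 10% luminance-deviant circles), here is a minimal illustrative sketch of how such a block could be generated. The function name, block length, and luminance coding are assumptions for illustration only, not the authors' actual procedure.

```python
import random

def make_oddball_block(n_trials=200, standard_luminance="light"):
    """Build one block: 20% square targets, 70% standard circles,
    10% circles deviating only in luminance (illustrative sketch)."""
    deviant_luminance = "dark" if standard_luminance == "light" else "light"
    trials = (
        [("square", standard_luminance)] * int(n_trials * 0.20)    # targets (respond)
        + [("circle", standard_luminance)] * int(n_trials * 0.70)  # standards
        + [("circle", deviant_luminance)] * int(n_trials * 0.10)   # deviants (elicit vMMN)
    )
    random.shuffle(trials)
    return trials

block = make_oddball_block()
print(sum(shape == "square" for shape, _ in block) / len(block))  # ~0.20
```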
Memory & Cognition, 1975
The Whorf-Sapir hypothesis has raised considerable controversy in the literatures of psychology and anthropology. Several misconceptions of the hypothesis are reviewed, and the hypothesis was experimentally supported in a visual reproduction paradigm. Subjects were first given label training for a set of figures, and were then asked to recall by drawing the shapes. Training with categorized labels resulted in a 25% improvement in recall when compared to a condition with nonword (paralog) labels. Even stronger evidence of linguistic influence on visual memory was obtained by examining the order of recall. The conceptual relationships among labels strongly influenced the sequence of reproductions. Does language in any way influence the memory of visual stimuli? This question is directly related to the Whorf-Sapir (Whorfian) hypothesis, which suggests that language is a powerful determinant of perception and thought. The hypothesis has never been clearly or singularly expressed, and so exists in several versions. A common distinction is made between "weak" and "strong" forms of the Whorfian hypothesis (Funkhouser, 1968; Slobin, 1971). The strong form, which holds that aspects of language can determine thought, is routinely rejected by experimental psychologists. The weak form suggests that "One is not fully a prisoner of one's language" but that language "can predispose people to think or act in one way rather than another" (Slobin, 1971, p. 122). Actually, the distinction is quite misleading, and probably stands as an excellent example of the Whorfian hypothesis in action. Psychology has, for the most part, adopted a probabilistic view of behavior; consequently, to say that language merely influences behavior is only to assert a scientific axiom. There is nothing necessarily "weak" about a variable which influences but does not strictly determine behavior, since influence is the strongest form of causality allowed to our science. It is interesting that labeling the hypothesis as "weak" seems to have led to a ready acceptance, but also a consequent devaluation of its importance, e.g., Slobin, 1971. While it is not reasonable for psychologists to distinguish the strong and weak forms of the Whorfian hypothesis, it is possible to consider the strength and extent of the linguistic influence. Investigations of linguistic influence on our sensory-perceptual experience are few and their findings are not compelling. Lenneberg (1967, p. 348) reports virtually no effect of different linguistic conventions on the ability to discriminate color hues. More promising support for the hypothesis ...
"How can perception be altered by language?" is the fundamental question of this article. Indeed, various studies have pointed out the influence of colour-related knowledge on object and colour perception, evoked by linguistic stimuli. Here the relevance of the simulationist approach is assumed in order to explain this influence, whereby the understanding of colour-related words or sentences involves a process of colour simulation that is supported by a neuronal network partially similar to the network involved in colour perception. Consequently, colour-related knowledge and colour perception can interact through a process of pattern interference. In support of this idea, studies are discussed showing priming effects between colour simulation and colour perception, but two limitations are also raised. Firstly, these works all used between-category colour discrimination tasks that allow the intervention of lexical processes that can also explain priming. Secondly, these works control the congruency link between prime and target at the level of 'colour category', and no demonstration is made of an influence at the level of specific hues. Consequently, the simulationist view of language/perception interactions seems an interesting way of thinking, but more experiments are needed in order to overcome these limitations.
Journal of Memory and Language, 2007
Researchers in psycholinguistics are increasingly interested in the question of how linguistic and visual information are integrated during language processing. In part, this trend is attributable to the use of the so-called "visual world paradigm" in psycholinguistics, in which participants look at and sometimes manipulate objects in a visual world as they listen to spoken utterances or generate utterances of their own. In this introductory article to the Special Issue on Language-Vision Interactions, we briefly describe the history of attempts to look at the integration of language and vision, and we preview the articles appearing in the special issue. From those articles, it is clear that recent work has dramatically expanded our understanding of this important question, a trend that will only accelerate as theoretical and methodological advances continue to be made.
Categorical perception (CP) facilitates the discrimination of stimuli belonging to different categories relative to those from the same category. Effects of CP on the discrimination of color and shape have been attributed to the top-down modulation of visual perception by the left-lateralized language processes. We used a divided visual field (DVF) search paradigm to investigate the prospective effects of CP on face identity and gender processing. Consistent with visual processing of face identity in the right hemisphere, we found CP facilitated perception only in the left visual field (LVF). In contrast, and consistent with language-induced CP, we observed a between-category advantage for processing face gender only in the right visual field (RVF). Taken together, our results suggest that language-induced CP plays a role in the category-based visual processing of faces by the left hemisphere, but face familiarity processing might be dependent on different, identity-specialized netw...