Papers by Sayuri Hayakawa

Journal of Neurolinguistics, 2019
When considering how particular features of an organism develop, it is often intuitive to turn to examples of phylogenetic adaptation, or evolutionary change over time. The rapid darkening of peppered moths in industrial-era London provides a compelling case, as light-colored moths became easier for predators to spot in a city increasingly blackened by soot. Similarly, it is intuitive that the human brain evolved to allow for functions that would have benefited the survival of our ancestors, such as the ability to see and coordinate movement. What is perhaps less obvious is how the brain carries out functions that are too new to have been selected for during human evolution, such as precise mathematics and language. In their article "Neuroemergentism: A framework for studying cognition and the brain," Hernandez et al. (in press) review evidence showing that cognitive functions and their underlying neural bases can be best understood by considering both the phylogenetic evolution of the brain and ontogenetic development over an individual's lifespan, to account for the impact of experience. The authors consider the theories of Neuronal Recycling, Neural Reuse, and Language as Shaped by the Brain, and explain that skills such as reading and numerical processing are made possible by recruiting existing, older neural structures for new functions. They further argue that a comprehensive theory should include consideration of developmental changes over time, and introduce Neuroemergentism as a potential unifying account that seeks to explain how pre-existing elements shaped through evolution are repurposed to meet an organism's developmental needs.

Journal of Speech, Language, and Hearing Research, 2018
Purpose
Understanding speech often involves processing input from multiple modalities. The availability of visual information may make auditory input less critical for comprehension. This study examines whether the auditory system is sensitive to the presence of complementary sources of input when exerting top-down control over the amplification of speech stimuli.
Method
Auditory gain in the cochlea was assessed by monitoring spontaneous otoacoustic emissions (SOAEs), which are by-products of the amplification process. SOAEs were recorded while 32 participants (23 women, 9 men; M age = 21.13 years) identified speech sounds such as “ba” and “ga.” The speech sounds were presented either alone or with complementary visual input, as well as in quiet or with 6-talker babble.
Results
Analyses revealed a greater reduction in the amplification of auditory stimuli presented in noise compared with quiet. This reduced amplification may aid in the perception of speech by improving the signal-to-noise ratio. Critically, there was a greater reduction in amplification when speech sounds were presented bimodally with visual information relative to when they were presented unimodally. This effect was evidenced by greater changes in SOAE levels from baseline to stimulus presentation in audiovisual trials relative to audio-only trials.
Conclusions
The results suggest that even the earliest stages of speech comprehension are modulated by top-down influences, resulting in changes to SOAEs depending on the presence of bimodal or unimodal input. Neural processes responsible for changes in cochlear function are sensitive to redundancy across auditory and visual input channels and coordinate activity to maximize efficiency in the auditory periphery.
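Because both SOAE studies listed here rest on this baseline-to-stimulus comparison, a small sketch may help make it concrete. The Python snippet below shows one way a change in SOAE level could be computed from an ear-canal recording; the sampling rate, emission frequency, analysis bandwidth, and window boundaries are illustrative assumptions, not the authors' actual parameters or pipeline.

```python
# A minimal sketch (not the authors' analysis pipeline) of quantifying the
# change in SOAE level from a baseline window to a stimulus window. The
# sampling rate, emission frequency, bandwidth, and window times below are
# illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 44100          # assumed ear-canal microphone sampling rate (Hz)
SOAE_FREQ = 1500.0  # hypothetical emission frequency for one participant (Hz)

def soae_level_db(signal, fs=FS, freq=SOAE_FREQ, bw=50.0):
    """Estimate SOAE level (dB) as the power near the emission frequency."""
    f, psd = welch(signal, fs=fs, nperseg=8192)
    band = (f >= freq - bw) & (f <= freq + bw)
    return 10 * np.log10(np.trapz(psd[band], f[band]))

def soae_change_db(recording, fs=FS, baseline_s=(0.0, 1.0), stim_s=(1.0, 2.0)):
    """SOAE level change from a baseline window to a stimulus window.
    More negative values indicate stronger suppression of cochlear
    amplification, as reported for the audiovisual trials above."""
    b0, b1 = (int(t * fs) for t in baseline_s)
    s0, s1 = (int(t * fs) for t in stim_s)
    return soae_level_db(recording[s0:s1]) - soae_level_db(recording[b0:b1])
```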

Brain Sciences, 2018
Can experience change perception? Here, we examine whether language experience shapes the way individuals process auditory and visual information. We used the McGurk effect—the discovery that when people hear a speech sound (e.g., “ba”) and see a conflicting lip movement (e.g., “ga”), they recognize it as a completely new sound (e.g., “da”). This finding that the brain fuses input across auditory and visual modalities demonstrates that what we hear is profoundly influenced by what we see. We find that cross-modal integration is affected by language background, with bilinguals experiencing the McGurk effect more than monolinguals. This increased reliance on the visual channel is not due to decreased language proficiency, as the effect was observed even among highly proficient bilinguals. Instead, we propose that the challenges of learning and monitoring multiple languages have lasting consequences for how individuals process auditory and visual information.

Frontiers in Neuroscience, 2018
Auditory sensation is often thought of as a bottom-up process, yet the brain exerts top-down control to affect how and what we hear. We report the discovery that the magnitude of top-down influence varies across individuals as a result of differences in linguistic background and executive function. Participants were 32 normal-hearing individuals (23 female) varying in language background (11 English monolinguals, 10 Korean-English late bilinguals, and 11 Korean-English early bilinguals), as well as cognitive abilities (working memory, cognitive control). To assess efferent control over inner ear function, participants were presented with speech sounds (e.g., /ba/, /pa/) in one ear while spontaneous otoacoustic emissions (SOAEs) were measured in the contralateral ear. SOAEs are associated with the amplification of sound in the cochlea and can be used as an index of top-down efferent activity. Individuals with bilingual experience and those with better cognitive control experienced larger reductions in the amplitude of SOAEs in response to speech stimuli, likely as a result of greater efferent suppression of amplification in the cochlea. This suppression may aid in the critical task of speech perception by minimizing the disruptive effects of noise. In contrast, individuals with better working memory exert less control over the cochlea, possibly due to a greater capacity to process complex stimuli at later stages. These findings demonstrate that even the peripheral mechanics of auditory perception are shaped by top-down cognitive and linguistic influences.
Using a Foreign Language Changes Our Choices
Trends in Cognitive Sciences, 2016
A growing literature demonstrates that using a foreign language affects choice. This is surprising because if people understand their options, choice should be language independent. Here, we review the impact of using a foreign language on risk, inference, and morality, and discuss potential explanations, including reduced emotion, psychological distance, and increased deliberation.
Killing the Fat Man or el Hombre Gordo: Why Are People More Utilitarian in a Foreign Language?
PsycEXTRA Dataset, 2014
Adaptation-induced blindness
Journal of Vision, 2010
It is well known that prolonged observation of a dynamic visual pattern raises the contrast threshold for a subsequently presented static pattern. We found that if the post-adaptation test was presented gradually, so that its onset transient was weak, the test pattern was undetectable even at high contrast. Although the smooth-onset patterns were invisible, they caused apparent shifts in the orientation and contrast of neighboring stimuli, indicating implicit processing of the target features. However, this strong aftereffect was not obtained if the target grating drifted rapidly or appeared abruptly. These results suggest that when human observers become less sensitive to transients in stimuli as a result of dynamic adaptation, they cannot consciously perceive sluggish stimuli containing weak transients. This is consistent with the notion that the visual system cannot prompt conscious awareness of a single stimulus unless triggered by sufficiently transient or temporally salient signals.
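The critical onset manipulation can be made concrete with a short sketch. The Python snippet below generates the per-frame contrast envelope for a gradually ramped test pattern versus an abruptly onset one; the frame rate, duration, and raised-cosine ramp shape are assumptions chosen for illustration, not the published stimulus parameters.

```python
# Illustrative contrast envelopes for the onset manipulation described above.
# A raised-cosine ramp yields a weak onset transient; a step onset yields a
# strong one. Frame rate, duration, and ramp shape are assumptions.
import numpy as np

def contrast_envelope(duration_s, ramp_s, frame_rate=60, abrupt=False):
    """Per-frame contrast (0-1) for a test grating."""
    t = np.arange(int(duration_s * frame_rate)) / frame_rate
    if abrupt:
        return np.ones_like(t)  # full contrast from the first frame
    env = np.ones_like(t)
    rising = t < ramp_s
    # Smooth raised-cosine onset: contrast climbs gradually, so the
    # stimulus contains little transient energy when it appears.
    env[rising] = 0.5 * (1.0 - np.cos(np.pi * t[rising] / ramp_s))
    return env

smooth = contrast_envelope(2.0, ramp_s=1.0)               # weak transient
abrupt = contrast_envelope(2.0, ramp_s=1.0, abrupt=True)  # strong transient
```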
Though moral intuitions and choices seem fundamental to our core being, there is surprising new evidence that people resolve moral dilemmas differently when they consider them in a foreign language (Cipolletti et al., 2016; Costa et al., 2014a; Geipel et al., 2015): People are more willing to sacrifice 1 person to save 5 when they use a foreign language compared with when they use their native tongue. Our findings show that the phenomenon is robust across various contexts and that multiple factors affect it, such as the severity of the negative consequences associated with saving the larger group. This has also allowed us to better describe the phenomenon and investigate potential explanations. Together, our results suggest that the foreign-language effect is most likely attributable to an increase in psychological distance and a reduction in emotional response.
Thinking in a Foreign Tongue Reduces Decision Biases
PsycEXTRA Dataset, 2000
PLoS ONE, 2014
Should you sacrifice one man to save five? Whatever your answer, it should not depend on whether you were asked the question in your native language or a foreign tongue so long as you understood the problem. And yet here we report evidence that people using a foreign language make substantially more utilitarian decisions when faced with such moral dilemmas. We argue that this stems from the reduced emotional response elicited by the foreign language, consequently reducing the impact of intuitive emotional concerns. In general, we suggest that the increased psychological distance of using a foreign language induces utilitarianism. This shows that moral judgments can be heavily affected by a property orthogonal to moral principles and, importantly, one that is relevant to hundreds of millions of individuals on a daily basis.

The Foreign-Language Effect: Thinking in a Foreign Tongue Reduces Decision Biases
Psychological Science, 2012
Would you make the same decisions in a foreign language as you would in your native tongue? It may be intuitive that people would make the same choices regardless of the language they are using, or that the difficulty of using a foreign language would make decisions less systematic. We discovered, however, that the opposite is true: Using a foreign language reduces decision-making biases. Four experiments show that the framing effect disappears when choices are presented in a foreign tongue. Whereas people were risk averse for gains and risk seeking for losses when choices were presented in their native tongue, they were not influenced by this framing manipulation in a foreign language. Two additional experiments show that using a foreign language reduces loss aversion, increasing the acceptance of both hypothetical and real bets with positive expected value. We propose that these effects arise because a foreign language provides greater cognitive and emotional distance than a native tongue does.
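The loss-aversion result lends itself to a worked example. The sketch below applies a simplified prospect-theory-style decision rule in which losses are weighted by a factor lambda; the lambda values are illustrative, not estimates from the paper, but they show how a bet with positive expected value can still be rejected when losses loom larger than gains.

```python
# A simplified loss-aversion decision rule (illustrative; the lambda values
# are assumptions, not estimates from the paper).
def bet_is_accepted(win, lose, p_win=0.5, lam=2.0):
    """Accept a gamble if its loss-weighted subjective value is positive.
    lam > 1 means losses loom larger than equal-sized gains."""
    return p_win * win - (1 - p_win) * lam * lose > 0

# A coin flip paying $12 on a win against a $10 loss has positive expected
# value (0.5 * 12 - 0.5 * 10 = +$1), yet a loss-averse chooser rejects it:
print(bet_is_accepted(12, 10, lam=2.0))  # False: 0.5*12 - 0.5*2*10 = -4
print(bet_is_accepted(12, 10, lam=1.0))  # True: with reduced loss aversion
```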