2021, Virtual Reality
Prior research reveals that spatial navigation skills rely mostly on visual sensory abilities, but the study of how spatial processing operates in the absence of visual information is still incomplete. We therefore developed a spatial navigation task in virtual reality using auditory cues to study navigational strategies in sighted individuals. Twenty healthy adult participants were recruited. The task consisted of a VR scene in which participants were asked to localize a sound source and move to the target without visual information (i.e., blindfolded). Task difficulty was manipulated by route length. Participants first completed a study phase, in which the objective was to move to the sound source and then return to the starting point. In a test phase, participants performed the same task without the sound source but with auditory cues from obstacles, to test spatial learning. This manipulation allowed us to assess navigational strategies: local navigation in the first phase and wayfinding in the second. Performance was assessed from behavioral measures of execution time, obstacle collisions, and prompts during task execution. These variables were correlated with established neuropsychological instruments for global cognition and memory abilities. The results revealed a relationship between executive functioning and task performance. Global performance was better in wayfinding, which involves spatial learning, while increases in task difficulty affected performance (through execution time) only for local navigation. These data reveal the importance of auditory information from spatial sound cues for spatial learning and navigation in a known environment.
2009
For individuals who are blind, navigation requires the construction of a cognitive spatial map of one's surrounding environment. Novel technological approaches are being developed to teach and enhance this cognitive skill. Here, we discuss user-centered, audio-based methods of virtual navigation implemented through computer gaming. The immersive, engaging, and heavily interactive nature of the software allows for the generation of mental spatial representations that can be transferred to real-world navigation tasks and, furthermore, promotes creativity and problem-solving skills. Navigation with virtual environments also represents a tractable testing platform to collect quantifiable metrics and monitor learning. Combining this technology with neuroscience research can be used to investigate brain mechanisms related to sensory processing in the absence of vision.
Journal of Experimental Psychology: Applied, 2006
A vibrotactile N-back task was used to generate cognitive load while participants were guided along virtual paths without vision. As participants stepped in place, they moved along a virtual path of linear segments. Information was provided en route about the direction of the next turning point, by spatial language ("left," "right," or "straight") or virtual sound (i.e., the perceived azimuth of the sound indicated the target direction). The authors hypothesized that virtual sound, being processed at direct perceptual levels, would have lower load than even simple language commands, which require cognitive mediation. As predicted, whereas the guidance modes did not differ significantly in the no-load condition, participants showed shorter distance traveled and less time to complete a path when performing the N-back task while navigating with virtual sound as guidance. Virtual sound also produced better N-back performance than spatial language. By indicating the superiority of virtual sound for guidance when cognitive load is present, as is characteristic of everyday navigation, these results have implications for guidance systems for the visually impaired and others.
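The "virtual sound" guidance described above conveys a target direction through the perceived azimuth of a sound. As an illustration only (this is a hypothetical sketch, not code from the paper, and real spatial audio rendering would use HRTF filtering), the core idea can be shown as a mapping from target azimuth to constant-power stereo gains plus a Woodworth-model estimate of the interaural time difference:

```python
import math

def azimuth_to_stereo(azimuth_deg):
    """Map a target azimuth (degrees; -90 = hard left, +90 = hard right)
    to constant-power left/right gains and an interaural time difference
    (ITD) estimate from the Woodworth spherical-head approximation."""
    # Constant-power pan: azimuth -90..+90 maps to pan angle 0..pi/2,
    # so total radiated power (left^2 + right^2) stays constant.
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2.0)
    left_gain = math.cos(theta)
    right_gain = math.sin(theta)
    # Woodworth ITD: head radius ~0.0875 m, speed of sound ~343 m/s.
    a, c = 0.0875, 343.0
    az = math.radians(azimuth_deg)
    itd_s = (a / c) * (az + math.sin(az))
    return left_gain, right_gain, itd_s
```

A listener hears a sound panned this way as coming roughly from the target bearing, so steering toward the loudest/leading ear reproduces the guidance principle the study compares against spoken "left"/"right" commands.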
Sensors
Spatial cognition is a daily life ability, developed in order to understand and interact with our environment. Although all the senses are involved in elaborating a mental representation of space, the lack of vision makes this more difficult, especially because of the importance of peripheral information in updating the relative positions of surrounding landmarks when one is moving. Spatial audio technology has long been used for studies of human perception, particularly in the area of auditory source localisation. The ability to reproduce individual sounds at desired positions, or complex spatial audio scenes, without the need to manipulate physical devices has provided researchers with many benefits. We present a review of several studies employing the power of spatial audio virtual reality for research in spatial cognition with blind individuals. These include studies investigating simple spatial configurations, architectural navigation, reaching to sounds, and sound design...
2005
A navigation test was carried out in a spatially immersive virtual environment. The test was a gamelike experience in which the subjects' task was to find as many gates as possible while navigating through a track guided by auditory and/or visual cues. The results are presented as a function of the number of gates found, searching times, and normalized path lengths. Audiovisual navigation was clearly the most efficient, visual navigation was second, and auditory navigation was the least efficient.
2000
This study presents the combined efforts of three research groups toward the investigation of a cognitive issue through the development and implementation of a general-purpose VR environment that incorporates a high-quality virtual 3D audio interface. The psychological aspects of the study concern mechanisms involved in spatial cognition, in particular determining how a verbal description of an environment...
Displays, 2014
It has been shown that multisensory presentation can improve perception, attention, and object memory compared with unisensory presentation. Consequently, we expect that multisensory presentation of landmarks can improve spatial memory and navigation. In this study we tested the effect of visual, auditory and combined landmark presentations in virtual mazes on spatial memory and spatial navigation. Nineteen participants explored four different virtual mazes consisting of nodes with landmarks and corridors connecting them. Each maze was explored for 90 seconds. After each exploration, participants performed the following tasks in fixed order: 1) draw a map of the maze, 2) recall adjacent landmarks for three given landmarks, 3) place all landmarks on the map of the maze, and 4) find their way through the maze to locate five given landmarks in fixed order. Our study shows significant effects of multisensory versus unisensory landmarks for the maze drawing task, the adjacency task, and the wayfinding task. Our results suggest that audiovisual landmark presentations improve spatial memory and spatial navigation performance in virtual environments.
PLoS ONE, 2021
Blind individuals often report difficulties navigating and detecting objects placed outside their peri-personal space. Although classical sensory substitution devices could be helpful in this respect, these devices often produce a complex signal that requires intensive training to analyze. New devices that provide a less complex output signal are therefore needed. Here, we evaluate a smartphone-based sensory substitution device that offers navigation guidance based on strictly spatial cues in the form of horizontally spatialized sounds. The system uses multiple sensors either to detect obstacles at a distance directly in front of the user or to create a 3D map of the environment (detection and avoidance mode, respectively), and informs the user with auditory feedback. We tested 12 early blind, 11 late blind and 24 blindfolded-sighted participants for their ability to detect obstacles and to navigate in an obstacle course. The three groups did not differ in the number of objects detec...
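The abstract does not specify how the device converts obstacle distance into auditory feedback. As a purely hypothetical sketch of the general sensory-substitution principle (the parameter ranges and function name are illustrative assumptions, not the device's actual mapping), closer obstacles might be signalled with higher-pitched, faster beeps:

```python
def distance_to_beep(distance_m, max_range_m=4.0):
    """Map obstacle distance to beep parameters.

    Returns (frequency_hz, repetition_rate_hz), or None when the
    obstacle is beyond the sensing range (silence). Closer obstacles
    yield a higher pitch and a faster beep rate. All ranges are
    illustrative, not taken from the evaluated device.
    """
    if distance_m >= max_range_m:
        return None  # out of range: no feedback
    proximity = 1.0 - distance_m / max_range_m  # 0.0 (far) .. 1.0 (close)
    freq_hz = 300.0 + 900.0 * proximity        # 300 Hz far .. 1200 Hz close
    rate_hz = 1.0 + 7.0 * proximity            # 1 beep/s far .. 8 beeps/s close
    return freq_hz, rate_hz
```

Keeping the output to two intuitive dimensions (pitch and rate) reflects the paper's motivation for a less complex signal than classical sensory substitution devices, which demand intensive training to interpret.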
Spatial navigation is a multi-faceted behaviour drawing on many different aspects of cognition. Visuospatial abilities, such as spatial working memory and mental rotation, in particular, may be key factors. A range of tests have been developed to assess visuospatial processing and memory, but how such tests relate to navigation ability remains unclear. This understanding is important to advance tests of navigation for disease monitoring in Alzheimer’s Disease, where disorientation is an early symptom. Here, we report the use of an established mobile gaming app, Sea Hero Quest, as a measure of navigation ability. We used three separate tests of navigation embedded in the game: wayfinding, path integration and spatial memory in a radial arm maze. In the same participants, we also collected measures of mental rotation (Mental Rotation Test), visuospatial processing (Design Organization Test) and visuospatial working memory (Digital Corsi). We found few strong correlations across our me...
Navigating an unknown environment is a difficult task for visually impaired people, as they must rely primarily on haptic or auditory cues to compensate for the lack of sight. However, their orientation and mobility skills are well developed, although the process of building a solid spatial cognitive map of the environment can be long and sinuous. The main question that emerges is whether they are able to virtually learn the architecture of an environment by adapting to the specific auditory and haptic cues that define the setting, and to transfer their knowledge into real-world, physical contexts. As spatial audio technology is capable of rendering individual sounds at desired locations and of designing complex auditory scenarios through binaural sound synthesis, the purpose of virtual auditory environments has extended from localization tests to investigating higher-level cognitive abilities. This paper aims to investigate and discuss the most relevant studies and experiments concerning the ability of visually impaired people to adapt to novel environments, to successfully navigate them virtually by using auditory or haptic cues, and to construct a mental representation of the surrounding space. The motivation underlying our research is twofold: first, to contribute to documenting the way in which visually impaired people spatially perceive the environment and, second, to provide insights for the development of a virtual reality navigational device based on auditory or haptic (vibrotactile and kinesthetic) events. Taking into account the findings of our research on the ability of blind individuals to carry out effective cognitive tasks and spatial mental representations, we also discuss future perspectives for the development of an assistive VR system that would facilitate navigation, orientation and overall spatial awareness.