2019, Proceedings of the 17th International Conference on Virtual-Reality Continuum and its Applications in Industry
Common natural walking techniques for navigating in virtual environments feature constraints that make them difficult to use in cramped home environments. Indeed, natural walking requires unobstructed, open space that lets users roam around without fear of stumbling over obstacles while immersed in a virtual world. In this work, we propose a new virtual locomotion technique, CWIP-AVR, that allows people to take advantage of the available physical space and empowers them to use natural walking to navigate in the virtual world. To inform users about real-world hazards, our approach uses augmented virtual reality visual indicators. A user evaluation suggests that CWIP-AVR allows people to navigate safely, while switching between locomotion modes flexibly and maintaining an adequate degree of immersion.
CCS CONCEPTS: • Human-centered computing → Computer supported cooperative work; Mixed / augmented reality.
arXiv (Cornell University), 2019
New technologies allow ordinary people to access Virtual Reality at affordable prices in their homes. One of the most important tasks when interacting with immersive Virtual Reality is to navigate the virtual environments (VEs). Arguably, the best methods to accomplish this use direct control interfaces. Among those, natural walking (NW) makes for an enjoyable user experience. However, common techniques to support direct control interfaces in VEs feature constraints that make them difficult to use in cramped home environments. Indeed, NW requires unobstructed, open space to allow users to roam around without fear of stumbling over obstacles while immersed in a virtual world. To approach this problem, we propose a new virtual locomotion technique, which we call Combined Walking in Place (CWIP). CWIP allows people to take advantage of the available physical space and empowers them to use NW to navigate in the virtual world. For longer distances, we adopt Walking in Place (WIP) to enable them to move in the virtual world beyond the confines of a cramped real room. However, roaming in an immersive alternate reality while moving in the confines of a cluttered environment can lead people to stumble and fall. To address these problems, we developed a technique called Augmented Virtual Reality (AVR) to inform users about real-world hazards, such as chairs, drawers, and walls, via proxies and signs placed in the virtual world. We thus propose Combined Walking in Place in Augmented Virtual Reality (CWIP-AVR) as a way to safely explore VR in the cramped confines of your own home. To our knowledge, this is the first approach to combine different locomotion modalities in a safe manner. We assessed its effectiveness in a user study with 20 participants to validate their ability to navigate a virtual world while walking in a confined and cluttered real space.
Our results show that CWIP-AVR allows people to navigate VR safely, while switching between locomotion modes flexibly and maintaining a good degree of immersion.
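The core CWIP idea above, natural walking inside the free tracked area and walking-in-place beyond it, can be sketched as a simple mode selector. This is a minimal sketch; the rectangular safe region and the two-mode policy are illustrative assumptions, not the paper's exact transition logic:

```python
def locomotion_mode(user_pos, safe_min, safe_max):
    """Pick a locomotion mode from the user's tracked position.

    Assumed policy: natural walking inside the obstacle-free rectangle
    (safe_min..safe_max), walking-in-place once the user reaches its edge.
    """
    x, y = user_pos
    inside = safe_min[0] <= x <= safe_max[0] and safe_min[1] <= y <= safe_max[1]
    return "natural_walking" if inside else "walking_in_place"
```

In a real system the boundary test would be driven by the headset's tracked play area, and AVR-style proxies would be shown as the user approaches the edge.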
Presence, 1999
This paper presents both an analysis of requirements for user control over simulated locomotion and a new control technique designed to meet these requirements. The goal is to allow the user to move through virtual environments in as similar a manner as possible to walking through the real world. We approach this problem by examining the interrelationships between motion control and the other actions people use to act, sense, and react to their environment. If the interactions between control actions and sensory feedback can be made comparable to those of actions in the real world, then there is hope for constructing an effective new technique. Candidate solutions are reviewed once the analysis is developed. This analysis leads to a promising new design for a sensor-based virtual locomotion technique called Gaiter. The new control allows users to direct their movement through virtual environments by stepping in place. The movement of a person's legs is sensed, and in-place walking is treated as a gesture indicating the user intends to take a virtual step. More specifically, the movement of the user's legs determines the direction, extent, and timing of their movement through virtual environments. Tying virtual locomotion to leg motion allows a person to step in any direction and control the stride length and cadence of his virtual steps. The user can walk straight, turn in place, and turn while advancing. Motion is expressed in a body-centric coordinate system similar to that of actual stepping. The system can discriminate between gestural and actual steps, so both types of steps can be intermixed.
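A minimal sketch of treating in-place stepping as a step gesture, in the spirit described above: a leg lift above a threshold followed by a drop back down counts as one virtual step. The single knee-height signal and the threshold value are illustrative assumptions; Gaiter's actual sensing and gesture model is richer:

```python
def detect_steps(knee_heights, lift_thresh=0.08):
    """Count in-place steps in a stream of knee-height samples (metres).

    A step gesture is a rise above lift_thresh followed by a return below it.
    The 0.08 m threshold is an assumed value, not taken from the paper.
    """
    steps = 0
    lifted = False
    for h in knee_heights:
        if not lifted and h > lift_thresh:
            lifted = True          # leg lift begins a candidate step
        elif lifted and h <= lift_thresh:
            lifted = False         # foot returns: commit one virtual step
            steps += 1
    return steps
```

A full implementation would also estimate direction and extent per step, as the abstract notes, rather than just counting.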
diuf.unifr.ch
This paper presents a new locomotion interface that provides users with the ability to engage in a life-like walking experience by stepping in place. Stepping actions are performed on top of a flat platform with an embedded grid of switch sensors that detects footfall pressure. Based on data received from the sensors, the system can compute different variables that represent the user's walking behavior, such as walking direction, walking speed, standstill, jumping, and walking. The overall platform status is scanned at a rate of 100 Hz, which allows real-time visual feedback in reaction to user actions. The proposed system is portable and easy to integrate with major virtual environment systems with large projection features, such as CAVE and DOME systems. The overall weight of the Walking-Pad is less than 5 kg, and it can be connected to any computer via a USB port, which makes it controllable even from a portable computer.
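One plausible way to interpret such a switch-sensor grid is to track the centroid of active sensors on each 100 Hz scan and derive a heading from consecutive footfall centroids. This is a sketch under assumed inputs; the Walking-Pad's actual algorithms are not specified here:

```python
import math

def center_of_pressure(grid):
    """Centroid (row, col) of active switch sensors in a 2D 0/1 grid, or None."""
    active = [(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v]
    if not active:
        return None  # standstill or mid-air: no sensor pressed
    n = len(active)
    return (sum(r for r, _ in active) / n, sum(c for _, c in active) / n)

def walking_direction(prev_cop, cop):
    """Heading angle (radians) between two consecutive footfall centroids."""
    dr, dc = cop[0] - prev_cop[0], cop[1] - prev_cop[1]
    return math.atan2(dr, dc)
```

Walking speed could similarly be estimated from the time between footfall events across scans.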
ACM Transactions on Applied Perception, 2010
For us humans, walking is our most natural way of moving through the world. One of the major challenges in present research on navigation in virtual reality is to enable users to physically walk through virtual environments. Although treadmills, in principle, allow users to walk for extended periods of time through large virtual environments, existing setups largely fail to produce a truly immersive sense of navigation. Partially, this is because of inadequate control of treadmill speed as a function of walking behavior. Here, we present a new control algorithm that allows users to walk naturally on a treadmill, including starting to walk from standstill, stopping, and varying walking speed. The treadmill speed control consists of a feedback loop based on the measured user position relative to a given reference position, plus a feed-forward term based on online estimation of the user's walking velocity. The purpose of this design is to make the treadmill compensate fully for any persistent walker motion, while keeping the accelerations exerted on the user as low as possible.
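The controller described above, a feedback term on the user's position error plus a feed-forward term from the estimated walking velocity, might be sketched as follows. The gain and acceleration-limit values are assumptions for illustration, not the paper's tuned parameters:

```python
def treadmill_speed(x_user, x_ref, v_est, kp=0.8):
    """Belt speed command: position feedback plus velocity feed-forward.

    x_user: measured user position along the belt (m)
    x_ref:  reference position, e.g. the belt centre (m)
    v_est:  online estimate of the user's walking velocity (m/s)
    kp:     feedback gain (assumed value)
    """
    return kp * (x_user - x_ref) + v_est

def limited(speed, prev_speed, dt, a_max=1.0):
    """Clamp belt acceleration to a_max to keep forces on the walker small."""
    dv = max(-a_max * dt, min(a_max * dt, speed - prev_speed))
    return prev_speed + dv
```

The feed-forward term lets the belt match persistent walking, while the feedback term only has to correct drift from the reference position, keeping accelerations felt by the user low.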
2004
In CAVE-like environments human locomotion is significantly restricted due to physical space and configural constraints. Interaction techniques based upon stepping in place have been suggested as a way to simulate long range locomotion. We describe a new method for step detection and estimation of forward walking speed and direction in an immersive virtual environment. To calibrate our system and to
… and interactive …, 1999
A prior study indicated that naive subjects in an immersive virtual environment experience a higher subjective sense of presence when they locomote by walking-in-place (virtual walking) than when they push-button-fly (along the floor plane). We replicated that study, adding real walking as a third condition.
ACM Transactions on Applied Perception
Despite many recent developments in virtual reality, an effective locomotion interface which allows for normal walking through large virtual environments was until recently still lacking. Here, we describe the new CyberWalk omnidirectional treadmill system, which makes it possible for users to walk endlessly in any direction, while never leaving the confines of the limited walking surface. The treadmill system improves on previous designs, both in its mechanical features and in the control system employed to keep users close to the center of the treadmill. As a result, users are able to start walking, vary their walking speed and direction, and stop walking as they would on a normal, stationary surface. The treadmill system was validated in two experiments, in which both the walking behavior and the performance in a basic spatial updating task were compared to that during normal overground walking. The results suggest that walking on the CyberWalk treadmill is very close to normal walking.
2007
This paper presents a human performance evaluation of a low-cost enactive locomotion interface, the Walking-PAD, that provides users with the ability to engage in a life-like walking experience in virtual environments (VEs) by stepping in place. Stepping actions are performed on top of a platform with an embedded grid of switch sensors that detects footfall pressure. Based on data received from the sensors, the system computes different variables that represent the user's walking behavior, such as walking direction and walking speed. Twelve human subjects were instructed to reach the exit of a virtual labyrinth as quickly as possible and memorize as much information as they could. Two navigation techniques were compared: a mouse-based technique and the Walking-PAD technique. Results revealed that more information was memorized when using the Walking-PAD.
We present an approach to redirect a user’s walking path by dynamically modifying the geometry of a virtual environment. This method allows real walking through environments that are much larger than the physical tracking area without requiring rotational or translational gains. We demonstrate this technique using a proof-of-concept example environment and explain the modifications at each stage of a walking path through the virtual world. We also discuss the potential advantages of this method and outline several open questions for future investigation.
Compared to real-world tasks, completing tasks in a virtual environment (VE) seldom involves the whole spectrum of skills the human body offers. User input in a VE is commonly accomplished through simple finger gestures, such as walking through a scene by simply pressing a button, even though this kind of interaction is not very suitable. To create a more intuitive and natural interaction, diverse projects tackle the problem of locomotion in VEs by trying to enable a natural walking movement, which is also supposed to increase the level of immersion. Existing solutions such as treadmills are still expensive and need additional fixation of the body. In this paper, we describe a simple and inexpensive way to build a useful locomotion interface using a conventional sports stepper and an Arduino. This device enables control in a VE by walking-in-place, without the need for any additional fixation gadgets. We conducted a user study with 10 participants to evaluate impressions of joy and ease of use, immersion, and reliability in comparison to other interfaces used for locomotion, such as the Wii Balance Board and a Wand Joystick. We found that the stepper was experienced as slightly better in terms of immersion and joy of use. Furthermore, we found that pressing buttons on a joystick was perceived as more reliable.
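A stepper-based WIP device like this ultimately has to map step events to virtual speed. One simple mapping converts recent step timestamps into cadence times an assumed stride length (both the 0.7 m stride and the timestamp-window approach are illustrative assumptions, not the paper's method):

```python
def wip_speed(step_times, stride=0.7):
    """Virtual walking speed from recent step timestamps (seconds).

    Speed = cadence (steps/s over the window) * assumed stride length (m).
    Returns 0.0 when fewer than two steps have been seen.
    """
    if len(step_times) < 2:
        return 0.0
    cadence = (len(step_times) - 1) / (step_times[-1] - step_times[0])
    return cadence * stride
```

On the hardware side, the Arduino would timestamp each stepper press and stream the events to the host, which applies a mapping like this every frame.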