2008, Lecture Notes in Computer Science
We present a platform that offers designers flexibility in device design, fast prototyping, and integration of new devices into a mixed reality infrastructure. Our solution is based on the integration of a commercial embedded system, the Qwerk, with the Virtual Reality Peripheral Network (VRPN), a network-transparent interface between applications and typical virtual reality (VR) devices. This solution creates a hardware and software layer between new devices and VR applications that facilitates development. We show three examples here. The first application targets a simple device for moving an object along an axis. The second consists of a device that changes the color of a virtual object based on temperature. Finally, we design, build, and test a new wireless navigation device, our "Light Gloves".
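As a hedged illustration of the application side of such an integration, the following minimal C++ sketch shows how a VR application could read a device like the paper's temperature sensor through VRPN's standard analog-channel interface. The device name "Temperature0@localhost" and the channel mapping are placeholder assumptions for illustration, not the authors' configuration.

```cpp
// Minimal VRPN client: reads analog channels published by a device
// server (e.g. a Qwerk-hosted temperature sensor) over the network.
#include <cstdio>
#include <vrpn_Analog.h>

// Called by VRPN whenever the remote device reports new channel values.
void VRPN_CALLBACK handle_analog(void*, const vrpn_ANALOGCB a)
{
    for (vrpn_int32 i = 0; i < a.num_channel; ++i)
        printf("channel %d = %f\n", i, a.channel[i]); // e.g. map to object color
}

int main()
{
    // "Temperature0@localhost" is a placeholder device name.
    vrpn_Analog_Remote analog("Temperature0@localhost");
    analog.register_change_handler(nullptr, handle_analog);

    for (;;) {              // application main loop
        analog.mainloop();  // process pending network messages
        vrpn_SleepMsecs(1);
    }
}
```

Because VRPN is network-transparent, the same client code runs unchanged whether the device server lives on the local machine or on the embedded board.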
Frontiers in virtual reality, 2022
If Mixed Reality applications are to become truly ubiquitous, they face the challenge of an ever-evolving set of hardware and software systems, each with their own standards and APIs, that need to work together and become part of the same shared environment (the application). A unified standard is unlikely, so we cannot rely on a single software development stack to incorporate all the necessary parts. Instead we need frameworks that are modular and flexible enough to be adapted to the needs of the application at hand and able to incorporate a wide range of setups for devices, services, etc. We identify a set of common questions that can be used to characterize and analyze Mixed Reality applications, and use these same questions to identify challenges as well as to present solutions in the form of three frameworks tackling the fields of tracking and inference (UbiTrack), interaction (Ubi-Interact), and visualization (UbiVis). Tracking and inference have been addressed for quite some time, interaction is a current topic with existing solutions, and visualization will be the focus of future work. We present several applications in development together with their future vision and explain how the frameworks help realize these and other potential apps.
2006
We have designed a wearable Mixed Reality (MR) framework that renders game-like 3D scenes in real time on see-through head-mounted displays (see-through HMDs) and localizes the user's position within a known wireless network area. Our equipment weighs less than 1 pound (0.45 kg). The information visualized on the mobile device can be sent on demand from a remote server and rendered on board in real time. We present our PDA-based platform as a valid alternative for wearable MR contexts with fewer mobility and encumbrance constraints: our approach eliminates the backpack with a laptop, GPS antenna, and heavy HMD typically required in these cases. A discussion of our results and user experiences with this handheld-based 3D rendering approach is presented as well.
2020
Virtual Reality is an immersive and powerful technology which is already changing computing, entertainment, education, and social networking. Modern VR headsets are capable of comfortably delivering high-resolution, high-framerate content and providing fully mobile motion tracking. Consumer VR systems typically consist of a tracked headset and two tracked hand controllers. However, the system format and technology implementation of commercial VR headsets introduce limitations in the user experience. In this project, we identify three specific interaction limitations present in modern VR and devise a hardware solution for each. The three issues we aim to improve are finger presence, two-handed rigid virtual object interactions, and locomotion. The first interaction limitation, finger presence, is the sense of movement control of the virtual hand’s fingers when in VR. Another issue arises when interacting with rigid two-handed virtual objects, as the user is typically using two separa...
2019 IEEE 5th World Forum on Internet of Things (WF-IoT)
Virtual Reality (VR) allows users to interact with intuitive environments to manipulate games, applications, and even other Internet of Things (IoT) devices. VR devices are offered by different vendors, supporting different development platforms and features. Many VR applications, however, do not support simultaneous use of VR devices from different vendors. The few applications that support this feature are not open to developers, with little information on their implementation and performance. This paper presents a VR application, built with Unity and the VRTK SDK alongside the appropriate VR hardware SDKs, that supports multiple VR hardware platforms. The application supports multiple users in a VR environment where they can interact with each other and with objects in the VR space. Other features, such as file transfer within the VR application, are also supported. Such an application is important for allowing developers to design future applications that support multiple devices regardless of vendor, creating convenient multiuser VR applications with fewer limitations.
2015 IEEE International Conference on Computer and Information Technology; Ubiquitous Computing and Communications; Dependable, Autonomic and Secure Computing; Pervasive Intelligence and Computing, 2015
Recent developments in VR, with the arrival of new head-mounted displays (HMDs) such as the Oculus Rift and Morpheus, have opened new challenges in the already active research field of Human-Computer Interaction (HCI) by exploring new means of communication supported by new hardware devices that respond to body movements and hand position. The paper explores hardware interactivity and VR HMDs through two games designed to use the latest Oculus Rift SDK with alternative methods of hardware communication. A usability evaluation study was conducted with 18 participants, and the results are presented and discussed.
International Journal of Online Engineering (iJOE), 2013
Virtual reality (VR) systems have the potential to alleviate existing constraints on various natural and social resources. Currently, real-time applications of VR systems are hampered by the tediousness of creating virtual environments. Furthermore, today's VR systems only stimulate the human senses of vision, hearing and, to some extent, touch, which prevents users from feeling fully immersed in the virtual environment. By integrating real physical devices with virtual environments, user interaction with such systems can be improved, and advanced technologies such as the MS Kinect system could be used to augment the environments themselves. While existing development platforms for VR systems are expensive, game engines provide a more efficient method for integrating VR with physical devices. In this paper, an efficient approach for integrating virtual environments and physical devices is presented. This approach employs modifications of games that are based ...
Lecture Notes in Computer Science, 2007
While software developers for desktop applications can rely on mouse and keyboard as standard input devices, developers of virtual reality (VR) and augmented reality (AR) applications usually have to deal with a large variety of individual interaction devices. Existing device abstraction layers provide a solution to this problem, but are usually limited to a specific set or type of input devices. In this paper we introduce DEVAL, an approach to a device abstraction layer for VR and AR applications. DEVAL is based on a device hierarchy that is not limited to input devices, but naturally extends to output devices.
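As a rough illustration of the idea (not DEVAL's actual class hierarchy, which the paper defines), a device abstraction layer whose hierarchy spans both input and output devices could look like the following C++ sketch; all names here are hypothetical:

```cpp
#include <memory>
#include <vector>

// Hypothetical classes illustrating a single device hierarchy that
// covers input *and* output devices; not DEVAL's real API.
class Device {
public:
    virtual ~Device() = default;
    virtual void update() = 0;  // poll hardware / push state once per frame
};

class InputDevice : public Device {
public:
    // Latest sampled values, e.g. one entry per axis or button.
    virtual std::vector<double> axes() const = 0;
};

class OutputDevice : public Device {
public:
    // Drive actuators (haptics, LEDs, ...) with application values.
    virtual void actuate(const std::vector<double>& values) = 0;
};

// The application iterates one polymorphic list, unaware of the
// concrete devices behind the abstraction layer.
void frame(std::vector<std::unique_ptr<Device>>& devices) {
    for (auto& d : devices) d->update();
}
```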
A distributed mixed-reality (MR) or virtual reality (VR) environment implies the cooperative engagement of a set of software and hardware resources. With advances in sensors and computer networks, we have seen an increase in the number of potential MR/VR applications that require large amounts of information from the real world collected through sensors (e.g. position and orientation tracking sensors). These sensors collect data from the real environment in real time at different locations, and a distributed environment connecting them must assure data distribution among collaborative sites at interactive speeds. With the advances in sensor technology, we envision that in future systems a significant amount of data will be collected from sensors and devices attached to the participating nodes. This paper proposes a new architecture for sensor-based interactive distributed MR/VR environments that falls between the atomistic peer-to-peer model and the traditional client-server model. Each node is autonomous and fully manages its resources and connectivity. The dynamic behavior of the nodes is dictated by the human participants who manipulate the sensors attached to these nodes.
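A hedged sketch of the hybrid model this abstract describes: each node owns its sensors like a small server, yet connects directly to peers like a P2P participant, with no central coordinator. All types and method names below are illustrative assumptions, not the paper's architecture:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// One sample from a locally attached sensor, e.g. position + orientation.
struct Sample {
    std::string sensor;
    double value[6];
};

class Node {
public:
    // A node fully manages its own sensors...
    void attachSensor(const std::string& name) { sensors_.push_back(name); }

    // ...and its own connectivity: peers subscribe directly to this node,
    // rather than through a central server.
    void subscribe(const std::string& peer,
                   std::function<void(const Sample&)> deliver) {
        peers_[peer] = std::move(deliver);
    }

    // Distribute a locally collected sample to all interested peers
    // at interactive rates.
    void publish(const Sample& s) {
        for (auto& [peer, deliver] : peers_) deliver(s);
    }

private:
    std::vector<std::string> sensors_;
    std::map<std::string, std::function<void(const Sample&)>> peers_;
};
```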
2019
Mixed Reality is an upcoming domain that focuses on making the user experience more interactive with Virtual Reality. The hand is the part of the body that allows humans to touch and feel different things, and the VR glove is the most efficient and newest tool for a user to interact with a virtual object. The VR glove essentially provides input to the virtual world, acting as a virtual object controlled by the user. Among the many applications of Mixed Reality, a game best demonstrates the use of a VR glove. An Arduino Uno connected to flex sensors and an accelerometer provides input to a Processing sketch, which acts as a translator to the game; the game components respond as if they were given keyboard input. This implementation of the VR glove can be further developed to interact dynamically with various virtual and augmented realities, so that the user feels as if he or she is inside the VR or AR.
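A minimal sketch of the Arduino side of such a pipeline (the pin assignments, baud rate, and serial message format are assumptions for illustration, not the authors' implementation):

```cpp
// Hypothetical Arduino Uno sketch: flex sensors on A0-A3 (one per
// finger, each in a voltage divider) and an analog accelerometer's
// X/Y outputs on A4-A5. Values are streamed over serial for a
// Processing sketch to translate into key events for the game.

const int FLEX_PINS[4] = {A0, A1, A2, A3};

void setup() {
  Serial.begin(9600);             // must match the Processing side
}

void loop() {
  // One comma-separated line per sample: f0,f1,f2,f3,ax,ay
  for (int i = 0; i < 4; i++) {
    Serial.print(analogRead(FLEX_PINS[i]));
    Serial.print(',');
  }
  Serial.print(analogRead(A4));   // accelerometer X
  Serial.print(',');
  Serial.println(analogRead(A5)); // accelerometer Y
  delay(20);                      // ~50 Hz update rate
}
```

On the Processing side, a serial-event handler would parse each line, threshold the flex readings to detect finger bends, and emit the corresponding keyboard input to the game.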
International Journal of Virtual World and Human Computer Interaction, 2015
Recently, we have seen intensified development of head-mounted displays (HMDs). Some observers believe that the HMD form factor facilitates Augmented Reality (AR) technology, which mixes virtual content with the user's view of the world around them. One of many interesting use cases illustrating this is a smart home in which a user interacts with consumer electronic devices through a wearable AR system. Building prototypes of such wearable AR systems can be difficult and costly, since it involves a number of different devices and systems with varying technological readiness levels. The ideal prototyping method should offer high fidelity at relatively low cost and the ability to simulate a wide range of wearable AR use cases. This paper presents a proposed method, called IVAR (Immersive Virtual AR), for prototyping wearable AR interaction in a virtual environment (VE). IVAR was developed in an iterative design process that resulted in a testable setup in terms of hardware and software. Additionally, a basic pilot experiment was conducted to explore what it means to collect quantitative and qualitative data with the proposed prototyping method.