1998, IEEE Computer Graphics and Applications
Wearable computers represent a significant advance in portable computing, providing constant access to various applications. However, effective information presentation remains a challenge due to the unique nature of wearable interfaces, which often rely on head-mounted displays (HMDs). This work explores spatial information displays, highlighting strategies like head-stabilized, body-stabilized, and world-stabilized information that improve user interaction. Results indicate that users find spatial displays more intuitive and perform better in search tasks, underscoring the potential of augmented reality interfaces in wearable computing.
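To make the three stabilization strategies concrete, the following minimal Python sketch (not from the paper; the single yaw axis and function names are illustrative assumptions) shows how each mode maps an item's stored direction to a horizontal drawing offset in the HMD.

```python
def wrap_deg(a):
    """Wrap an angle to [-180, 180) degrees."""
    return (a + 180.0) % 360.0 - 180.0

def display_offset(mode, item_yaw_deg, head_yaw_deg, body_yaw_deg=0.0):
    """
    Return the horizontal angular offset (degrees) at which an information
    element is drawn in the HMD, for the three stabilization modes described
    in the abstract.

    - head-stabilized:  fixed in the display, ignores head motion
    - body-stabilized:  fixed relative to the torso, so it scrolls as the
                        head turns relative to the body
    - world-stabilized: fixed at a world direction, so it scrolls with any
                        head rotation
    """
    if mode == "head":
        return item_yaw_deg
    if mode == "body":
        return wrap_deg(body_yaw_deg + item_yaw_deg - head_yaw_deg)
    if mode == "world":
        return wrap_deg(item_yaw_deg - head_yaw_deg)
    raise ValueError(f"unknown mode: {mode}")

# Example: an item pinned 30 degrees to the right of the body while the
# user's head is turned 20 degrees to the left of the body.
print(display_offset("body", 30.0, head_yaw_deg=-20.0, body_yaw_deg=0.0))  # 50.0
```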
Proceedings of the 5th Augmented Human International Conference on - AH '14, 2014
With head-mounted displays becoming more ubiquitous, the vision of extending human object-search capabilities with a wearable system becomes feasible. Wearable cameras can recognize known objects and store their indoor locations. But how should the location of an object be represented on a wearable device like Google Glass, and how can the user be guided towards it? We implemented a prototype on a wearable computer with a head-mounted display and compared a last-seen image representation of the location against a map representation. We found a significant interaction effect favoring the last-seen image for objects that were harder to find. Additionally, all objective and subjective measures generally favored the last-seen image. The results suggest that a map representation is more helpful for gross navigation, while an image representation is more supportive for fine navigation.
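A minimal sketch of the kind of object-location store such a system might keep, returning either the last-seen snapshot or a floor-plan position; the class and field names are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class ObjectSighting:
    """Last known indoor location of a recognized object."""
    last_seen_image: str               # path to the snapshot captured at recognition time
    map_position: Tuple[float, float]  # (x, y) on the building floor plan, in meters
    room: str

class ObjectLocator:
    """Stores sightings and returns either representation for the HMD."""

    def __init__(self):
        self._sightings: Dict[str, ObjectSighting] = {}

    def record(self, name: str, sighting: ObjectSighting) -> None:
        self._sightings[name] = sighting

    def for_display(self, name: str, representation: str) -> Optional[str]:
        s = self._sightings.get(name)
        if s is None:
            return None
        if representation == "image":
            # fine navigation: show the snapshot of where the object was last seen
            return s.last_seen_image
        # gross navigation: show the object's marker position on the floor-plan map
        return f"marker at {s.map_position} in {s.room}"

locator = ObjectLocator()
locator.record("keys", ObjectSighting("snapshots/keys_0412.jpg", (12.4, 3.1), "kitchen"))
print(locator.for_display("keys", "image"))
print(locator.for_display("keys", "map"))
```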
Virtual Reality Annual …, 1998
1997
Wearable computing moves computation from the desktop to the user. We are forming a community of networked wearable computer users to explore, over a long period, the augmented realities that these systems can provide. By adapting its behavior to the user's changing environment, a body-worn computer can assist the user more intelligently, consistently, and continuously than a desktop system. A text-based augmented reality, the Remembrance Agent, is presented to illustrate this approach. Video cameras are used both to warp the visual input (mediated reality) and to sense the user's world for graphical overlay.
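The Remembrance Agent matches the user's current context against stored notes; the following crude keyword-overlap matcher is only a sketch of that idea, not Rhodes' actual implementation.

```python
import re
from collections import Counter

def tokens(text):
    """Bag of lowercase word tokens."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def suggest(notes, context, top_n=3):
    """Rank stored notes by crude word-overlap with the current context."""
    ctx = tokens(context)
    scored = []
    for note in notes:
        overlap = sum((tokens(note) & ctx).values())
        if overlap:
            scored.append((overlap, note))
    return [note for _, note in sorted(scored, reverse=True)[:top_n]]

notes = [
    "Meeting with Thad about the wearable demo on Friday",
    "Grocery list: coffee, rice, batteries",
    "Ideas for mediated reality video warping experiment",
]
print(suggest(notes, "preparing the wearable demo for Friday's meeting"))
```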
Proceedings of the 5th Augmented Human International Conference on - AH '14, 2014
When wearing a head-mounted display (HMD), the degree to which the user can concentrate on the HMD varies depending on the surrounding environment. In this work, we developed an information presentation method that considers cognitive cost and safety in wearable computing environments. The proposed method adapts its presentation to the gazing time available to the user, which varies with the surrounding environment and user context. We used an eye tracker to measure the relationship between eyestrain and watching an HMD, and also clarified the relationship between gaze time and the surrounding environment. We then used the results to develop an algorithm that changes the information presentation method. The evaluation revealed cases in which it was difficult and dangerous to gaze at an HMD and therefore necessary to change the information presentation.
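One way such an algorithm could select a presentation mode from the available gazing time is sketched below; the thresholds and mode names are illustrative assumptions, not the paper's measured values.

```python
def choose_presentation(safe_gaze_seconds, walking):
    """
    Pick how to present a notification given the gaze time the surrounding
    environment allows. Thresholds here are illustrative only.
    """
    if walking and safe_gaze_seconds < 1.0:
        return "audio-only"   # too dangerous to look at the HMD at all
    if safe_gaze_seconds < 3.0:
        return "icon"         # glanceable summary
    if safe_gaze_seconds < 8.0:
        return "headline"     # one line of text
    return "full-text"        # user can read comfortably

for t, walking in [(0.5, True), (2.0, False), (5.0, False), (20.0, False)]:
    print(t, walking, "->", choose_presentation(t, walking))
```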
2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2019
Fig. 1: The novel head-worn vibration feedback mechanism attached to the Microsoft HoloLens. In our studies, we used five vibration motors touching the forehead and temples; three further vibration elements can be attached to the back of the head. The left and middle images show the arrangement of the vibration elements; the right image shows a close-up of the mount, which was optimized for vibration feedback through a custom-built flexible mechanism that presses the vibration motor comfortably against the head for optimal skin contact.

Abstract: Head-worn devices with a narrow field of view are a common commodity for Augmented Reality. However, their limited screen space makes view management difficult; especially in dense information spaces, this can lead to visual conflicts such as overlapping labels (occlusion) and visual clutter. In this paper, we look into the potential of using audio and vibrotactile feedback to guide search and information localization. Our results indicate that users can be guided with high accuracy using audio-tactile feedback, with maximum median deviations of only 2° in longitude, 3.6° in latitude, and 0.07 m in depth. Regarding the encoding of latitude, we found superior performance when using audio, resulting in an improvement of 61% and the fastest search times. When interpreting localization cues, the maximum median deviation was 9.9° in longitude and 18% of the selected distance to be encoded, which could be reduced to 14% when using audio.
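A rough sketch of how longitude, latitude, and depth cues might be encoded as vibration intensities, audio pitch, and pulse rate; the motor layout, mappings, and parameter values are assumptions for illustration and not the paper's exact encoding.

```python
# Illustrative motor layout: five motors on the forehead and temples at these
# yaw angles (degrees) relative to straight ahead.
MOTOR_ANGLES = [-60.0, -30.0, 0.0, 30.0, 60.0]

def motor_intensities(target_yaw_deg, spread=30.0):
    """Distribute vibration intensity over the motors nearest the target yaw."""
    intensities = []
    for angle in MOTOR_ANGLES:
        d = abs(target_yaw_deg - angle)
        intensities.append(max(0.0, 1.0 - d / spread))  # linear falloff, clipped at 0
    return intensities

def latitude_to_pitch(lat_deg, base_hz=440.0):
    """Encode latitude (elevation) as audio pitch: higher target, higher tone."""
    return base_hz * 2 ** (lat_deg / 90.0)              # +/- one octave over +/-90 degrees

def depth_to_pulse_interval(depth_m, near=0.1, far=3.0):
    """Encode depth as pulse interval: closer targets pulse faster."""
    depth = min(max(depth_m, near), far)
    return 0.1 + 0.9 * (depth - near) / (far - near)    # seconds between pulses

print(motor_intensities(-20.0))
print(round(latitude_to_pitch(30.0), 1), "Hz")
print(round(depth_to_pulse_interval(1.5), 2), "s")
```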
There is currently no human-computer interface specifically designed for the alternative input and output modes used on wearable systems. For a geographically oriented wearable computer system a graphical user interface (GUI) is indispensable, yet contemporary window systems built on the desktop metaphor are unsuitable for wearable computing. In this paper we document and discuss the design and implementation of a prototype cartographic user interface for wearable computing, focused on tasks such as navigation in the field. The system can accommodate multiple sources of spatial information, including maps, imagery, and thematic data collected in the field, and exploits the two most basic advantages of wearable devices: real-time visualization and multi-modal interaction.
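A minimal sketch of the kind of layer stack such a cartographic interface might maintain; the class and field names are hypothetical and serve only to illustrate combining basemap, imagery, and thematic layers with real-time recentering on the user.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Layer:
    name: str
    kind: str            # "basemap", "imagery", or "thematic"
    visible: bool = True

@dataclass
class FieldMapView:
    """Layer stack for a wearable cartographic display."""
    layers: List[Layer] = field(default_factory=list)
    center: Tuple[float, float] = (0.0, 0.0)   # current position (lon, lat)
    zoom: int = 16

    def add(self, layer: Layer) -> None:
        self.layers.append(layer)

    def recenter(self, lon: float, lat: float) -> None:
        """Real-time visualization: keep the map centered on the user."""
        self.center = (lon, lat)

    def drawable(self) -> List[str]:
        return [layer.name for layer in self.layers if layer.visible]

view = FieldMapView()
view.add(Layer("topo basemap", "basemap"))
view.add(Layer("aerial imagery", "imagery", visible=False))
view.add(Layer("soil samples", "thematic"))
view.recenter(-123.25, 49.26)
print(view.drawable(), view.center)
```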
Wearable Computers, …, 1998
Wearable computers provide constant access to computing and communications resources. In this paper we describe how the computing power of wearables can be used to provide spatialized 3D graphics and audio cues to aid communication. The result is a wearable augmented reality communication space in which audio-enabled avatars of the remote collaborators surround the user. The user can use natural head motions to attend to the remote collaborators, can communicate freely while remaining aware of other side conversations, and can move through the communication space. In this way the conferencing space can support dozens of simultaneous users. Informal user studies suggest that wearable communication spaces may offer several advantages, both through the increased amount of information that can be accessed and through the naturalness of the interface.
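As an illustration of how head motion could drive the spatialized audio, here is a simple equal-power panning sketch assuming a flat single-yaw model; it is not the paper's audio pipeline.

```python
import math

def stereo_gains(avatar_azimuth_deg, head_yaw_deg):
    """
    Simple amplitude panning: an avatar the user is facing is equally loud in
    both ears; avatars to the user's right are panned right, and vice versa.
    """
    rel = math.radians(avatar_azimuth_deg - head_yaw_deg)
    pan = math.sin(rel)                        # -1 = full left, +1 = full right
    left = math.cos((pan + 1) * math.pi / 4)   # equal-power panning law
    right = math.sin((pan + 1) * math.pi / 4)
    return round(left, 2), round(right, 2)

# Three remote collaborators placed around the user.
avatars = {"alice": -60.0, "bob": 0.0, "carol": 60.0}
head_yaw = -60.0  # user turns to attend to alice
for name, azimuth in avatars.items():
    print(name, stereo_gains(azimuth, head_yaw))
```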
Wide-view HMDs allow users to exploit their peripheral vision in a wearable augmented reality (AR) system. However, there have been few studies of information display methods that make use of a wide field of view (FOV) in AR environments. In this study, we discuss and implement information display methods suited to several types of wide-FOV HMDs, and we evaluate their effectiveness through a user study with real and virtual wide-FOV see-through HMDs. In one of these methods, an annotation is presented near the border of the HMD's view with a lead line connected to the annotated object, which lies outside the HMD's viewing plane. An immersive CAVE-like environment is used to simulate a virtual see-through HMD as well as a virtual walk-through environment, and a prototype hyperboloidal head-mounted projective display (HHMPD) is used as a real see-through HMD. Experimental results show that the methods with lead lines improve target discovery rates compared to a method that overlays annotations directly at the target's position. Additionally, adding a blinking effect to the annotations was shown to have little impact on target discovery rates regardless of viewing angle.
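A small sketch of the lead-line placement idea: if the annotated object lies outside the field of view, the annotation is clamped just inside the border along the object's direction, and a lead line is drawn from there toward the object. The FOV and margin values are assumptions for illustration.

```python
def annotation_anchor(target_yaw_deg, target_pitch_deg,
                      half_fov_h=50.0, half_fov_v=30.0, margin=3.0):
    """
    Return the annotation anchor (yaw, pitch in degrees) and, if the target
    is off-screen, a lead line from the anchor toward the target.
    """
    inside = abs(target_yaw_deg) <= half_fov_h and abs(target_pitch_deg) <= half_fov_v
    if inside:
        return (target_yaw_deg, target_pitch_deg), None
    # Scale the direction vector so it just reaches the border (minus a margin).
    sx = (half_fov_h - margin) / abs(target_yaw_deg) if target_yaw_deg else float("inf")
    sy = (half_fov_v - margin) / abs(target_pitch_deg) if target_pitch_deg else float("inf")
    s = min(sx, sy)
    anchor = (target_yaw_deg * s, target_pitch_deg * s)
    lead_line = (anchor, (target_yaw_deg, target_pitch_deg))
    return anchor, lead_line

print(annotation_anchor(20.0, 10.0))  # inside: draw directly over the object
print(annotation_anchor(80.0, 10.0))  # outside: clamped near the right border, with a lead line
```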