2009
…
Abstract We introduce a novel touch-based interaction technique for tangible user interfaces (TUIs) in Augmented Reality (AR) applications. The technique allows for direct access and manipulation of virtual content on a registered tracking target, is robust and lightweight, and can be applied in numerous tracking and interaction scenarios.
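The abstract does not spell out how a screen touch reaches content registered to the tracked target; a minimal sketch of one plausible core step, assuming a pinhole camera model and a tracker-supplied target pose (all names and parameters below are hypothetical, not taken from the paper), is:

```python
# Hypothetical sketch: mapping a 2-D screen touch to a point on a registered
# tracking target so virtual content lying on the target can be selected.
import numpy as np

def touch_to_target_point(touch_px, cam_intrinsics, target_pose):
    """Intersect the camera ray through a touched pixel with the target's z = 0 plane.

    touch_px       -- (u, v) touch position in pixels
    cam_intrinsics -- 3x3 camera matrix K (assumed known from calibration)
    target_pose    -- 4x4 camera-from-target transform (assumed given by the tracker)
    Returns the touch position in target coordinates (x, y on the target plane).
    """
    u, v = touch_px
    # Ray direction in camera coordinates for the touched pixel.
    ray_cam = np.linalg.inv(cam_intrinsics) @ np.array([u, v, 1.0])
    # Express the target plane (z = 0 in target space) in camera space.
    R, t = target_pose[:3, :3], target_pose[:3, 3]
    normal_cam = R @ np.array([0.0, 0.0, 1.0])
    # Ray-plane intersection; the camera centre (0, 0, 0) is the ray origin.
    s = (normal_cam @ t) / (normal_cam @ ray_cam)
    hit_cam = s * ray_cam
    # Transform the hit point back into target coordinates.
    hit_target = R.T @ (hit_cam - t)
    return hit_target[:2]

# Example: principal point (320, 240), 500 px focal length, target 0.5 m in front.
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
pose = np.eye(4); pose[2, 3] = 0.5
print(touch_to_target_point((340, 260), K, pose))
```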
2006
Augmented Reality (AR) and Tangible User Interfaces (TUIs) have been shown to make human-computer interaction more intuitive by adding the richness of the tactile sense. Recent implementations in the field, however, have inherited the 2D Graphical User Interface (GUI) scheme and work as isolated systems. Consequently, these systems are limited in supporting intuitive interfaces and a wider range of applications. This paper presents an AR-based tangible interaction system for table-top interaction environments.
… on Ubiquitous Virtual …, 2010
In this work we evaluated the usability of tangible user interaction for traditional desktop augmented reality environments. More specifically, we compared physical sliders, tracked paddles, and traditional mouse input for a system control task. While task accuracy was the same for all interfaces, mouse input was the fastest and input with a tracked paddle the slowest; performance with the physical sliders fell between the two. We present these results along with various findings from user comments, and discuss how they may influence the design of future desktop AR systems.
2005
ABSTRACT Augmented Reality (AR) and Tangible User Interfaces (TUIs) have been shown to make human-computer interaction more intuitive by adding the richness of the tactile sense. Current implementations in these fields, however, have inherited the 2D Graphical User Interface (GUI) scheme and work as isolated systems. Consequently, the systems neither support an intuitive interface nor combine easily with one another. This paper presents an AR-based tangible interaction system for a table-top interaction environment that exploits user context.
Human-Computer Interaction Series, 2009
In this chapter, we discuss the design of tangible interaction techniques for Mixed Reality environments. We begin by recalling some conceptual models of tangible interaction. Then, we propose an engineering-oriented software/hardware co-design process, based on our experience in developing tangible user interfaces. We present three different tangible user interfaces for real-world applications, and analyse the feedback from the user studies that we conducted. In summary, we conclude that, since tangible user interfaces are part of the real world and provide seamless interaction with virtual worlds, they are well suited to mixing reality and virtuality. Hence, tangible interaction supports a user's virtual tasks, especially manipulating and controlling 3D digital data in 3D space.
Proceedings of the 2nd ACM symposium on Spatial user interaction, 2014
Nowadays, handheld devices are capable of displaying augmented environments in which virtual content overlaps reality. To interact with these environments it is necessary to use a manipulation technique. The objective of a manipulation technique is to define how the input data modify the properties of the virtual objects. Current devices have multi-touch screens that can serve as input. Additionally, the position and rotation of the device can also be used as input, creating both an opportunity and a design challenge. In this paper we compare three manipulation techniques that employ, respectively, multi-touch input, device movement, and a combination of both. A user evaluation on a docking task revealed that combining multi-touch and device movement yields the best task completion time and efficiency. Nevertheless, using only device movement and orientation is more intuitive and performs worse only for large rotations.
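A minimal sketch of how such a combined technique could update an object's pose each frame, assuming a screen-space touch drag and an IMU-style device orientation (the names, the pixel-to-metre scale, and the frame conventions below are assumptions for illustration, not the paper's implementation):

```python
# Hypothetical sketch: combining a multi-touch drag with the handheld device's
# rotation to manipulate a virtual object in a handheld AR scene.
import numpy as np

def update_object_pose(obj_pos, obj_rot, touch_delta_px,
                       device_rot_prev, device_rot_now, px_to_world=0.001):
    """One frame of the combined manipulation technique.

    obj_pos        -- 3-vector, object position in world coordinates
    obj_rot        -- 3x3 object rotation matrix
    touch_delta_px -- (dx, dy) finger drag on the screen since the last frame
    device_rot_*   -- 3x3 device orientation matrices (e.g. from the IMU)
    px_to_world    -- assumed scale factor from screen pixels to metres
    """
    # The touch drag translates the object parallel to the current screen plane.
    dx, dy = touch_delta_px
    translation = device_rot_now @ np.array([dx * px_to_world, -dy * px_to_world, 0.0])
    # Rotating the device applies the same relative rotation to the object.
    relative_rot = device_rot_now @ device_rot_prev.T
    return obj_pos + translation, relative_rot @ obj_rot

# Example: a 10 px drag to the right while the device yaws by 5 degrees.
yaw = np.deg2rad(5)
rot_now = np.array([[np.cos(yaw), -np.sin(yaw), 0],
                    [np.sin(yaw),  np.cos(yaw), 0],
                    [0, 0, 1]])
pos, rot = update_object_pose(np.zeros(3), np.eye(3), (10, 0), np.eye(3), rot_now)
print(pos, rot, sep="\n")
```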
Proceedings of the 2nd …, 2004
This paper presents a technique for natural, fingertip-based interaction with virtual objects in Augmented Reality (AR) environments. We use image processing software and finger- and hand-based fiducial markers to track gestures from the user, stencil buffering to enable the user to see their fingers at all times, and fingertip-based haptic feedback devices to enable the user to feel virtual objects. Unlike previous AR interfaces, this approach allows users to interact with virtual content using natural hand gestures. The paper describes how these techniques were applied in an urban planning interface, and also presents preliminary informal usability results.
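The stencil-buffer step can be illustrated with a software analogue, assuming a binary hand mask is available from the finger tracking (this is a hypothetical simplification for illustration, not the paper's rendering code):

```python
# Hypothetical software analogue of a stencil-buffer pass: virtual content is drawn
# only where a binary hand/finger mask is empty, so the user's real fingers always
# remain visible in front of the virtual objects.
import numpy as np

def composite(camera_frame, virtual_layer, hand_mask):
    """camera_frame, virtual_layer: HxWx3 uint8 images; hand_mask: HxW bool (True = finger)."""
    out = camera_frame.copy()
    # Draw a virtual pixel only where it exists and no finger covers it.
    draw = (~hand_mask) & (virtual_layer.sum(axis=2) > 0)
    out[draw] = virtual_layer[draw]
    return out

# Tiny example: a 4x4 frame with a virtual block partly occluded by a "finger" column.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
virtual = np.zeros_like(frame); virtual[1:3, 1:3] = 255
mask = np.zeros((4, 4), dtype=bool); mask[:, 2] = True
print(composite(frame, virtual, mask)[..., 0])
```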
Augmented Reality, 2001. …, 2001
Optical tracking systems allow three-dimensional input for virtual environment applications with high precision and without annoying cables. Spontaneous and intuitive interaction is possible through gestures. In this paper, we present a finger tracker that allows gestural interaction and is simple, cheap, fast, robust against occlusion and accurate. It is based on a marked glove, a stereoscopic tracking system and a kinematic 3-d model of the human finger. Within our augmented reality application scenario, the user is able to grab, translate, rotate, and release objects in an intuitive way. We demonstrate our tracking system in an augmented reality chess game allowing a user to interact with virtual objects.
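A minimal sketch of the kind of kinematic finger model such a tracker fits, reduced here to a planar three-link chain for illustration (the link lengths and joint angles below are made-up values, not the paper's parameters):

```python
# Hypothetical sketch of a kinematic finger model: three links with cumulative joint
# angles; the fingertip position follows by forward kinematics. A glove-based tracker
# would fit such a model to the observed marker positions.
import numpy as np

def fingertip_position(base, joint_angles, link_lengths):
    """base: 2-vector at the knuckle; joint_angles, link_lengths: three values each."""
    pos = np.asarray(base, dtype=float)
    angle = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        angle += theta                                   # angles accumulate along the chain
        pos = pos + length * np.array([np.cos(angle), np.sin(angle)])
    return pos

# Example: a slightly flexed index finger (angles in radians, lengths in metres).
print(fingertip_position([0, 0], [0.2, 0.4, 0.3], [0.045, 0.025, 0.02]))
```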
2002
This paper describes a novel use of augmented reality for the visualisation of virtual objects as part of the move towards pervasive computing. It uses fiducial markers as switches to "toggle" the displayed properties of the virtual objects. Using collision detection, fiducial markers are also used to track and select nodes within virtual objects. This research uses the ARToolkit Version 2.33 and acts as a component within the DSTO's InVision framework.
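The toggle and collision-detection logic could look roughly like the following sketch (the ARToolKit calls themselves are omitted; the class name, threshold, and node layout are hypothetical):

```python
# Hypothetical sketch: a fiducial marker acts as a toggle switch when it (re)appears,
# and marker/node collision detection selects nodes of a virtual object.
import numpy as np

class ToggleMarker:
    def __init__(self):
        self.visible = False
        self.state = False                      # the displayed property being toggled

    def update(self, detected_now):
        if detected_now and not self.visible:   # rising edge: the marker just reappeared
            self.state = not self.state
        self.visible = detected_now
        return self.state

def select_node(marker_pos, node_positions, radius=0.03):
    """Return the index of the node the marker collides with, or None."""
    dists = np.linalg.norm(node_positions - marker_pos, axis=1)
    hit = int(np.argmin(dists))
    return hit if dists[hit] < radius else None

toggle = ToggleMarker()
print([toggle.update(v) for v in [False, True, True, False, True]])  # toggles on each reappearance
nodes = np.array([[0.0, 0, 0], [0.1, 0, 0]])
print(select_node(np.array([0.095, 0, 0.01]), nodes))
```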
Recent advances in mobile computing and augmented reality (AR) technology have led to the popularization of mobile AR applications. Touch screen interfaces are common in mobile devices, and are also widely used in AR applications running on mobile devices such as smartphones. However, due to unsteady camera view movement in handheld AR environments, it is hard to carry out precise interactions, such as drawing, especially when tracing physical objects. In this paper, we investigate two types of interaction techniques, Freeze-Set-Go and Snap-To-Feature, that help users perform more accurate touch-screen-based AR interactions. The two techniques are compared in a user experiment with a task of tracing physical objects, which can be encountered when annotating or modeling physical objects within the AR scene. The results from the experiment show that combining the two techniques makes a significant difference in the accuracy and usability of touch-screen-based AR interaction.
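A minimal sketch of the Snap-To-Feature idea, assuming detected 2-D feature points and a pixel snap radius (both are assumptions for illustration; the paper's actual feature detection and thresholds are not given here):

```python
# Hypothetical sketch of Snap-To-Feature: a touch point is pulled to the nearest
# detected image feature within a small radius, otherwise it stays where the finger landed.
import numpy as np

def snap_to_feature(touch_px, feature_px, max_snap_dist=15.0):
    """touch_px: (u, v); feature_px: Nx2 array of detected feature positions in pixels."""
    touch = np.asarray(touch_px, dtype=float)
    if len(feature_px) == 0:
        return touch
    dists = np.linalg.norm(feature_px - touch, axis=1)
    nearest = int(np.argmin(dists))
    return feature_px[nearest] if dists[nearest] <= max_snap_dist else touch

features = np.array([[100.0, 100.0], [140.0, 95.0]])
print(snap_to_feature((103, 108), features))   # close enough: snaps to (100, 100)
print(snap_to_feature((200, 200), features))   # too far from any feature: unchanged
```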
… . Ambient, Ubiquitous and …, 2009
Ambient Interface research has the goal of embedding technology that disappears into the user's surroundings. In many ways Augmented Reality (AR) technology is complementary to this, in that AR interfaces seamlessly enhance the real environment with virtual information overlays. The two merge in context-aware Ambient AR applications, which allow users to easily perceive and interact with Ambient Interfaces through AR overlays of the real world. In this paper we describe how Tangible Interaction techniques can be used for Ambient AR applications. We present a conceptual framework for Ambient Tangible AR Interfaces, a new generation of software and hardware tools for developing them, and methods for evaluating them.
Proceedings of Mensch und Computer 2019
… Reality, 2000 (ISAR …), 2000
MobiQuitous 2020 - 17th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services, 2020
Proc. Informatik, 2006
2014 4th International Conference on Image Processing Theory, Tools and Applications (IPTA), 2014
IEEE Pervasive Computing, 2008
Augmented Reality, 2010
Journal of Computer Science and Technology, 2012
Proceedings of the 11th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry - VRCAI '12, 2012