Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robots and Systems. Innovative Robotics for Real-World Applications. IROS '97
We present a haptic rendering framework that allows the tactile display of complex virtual environments. This framework allows surface constraints, surface shading, friction, texture and other effects to be modeled solely by updating the position of a representative object, the "virtual proxy." This abstraction reduces the task of the haptic servo control loop to the minimization of the error between the user's position and that of the proxy. This framework has been implemented in a system that is able to haptically render virtual environments of a complexity that is near, and often in excess of, the capabilities of current interactive graphics systems.
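The core of the proxy abstraction for a single planar constraint can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the plane representation, and the stiffness value are assumptions chosen for the example.

```python
import numpy as np

def proxy_update(device, plane_n, plane_d):
    """New proxy position against the half-space constraint n.x >= d:
    the proxy tracks the device in free space, and is projected back
    onto the plane when the device penetrates it."""
    dist = plane_n.dot(device) - plane_d   # signed distance to the plane
    if dist >= 0.0:
        return device.copy()               # free space: proxy follows the device
    return device - dist * plane_n         # project onto the constraint surface

def proxy_force(proxy, device, k=500.0):
    """Servo force minimizing the device-proxy error:
    a stiff virtual spring pulling the device toward the proxy."""
    return k * (proxy - device)
```

With the device just below the plane, the proxy stays on the surface and the rendered force points along the surface normal, which is exactly the error-minimization view of contact described above.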
Proceedings of the 24th annual conference on Computer graphics and interactive techniques - SIGGRAPH '97, 1997
Force feedback coupled with visual display allows people to interact intuitively with complex virtual environments. For this synergy of haptics and graphics to flourish, however, haptic systems must be capable of modeling environments with the same richness, complexity and interactivity that can be found in existing graphic systems. To help meet this challenge, we have developed a haptic rendering system that allows for the efficient tactile display of graphical information. The system uses a common high-level framework to model contact constraints, surface shading, friction and texture. The multilevel control system also helps ensure that the haptic device will remain stable even as the limits of the renderer's capabilities are reached.
2005
For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. To date, the advances in computer graphics allow us to see virtual objects and avatars, to hear them, to move them, and to touch them. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments [Insko 2001].
Journal of Robotic Systems, 2001
Haptics is an emerging technology that permits direct "hands-on" interaction with a virtual environment. A haptic device uses mechanical actuators to physically push a user's finger or hand to give the sensation that he or she would have when interacting with a real physical environment. These force feedback systems have many applications, from training a surgeon to perform an operation, to assisting a child in understanding the behavior of a lever or pulley. In this paper we discuss methods and techniques to allow realistic and robust haptic interactions between a human and a complex dynamic virtual environment. Beyond modeling object penetration constraints, this work demonstrates how shading, friction, texture, and dynamics can be generated to create compelling and realistic virtual worlds.
Proceedings of the 1995 symposium on Interactive 3D graphics - SI3D '95, 1995
Haptic rendering is the process of computing and generating forces in response to user interactions with virtual objects. Recent efforts by our team at MIT's AI laboratory have resulted in the development of haptic interface devices and algorithms for generating the forces of interaction with virtual objects. This paper focuses on the software techniques needed to generate sensations of contact interaction and material properties. In particular, the techniques we describe are appropriate for use with the Phantom haptic interface, a force generating display device developed in our laboratory. We also briefly describe a technique for representing and rendering the feel of arbitrary polyhedral shapes and address issues related to rendering the feel of non-homogeneous materials. A number of demonstrations of simple haptic tasks which combine our rendering techniques are also described.
We present a rigid-body simulation for multi-contact haptic interaction. The simulation is designed to make use of modern multiprocessor machines and the framework for this is discussed. An existing haptic rendering algorithm is extended to: facilitate simple implementation on a number of object types, enable the use of arbitrary objects as haptic cursors and allow multiple object contacts on the same haptic cursor. We also justify the use of hard-constraint based methods for rigid-body dynamics and discuss our implementation.
Teleoperators and Virtual Environments, 1999
Computer haptics, an emerging field of research that is analogous to computer graphics, is concerned with the generation and rendering of haptic virtual objects. In this paper, we propose an efficient haptic rendering method for displaying the feel of 3-D polyhedral objects in virtual environments (VEs). Using this method and a haptic interface device, the users can manually explore and feel the shape and surface details of virtual objects. The main component of our rendering method is the "neighborhood watch" algorithm that takes advantage of precomputed connectivity information for detecting collisions between the end effector of a force-reflecting robot and polyhedral objects in VEs. We use a hierarchical database, multithreading techniques, and efficient search procedures to reduce the computational time such that the haptic servo rate after the first contact is essentially independent of the number of polygons that represent the object. We also propose efficient methods for displaying surface properties of objects such as haptic texture and friction. Our haptic-texturing techniques and friction model can add surface details onto convex or concave 3-D polygonal surfaces. These haptic-rendering techniques can be extended to display dynamics of rigid and deformable objects.
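The local-search idea behind such a connectivity-based scheme can be sketched as a greedy walk over a precomputed adjacency graph: starting from the last contacted face, move to whichever neighbor is closer to the probe until no neighbor improves. The names and the scalar distance function below are illustrative assumptions, not the authors' actual data structures.

```python
def closest_face_local(pos, faces, adjacency, start, dist_fn):
    """Greedy local search over mesh connectivity. From the previously
    contacted face, walk to closer neighbors until a local minimum is
    reached. Cost depends on the walk length between servo ticks, not
    on the total polygon count of the model."""
    current = start
    best = dist_fn(pos, faces[current])
    improved = True
    while improved:
        improved = False
        for nb in adjacency[current]:
            d = dist_fn(pos, faces[nb])
            if d < best:
                best, current, improved = d, nb, True
    return current, best
```

Because the probe moves only a small distance between 1 kHz servo updates, the walk is typically one or two hops, which is why the servo rate after first contact is nearly independent of polygon count.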
2006
The development and evaluation of haptic rendering algorithms presents two unique challenges. Firstly, the haptic information channel is fundamentally bidirectional, so the output of a haptic environment is fundamentally dependent on user input, which is difficult to reliably reproduce. Additionally, it is difficult to compare haptic results to real-world, "gold standard" results, since such a comparison requires applying identical inputs to real and virtual objects and measuring the resulting forces, which requires hardware that is not widely available. We have addressed these challenges by building and releasing several sets of position and force information, collected by physically scanning a set of real-world objects, along with virtual models of those objects. We demonstrate novel applications of this data set for the development, debugging, optimization, evaluation, and comparison of haptic rendering algorithms.
IEEE Computer Graphics and Applications, 2004
In the past decade we've seen an enormous increase in interest in the science of haptics. The quest for better understanding and use of haptic abilities (both human and nonhuman) has manifested itself in heightened activity in disciplines ranging from robotics and telerobotics, to computational geometry and computer graphics, to psychophysics, cognitive science, and the neurosciences. This issue of IEEE CG&A focuses on haptic rendering. Haptics broadly refers to touch interactions (physical contact) that occur for the purpose of perception or manipulation of objects. These interactions can be between a human hand and a real object; a robot end-effector and a real object; a human hand and a simulated object (via haptic interface devices); or a variety of combinations of human and machine interactions with real, remote, or virtual objects. Rendering refers to the process by which desired sensory stimuli are imposed on the user to convey information about a virtual haptic object. At the simplest level, this information is contained in the representation of the object's physical attributes: shape, elasticity, texture, mass, and so on. Just as a sphere visually rendered with simple shading techniques will look different from the same sphere rendered with ray-tracing techniques, a sphere haptically rendered with a simple penalty function will feel different from the same sphere rendered with techniques that also convey mechanical textures and surface friction. As in the days when people were astonished to see their first wire-frame computer-generated images, people are now astonished to feel their first virtual object. Yet the rendering techniques we use today will someday seem like yesterday's wire-frame displays: the first steps into a vast field. To help readers understand the issues discussed in this issue's theme articles, we briefly survey haptic systems and the techniques needed for rendering the way objects feel.
We also discuss basic haptic-rendering algorithms that help us decide what force should be exerted and how we will deliver these forces to users. A sidebar discusses key points in the history of haptics.
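The "simple penalty function" mentioned above can be illustrated for a sphere: the force is zero outside the object and, once the probe penetrates, proportional to penetration depth along the outward surface normal. A minimal sketch; the function name and stiffness value are illustrative assumptions.

```python
import numpy as np

def sphere_penalty_force(p, center, radius, k=800.0):
    """Penalty-based contact force for a rigid sphere: zero outside,
    k * (penetration depth) along the outward normal inside."""
    d = p - center
    dist = np.linalg.norm(d)
    if dist >= radius or dist == 0.0:
        return np.zeros(3)
    n = d / dist              # outward surface normal at the probe
    depth = radius - dist     # penetration depth
    return k * depth * n
```

As the survey notes, this sphere will feel noticeably different once texture and friction models are layered on top of the same penalty force.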
2005
Most human-computer interactive systems focus primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Haptic interfaces have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance the level of understanding of complex data sets.
Computer Animation and Virtual Worlds, 2006
This paper describes our efforts in bringing haptics closer to current dynamic virtual environments (VE). These interactive 3D worlds make more and more use of physical simulations in order to increase realism. As a first step in closing the gap, we propose haptic travel, which allows users to feel how their virtual representation navigates through the simulated world. In this work, we show how we coupled stable haptic rendering to physical simulation in order to achieve this. By generating a force feedback field, based on the user's input in combination with collision information provided by a rigid body simulator, we managed to provide the user with useful information on what is happening to their virtual representation. A humanoid animated character, which represents the user, is coupled to the rigid body object that represents the user in physical space. This character is animated according to the travel motions that the physical object makes, depending on user input from the haptic device. Our approach is suitable for a whole set of applications and input devices and can reduce the number of devices necessary to interact in VEs.
Proceedings of The Institution of Mechanical Engineers Part I-journal of Systems and Control Engineering, 2007
This paper presents a direct method for haptic rendering of a virtual object in which the object is represented as a virtual kinematic chain (virtual manipulator). The joint angles of the virtual manipulator (VM) are considered as parameters for the object's surface. The present algorithm is based on a closest-point approach that determines the joint angles (surface parameters) uniquely. The joint angles parameterize a point closest to the haptic device end-effector, and an impedance-type controller is designed for the haptic device that accounts for the haptic rendering algorithm. Within the control law, only the forces orthogonal to the object surface are rendered using the Jacobian of the VM, and the user feels a smooth surface whose stability (considering the coupled haptic device dynamics and closest-point algorithm kinematics) is guaranteed. Additional motion constraints on the virtual surface are also created by penalizing the joint angles of the VM, showing how this approach provides an efficient tool in designing a CAD (computer aided design) model.
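For the simplest possible virtual manipulator, a single revolute joint whose tip traces a circle, the closest-point parameterization and the normal-only force have closed forms. The toy sketch below assumes that one-link case; the names and stiffness value are illustrative, and this is not the paper's general algorithm.

```python
import math

def vm_closest_point(px, py, link_len):
    """Joint angle of a one-link 'virtual manipulator' whose tip traces
    a circle of radius link_len: the angle parameterizing the surface
    point closest to the probe (px, py)."""
    return math.atan2(py, px)

def vm_normal_force(px, py, link_len, k=300.0):
    """Render only the force orthogonal to the (circular) surface:
    a spring along the outward radial normal, zero outside the object."""
    r = math.hypot(px, py)
    if r >= link_len or r == 0.0:
        return (0.0, 0.0)
    depth = link_len - r
    nx, ny = px / r, py / r   # outward surface normal
    return (k * depth * nx, k * depth * ny)
```

Rendering only the normal component is what makes the surface feel smooth: no spurious tangential forces resist sliding along the parameterized surface.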
IEEE Computer Graphics and Applications, 2004
New sophisticated haptic-rendering algorithms let users experience virtual objects through touch. We systematically investigate the unrealistic behavior of virtual haptic textures. The emerging science of haptic rendering consists of delivering properties of physical objects through the sense of touch. Owing to the recent development of sophisticated haptic-rendering algorithms, users can now experience virtual objects through touch in many exciting applications.
International Journal of Virtual Reality, 2008
Interaction techniques play a vital role in enriching virtual environments and have profound effects on the user's performance and sense of presence, as well as on the realism of the Virtual Environment (VE). In this paper we present a new haptic guide model for object selection. It is used to augment the Follow-Me 3D interaction technique dedicated to object selection and manipulation. The fundamental concept of the Follow-Me technique is to divide the VE into three zones (free manipulation, visual assistance, and haptic assistance zones). Each of the three zones is characterized by a specific interaction granularity, which defines the properties of interaction in that zone. This splitting of the VE aims to provide both precision and assistance (zones of visual and haptic guidance) near the object to be reached or manipulated, while maintaining realistic and free interaction elsewhere in the VE (free manipulation zone). The haptic and visual guides assist the user in object selection. The paper presents two different models of the haptic guides, one for free and multidirectional selection and the second for precise, single-direction selection. The evaluation and comparison of these haptic guides are given, and their effect on the user's performance in object selection in the VE is investigated.
Proceedings 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. HAPTICS 2002, 2002
Soft objects are often desired in applications such as virtual surgery training. Soft object simulations are computationally intensive because object deformation involves numerically solving a large number of differential equations. However, realistic force feedback requires deformation be computed fast and graphic feedback requires deformation be highly detailed. In this paper, we propose an approach that balances these requirements by subdividing the area of interest on a relatively coarse mesh model. Thus we keep the number of nodes of the model under control so that the simulation can be run at a sufficiently high rate for force feedback. The model we use is based on a mass-spring model. When a portion of the surface is subdivided, new values of mass and spring constants are determined such that computed force feedback offers the user the same reaction force as before subdivision.
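One way to keep the reaction force unchanged across a local subdivision is to scale the new spring constants so that the effective stiffness felt at the endpoints is preserved: splitting a spring into n equal serial segments requires each segment to have stiffness n·k, since series stiffness combines as 1/k = Σ 1/kᵢ. The sketch below illustrates that bookkeeping; the function names are assumptions, not the paper's exact update rule.

```python
def series_stiffness(ks):
    """Effective stiffness of springs connected in series:
    1/k_eff = sum(1/k_i)."""
    return 1.0 / sum(1.0 / k for k in ks)

def subdivide_spring(k, n=2):
    """Split one spring of stiffness k into n equal serial segments
    while preserving the effective stiffness at the endpoints:
    each segment gets stiffness n * k."""
    return [n * k] * n
```

Preserving the effective stiffness is what lets the user feel the same reaction force before and after the area of interest is refined.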
International Conference on Informatics in Control, Automation and Robotics, 2005
This paper presents a human-scale multi-modal virtual environment. The user interacts with virtual worlds using a large-scale bimanual haptic interface called the SPIDAR-H. This interface is used to track the user's hand movements and to display various aspects of force feedback, mainly associated with contact, weight, and inertia. In order to increase the accuracy of the system, a calibration method is proposed.
Lecture Notes in Computer Science, 2006
In a palpation procedure, medical doctors push and rub an organ's surface and are provided with sensations of distributed pressure and contact force (reflecting force) for discerning suspect regions. This paper suggests a real-time area-based haptic rendering model to describe distributed pressure and contact force simultaneously, and presents a haptic interface system that generates surface properties in accordance with the haptic rendering algorithm. We represent the haptic model using the shape-retaining chain link (S-Chain) framework for fast and stable computation of the contact force and distributed pressure from a volumetric virtual object. In addition, we developed a compact pin-array type of tactile display unit and attached it to the PHANToM haptic device so that the two complement each other. In order to evaluate the performance of the proposed scheme, experiments were conducted with non-homogeneous volumetric cubic objects consisting of approximately 500,000 volume elements at a haptic update rate of 1000 Hz. The experimental results show that, compared to point contact, area contact provides users with more precise perception of the shape and softness of the object's composition, and that our proposed system satisfies the real-time and realism constraints required to be useful for virtual reality applications.
11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2003. HAPTICS 2003. Proceedings., 2003
Haptic rendering of complex objects in virtual environments is computationally intensive. In this paper we begin investigating a new category of approaches to reducing the computation in haptic rendering. Our approach is based on the hypothesis that the accuracy of haptic perception might be limited. Results of the experiments described in this paper suggest that subjects might not be able to distinguish two haptic objects once they are beyond some refinement level. This limitation of haptic perception may be exploited in haptic rendering by replacing a fine object with a coarser one to reduce scene complexity.
Advanced Robotics, 2007
Although people usually contact a surface over some area rather than at a point, most haptic devices allow a user to interact with a virtual object at only one point at a time, and likewise most haptic rendering algorithms deal only with such situations. In a palpation procedure, medical doctors push and rub an organ's surface, and are provided with sensations of distributed pressure and contact force (reflecting force) for discerning suspect areas of the organ. In this paper, we suggest real-time area-based haptic rendering to describe distributed pressure and contact force simultaneously, and present a haptic interface system that generates surface properties in accordance with the haptic rendering algorithm. We represent the haptic model using the shape-retaining chain link (S-chain) framework for fast and stable computation of the contact force and distributed pressure from a volumetric virtual object. In addition, we developed a compact pin-array-type tactile display unit and attached it to the PHANToM haptic device so that the two complement each other. For the evaluation, experiments were conducted with non-homogeneous volumetric cubic objects consisting of approximately 500,000 volume elements. The experimental results show that, compared to point contact, area contact provides the user with more precise perception of the shape and softness of the object's composition, and that our proposed system satisfies the real-time and realism constraints required to be useful for virtual reality applications.