2008, Bildverarbeitung für die Medizin 2008
In this paper we propose a software-based approach to simulate the haptics of the human pulse. The effect can then be explored interactively by virtual palpation. The algorithm features a flexible, user-defined setup. In addition, the simulation takes dynamic parameters into account and supports many different use cases.
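A minimal sketch of the kind of periodic pulse waveform such a simulation might drive (not the authors' algorithm; the waveform shape, heart-rate parameter, and amplitude scaling are illustrative assumptions):

```python
import math

def pulse_force(t, heart_rate_bpm=70.0, amplitude=1.0):
    """Periodic pulse-force magnitude (N) at time t (s).

    A crude arterial-pulse profile: one sharp systolic peak per
    cardiac cycle, modeled as a Gaussian bump within each beat.
    """
    period = 60.0 / heart_rate_bpm        # seconds per beat
    phase = (t % period) / period         # 0..1 within the beat
    peak, width = 0.15, 0.05              # systolic peak position and width
    return amplitude * math.exp(-((phase - peak) / width) ** 2)

# In a haptic loop this magnitude would be scaled by the probe's
# proximity to the simulated artery and added to the contact force.
```

The dynamic parameters mentioned in the abstract (e.g., heart rate) map naturally onto the function's arguments.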
Palpation is a physical examination technique where objects, e.g., organs or body parts, are touched with fingers to determine their size, shape, consistency and location. Many medical procedures utilize palpation as a supplementary interaction technique, and it can therefore be considered an essential basic method. However, palpation is mostly neglected in medical training simulators, with the exception of very specialized simulators that solely focus on palpation, e.g., for manual cancer detection. In this article we propose a novel approach to enable haptic palpation interaction for virtual reality-based medical simulators. The main contribution is an extensive user study conducted with a large group of medical experts. To provide a plausible simulation framework for this user study, we contribute a novel and detailed interaction algorithm for palpation with tissue dragging, which utilizes a multi-object force algorithm to support multiple layers of anatomy and a pulse force algorithm for simulation of an arterial pulse. Furthermore, we propose a modification for an off-the-shelf haptic device by adding a lightweight palpation pad to support a more realistic finger-grip configuration for palpation tasks. The user study itself has been conducted on a medical training simulator prototype with a specific procedure from regional anesthesia, which strongly depends on palpation. The prototype utilizes a co-rotational finite element approach for soft-tissue simulation and provides bimanual interaction by combining the aforementioned techniques with needle insertion for the other hand. The results of the user study suggest reasonable face validity of the simulator prototype and in particular validate the medical plausibility of the proposed palpation interaction algorithm.
Lecture Notes in Computer Science, 2006
In a palpation procedure, medical doctors push and rub an organ's surface and are provided the sensation of distributed pressure and contact force (reflecting force) for discerning suspicious regions. This paper suggests a real-time area-based haptic rendering model to describe distributed pressure and contact force simultaneously, and presents a haptic interface system to generate surface properties in accordance with the haptic rendering algorithm. We represent the haptic model using the shape-retaining chain link (S-chain) framework for a fast and stable computation of the contact force and distributed pressure from a volumetric virtual object. In addition, we developed a compact pin-array type of tactile display unit and attached it to the PHANToM™ haptic device to complement each other. In order to evaluate the performance of the proposed scheme, experiments have been conducted with non-homogeneous volumetric cubic objects consisting of approximately 500,000 volume elements at a haptic update rate of 1000 Hz. The experimental results show that, compared to point contact, area contact provides the users with more precise perception of the shape and softness of the object's composition, and that our proposed system satisfies the real-time and realism constraints needed to be useful for virtual reality applications.
Advanced Robotics, 2007
Although people usually contact a surface with some area rather than a point, most haptic devices allow a user to interact with a virtual object at only one point at a time, and likewise most haptic rendering algorithms deal with such situations only. In a palpation procedure, medical doctors push and rub an organ's surface, and are provided the sensation of distributed pressure and contact force (reflecting force) for discerning suspicious areas of the organ. In this paper, we suggest real-time area-based haptic rendering to describe distributed pressure and contact force simultaneously, and present a haptic interface system to generate surface properties in accordance with the haptic rendering algorithm. We represent the haptic model using the shape-retaining chain link (S-chain) framework for a fast and stable computation of the contact force and distributed pressure from a volumetric virtual object. In addition, we developed a compact pin-array-type tactile display unit and attached it to the PHANToM™ haptic device to complement each other. For the evaluation, experiments were conducted with non-homogeneous volumetric cubic objects consisting of approximately 500,000 volume elements. The experimental results show that, compared to point contact, area contact provides the user with more precise perception of the shape and softness of the object's composition, and that our proposed system satisfies the real-time and realism constraints needed to be useful for virtual reality applications.
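The S-chain internals are not reproduced here; the sketch below only illustrates the general idea of area contact described above: summing per-pin penalty forces over a contact patch to obtain both a pressure distribution (for the pin-array display) and a net reflecting force (for the kinesthetic device). The stiffness and pin area are made-up values.

```python
def area_contact(pin_depths, stiffness=500.0, pin_area=1e-6):
    """Penalty-style area contact: per-pin pressures plus net force.

    pin_depths: penetration depth (m) of each pin in the contact patch;
    non-positive depths mean that pin is not in contact.
    Returns (pressures in Pa, net normal force in N).
    """
    forces = [stiffness * max(d, 0.0) for d in pin_depths]
    pressures = [f / pin_area for f in forces]
    return pressures, sum(forces)
```

At a 1000 Hz update rate this computation must finish in under a millisecond, which is why the papers emphasize fast and stable force computation.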
Frontiers in Robotics and AI, 2018
This paper presents the development of a wearable Fingertip Haptic Device (FHD) that can provide cutaneous feedback via a Variable Compliance Platform (VCP). The FHD includes an inertial measurement unit, which tracks the motion of the user's finger, while its haptic functionality relies on two parameters: pressure in the VCP and its linear displacement towards the fingertip. The combination of these two features results in various conditions of the FHD, which emulate the stiffness properties of a remote object or surface. Such a device can be used in tele-operation, including virtual reality applications, where rendering the stiffness of different physical or virtual materials could provide a more realistic haptic perception to the user. The FHD stiffness representation is characterised in terms of the resulting pressure and force applied to the fingertip, created through the relationship of the two functional parameters: pressure and displacement of the VCP. The FHD was tested in a series of user studies to assess its potential to create a user perception of an object's variable stiffness. The viability of the FHD as a haptic device was further confirmed by interfacing the users with a virtual environment. The developed virtual-environment task required the users to follow a virtual path, identify objects of different hardness on the path and navigate away from "no-go" zones. The task was performed with and without the use of the variable compliance on the FHD. The results showed improved performance with the variable compliance provided by the FHD in all assessed categories, and particularly in the ability to correctly distinguish between objects of different hardness.
Journal of Information Processing Systems, 2012
This paper presents a dual modeling method that simulates the graphic and haptic behavior of a volumetric deformable object and conveys that behavior to a human operator. Although conventional modeling methods (a mass-spring model and a finite element method) are suitable for the real-time computation of an object's deformation, it is not easy to compute the haptic behavior of a volumetric deformable object with a conventional modeling method in real time (within the 1 kHz haptic update rate) due to the computational burden. Previously, we proposed a fast volume haptic rendering method based on the S-chain model that can compute the deformation of a volumetric non-rigid object and its haptic feedback in real time. When the S-chain model represents the object, the haptic feeling is realistic, whereas the graphical results of the deformed shape look linear. In order to improve the graphic and haptic behavior at the same time, we propose a dual modeling framework in which a volumetric haptic model and a surface graphical model coexist. In order to inspect the graphic and haptic behavior of objects represented by the proposed dual model, experiments are conducted with volumetric objects consisting of about 20,000 nodes at a haptic update rate of 1000 Hz and a graphic update rate of 30 Hz. We also conduct human-factor studies to show that the haptic and graphic behavior from our model is realistic. Our experiments verify that our model provides a realistic haptic and graphic feeling to users in real time.
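The dual-rate coupling described above, a 1000 Hz haptic loop alongside a 30 Hz graphic loop, can be sketched as one simulated clock driving two callbacks (a toy single-threaded scheduler for clarity, not the paper's implementation; real systems typically run the two loops in separate threads):

```python
def run_dual_rate(duration_s, haptic_step, graphic_step,
                  haptic_hz=1000, graphic_hz=30):
    """Call haptic_step at haptic_hz and graphic_step at graphic_hz,
    both driven from the fast (haptic) tick."""
    dt = 1.0 / haptic_hz
    frame_interval = haptic_hz / graphic_hz  # haptic ticks per graphic frame
    next_frame = 0.0
    ticks = int(duration_s * haptic_hz)
    for i in range(ticks):
        haptic_step(i * dt)                  # force computation, 1 kHz
        if i >= next_frame:
            graphic_step(i * dt)             # surface-mesh redraw, ~30 Hz
            next_frame += frame_interval
    return ticks
```

Decoupling the rates lets the expensive graphical model lag behind without ever starving the haptic loop.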
Frontiers in Robotics and AI, 2022
This paper explores methods that use visual cues to generate haptic sensations for the user, namely pseudo-haptics. We propose a new pseudo-haptic feedback-based method capable of conveying 3D haptic information and combining visual haptics with force feedback to enhance the user's haptic experience. We focused on an application related to tumor identification during palpation and evaluated the proposed method in an experimental study where users interacted with a haptic device and graphical interface while exploring a virtual model of soft tissue, which represented the stiffness distribution of a silicone phantom tissue with embedded hard inclusions. The performance of hard-inclusion detection using force feedback only, pseudo-haptic feedback only, and the combination of the two was compared with direct hand touch. The combination method and direct hand touch showed no significant difference in detection results. Compared with force feedback alone, our method increased sensitivity by 5%, increased positive predictive value by 4%, and decreased detection time by 48.7%. The proposed methodology has great potential for robot-assisted minimally invasive surgery and in all applications where remote haptic feedback is needed.
2020 American Control Conference (ACC), 2020
International journal of engineering research and technology, 2018
Haptic interfaces are designed to allow humans to touch virtual objects as if they were real. Unfortunately, virtual surface models currently require extensive hand tuning and do not feel authentic, which limits the usefulness and applicability of such systems. The proposed approach of haptography seeks to address this deficiency by basing models on haptic data recorded from real interactions between a human and a target object. The studio haptographer uses a fully instrumented stylus to tap, press and stroke an item in a controlled environment while a computer system records position, orientation, velocities, accelerations and forces. The point-and-touch haptographer carries a simple instrumented stylus around during daily life, using it to capture interesting haptic properties of items in the real world. Recorded data is distilled into a haptograph, the haptic impression of an object or surface patch, including properties such as local shape, stiffness, friction and texture. Finally the ...
This study presents a haptic simulator for learning palpation of the aorta in cardiovascular surgery and quantitatively evaluates its educational use through user studies. The simulator implements physics-based methods that enable VR simulation of soft organs with autonomous motion, such as a beating heart or a pulsating aorta. The developed model simulates real-time deformation and force feedback during surgical palpation based on finite element methods. The evaluation with cardiovascular surgeons and medical students confirms that the developed system is useful for learning not only surgical procedures but also the stiffness of in-vivo organs.
During open surgery, surgeons can perceive the locations of tumors inside soft-tissue organs using their fingers. Palpating an organ, surgeons acquire distributed pressure (tactile) information that can be interpreted as the stiffness distribution across the organ, an important aid in detecting buried tumors in otherwise healthy tissue. Previous research has focused on haptic systems that feed back the tactile sensation experienced during palpation to the surgeon during minimally invasive surgery. However, the control complexity and high cost of tactile actuators limit their current application. This paper describes a pneumatic multi-fingered haptic feedback system for robot-assisted minimally invasive surgery. It simulates soft-tissue stiffness by changing the pressure of an air balloon and recreates the deformation of the fingers as experienced during palpation. The pneumatic haptic feedback actuator is validated using finite element analysis. The results show that the interaction stress between the fingertip and the soft tissue, as well as the deformation of the fingertips during palpation, can be recreated using our pneumatic multi-fingered haptic feedback method.
Springer Tracts in Advanced Robotics
A method for real-time simulation of and interaction with deformable objects in medical simulators is proposed. We are interested in applications for training surgeons using haptic interaction. For haptic purposes, our medical simulator is based on a dual-model architecture: simulation and haptics. We currently use a new physical model, the LEM (Long Element Method), as the simulation model. We find that this model can produce satisfactory global changes for small and large deformations. In this paper, we focus on implementing a haptic interaction method with stable and realistic force feedback designed for use with the LEM. A deformable buffer model is used to solve problems arising from the difference between sampling and update rates. We look into the construction and updating process of this buffer model. Our approach to linking the two models to obtain realistic force feedback is also presented. The physical and haptic models are then coupled as part of a surgical simulator for soft tissue. We present some results from our prototype medical simulator for echography exams of the human thigh.
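One common way to realize such a buffer model between a slow deformable simulation and a fast haptic loop is a locally linearized contact model, sketched here under that assumption (the paper's actual buffer construction may differ; the refresh rate and numbers are illustrative):

```python
class BufferModel:
    """Local contact model refreshed by the slow deformable simulation.

    Between simulation updates the haptic loop evaluates a linearized
    force f = f0 + k * (x - x0), so force feedback stays smooth at
    1 kHz even though the deformable solver runs far slower.
    """
    def __init__(self):
        self.f0, self.k, self.x0 = 0.0, 0.0, 0.0

    def refresh(self, f0, k, x0):
        """Called at the simulation rate (e.g., ~100 Hz) with the
        current contact force, local stiffness and probe position."""
        self.f0, self.k, self.x0 = f0, k, x0

    def force(self, x):
        """Called at the haptic rate (1 kHz) with the probe position."""
        return self.f0 + self.k * (x - self.x0)
```

Because the haptic loop only ever evaluates the cheap linear model, the mismatch between sampling and update rates never causes force discontinuities larger than the linearization error.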
Actuators, 2022
The simulation of fabric physics and its interaction with the human body has been studied extensively in recent years to provide realistic-looking garments, particularly in the entertainment business. When the purpose of the simulation is to obtain scientific measurements and detailed mechanical properties of the interaction, the underlying physical models should be enhanced for better simulation accuracy, increasing the modeling complexity and relaxing the simulation timing constraints to properly solve the set of equations under analysis. However, in the specific field of haptic interaction, the desiderata are both physical consistency and a high frame rate, so as to display stable and coherent stimuli as feedback to the user, requiring a tradeoff between accuracy and real-time interaction. This work introduces a haptic system for evaluating the fabric hand of specific garments, either existing or yet to be produced, in a virtual reality simulation. The modeling is based...
2007
This paper presents a real-time physically based platform for multi-sensory interactive simulation. The platform is centered on high-quality dynamics requirements driven by the concept of instrumental interaction. It is oriented towards vis-à-vis human-object interactive simulation for a broad range of physical phenomena, with a specific focus on simulations with demanding dynamics requirements such as tool use, object manipulation, musical instrument playing, etc.
EuroHaptics'2008: 6th International Conference, 2008
In this paper we present an interactive dynamic simulator for virtual avatars. It allows creation and manipulation of objects in a collaborative way by virtual avatars or between virtual avatars and users. The users interact with the simulation environment using a haptic probe which provides force feedback. This dynamic simulator uses fast dynamics computation and constraint-based methods with friction. It is part of a general framework that is being devised for studies of collaborative scenarios with haptic feedback.
Signals and Systems in Biomedical Engineering: Physiological Systems Modeling and Signal Processing, 2019
That virtual reality is possible is an important fact about the fabric of reality. It is the basis not only of computation, but of human imagination and external experience, science and mathematics, art and fiction. -David Deutsch

The availability of cheap computing power makes computational models easily available to physiologists. Modern computers have not only good computational capabilities but also very good graphical displays, thereby making the output of models convenient for non-mathematical users. Graphical presentation itself uses visual analogy for physical behavior. Since modern computers are all discrete numerical machines, while physiological systems are fundamentally continuous, some approximations are required in order to use discrete modeling for continuous-time systems. The issue of discretization has been dealt with in some detail in Chap. 4, and a number of digital techniques for the analysis of signals and systems have been discussed in Chap. 5. In this chapter, we look at some geometric and animation techniques for representing physiological models. We also introduce haptics, which can impart a tactile component to the models. These computational models with graphical display, audio, and haptics make possible virtual experiments for physiological exploration.

6.1 Numerical Methods for Solving Equations

In the early days of computational models, differential equations were solved using analog computers. The analog computers were electronic circuits whose behavior mimicked that of the system being modeled. Modern computer models use digital computers to solve the system equations. Contemporary digital computers are
Proceedings of the second international workshop on Smart material interfaces: another step to a material future - SMI '13, 2013
We present the initial exploration of using ForceForm, a dynamically deformable interactive surface, for an application in the medical domain. ForceForm provides direct dynamic interaction which is soft and malleable. We are interested in pursuing its use as a training tool in medical scenarios which involve the direct interaction with human skin. As an example of this, we have developed a palpation training application. Previous work in this area uses haptic devices which do not have the soft and direct interaction exhibited by ForceForm. This workshop paper details our palpation application and a discussion of the findings of an expert user consultation involving a doctor and a massage therapist.
Proceedings of the 1995 symposium on Interactive 3D graphics - SI3D '95, 1995
Haptic rendering is the process of computing and generating forces in response to user interactions with virtual objects. Recent efforts by our team at MIT's AI laboratory have resulted in the development of haptic interface devices and algorithms for generating the forces of interaction with virtual objects. This paper focuses on the software techniques needed to generate sensations of contact interaction and material properties. In particular, the techniques we describe are appropriate for use with the Phantom haptic interface, a force generating display device developed in our laboratory. We also briefly describe a technique for representing and rendering the feel of arbitrary polyhedral shapes and address issues related to rendering the feel of non-homogeneous materials. A number of demonstrations of simple haptic tasks which combine our rendering techniques are also described.
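A minimal illustration of the core force computation in haptic rendering: a penalty force proportional to penetration depth along the surface normal. This is a simplification for a single plane (the paper's techniques for polyhedral shapes and non-homogeneous materials are richer); the stiffness value is an arbitrary assumption.

```python
def contact_force(probe_pos, plane_point, plane_normal, stiffness=800.0):
    """Penalty force for a point probe against a plane: f = -k * d * n
    when the probe penetrates (signed distance d < 0), zero otherwise.
    plane_normal is assumed to be unit length."""
    n = plane_normal
    # Signed distance from the probe to the plane along the normal.
    d = sum((p - q) * c for p, q, c in zip(probe_pos, plane_point, n))
    if d >= 0.0:                       # probe on or above the surface
        return (0.0, 0.0, 0.0)
    return tuple(-stiffness * d * c for c in n)
```

Varying the stiffness (or adding friction and texture terms) per surface region is one simple route to rendering non-homogeneous materials.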
The work in the Touch Lab (formal name: Laboratory for Human and Machine Haptics) is guided by a broad vision of haptics which includes all aspects of information acquisition and object manipulation through touch by humans, machines, or a combination of the two; and the environments can be real or virtual. We conduct research in multiple disciplines such as skin biomechanics, tactile neuroscience, human haptic perception, robot design and control, mathematical modeling and simulation, and software engineering for real-time human-computer interactions. These scientific and technological research areas converge in the context of specific application areas such as the development of virtual reality based simulators for training surgeons, haptic aids for people who are blind, real-time haptic interactions between people across the Internet, and direct control of machines from neural signals in the brain.
ACM SIGGRAPH 2005 Courses on - SIGGRAPH '05, 2005
For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. To date, the advances in computer graphics allow us to see virtual objects and avatars, to hear them, to move them, and to touch them. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments.