2013, Proceedings of the ACM Symposium on Applied Perception - SAP '13
https://doi.org/10.1145/2492494.2501882
7 pages
Mobile and wearable embedded devices connect the user with digital information in a continuous and pervasive way. The need for portability and unobtrusiveness limits the possibilities of user interaction with such devices, challenging the designer to exploit new input and output modalities. A key opportunity lies in multi-modal interaction that can dynamically engage different human senses, and in the cooperative capabilities of small, pervasive devices. In this scenario we present HapticLib, a software library for the development and implementation of vibro-tactile feedback on resource-constrained embedded devices. It was designed to offer a high-level programming interface for the rendering of haptic patterns, accurately modeling the nature of vibro-tactile actuators and different touch experiences.
Proceedings of the 2019 ACM Southeast Conference, 2019
Over the past decade, advances in force-feedback (haptic) systems have facilitated the inclusion of the tactile communication channel in a variety of user interfaces. Tactile sensors are distributed over the entire human body, so a diversity of haptic hardware configurations is possible. Applications range from force-feedback systems, which convey large forces, to vibrotactile systems, which convey smaller forces to the human sensory system. This paper provides a comprehensive survey of the state of the art in force-feedback and vibrotactile hardware, with references to associated software. The main application domains, several prominent applications, and significant research efforts are highlighted. Additionally, the survey defines the terms and paradigms used in the haptic technology domain. CCS CONCEPTS • Human-centered computing → Haptic devices • Hardware → Tactile and hand-based interfaces
2011
In the real world, touch based interaction relies on haptic feedback (e.g., grasping objects, feeling textures). Unfortunately, such feedback is absent in current tabletop systems. The previously developed Haptic Tabletop Puck (HTP) aims at supporting experimentation with and development of inexpensive tabletop haptic interfaces in a do-it-yourself fashion. The problem is that programming the HTP (and haptics in general) is difficult. To address this problem, we contribute the HAPTICTOUCH toolkit, which enables developers to rapidly prototype haptic tabletop applications. Our toolkit is structured in three layers that enable programmers to: (1) directly control the device, (2) create customized combinable haptic behaviors (e.g., softness, oscillation), and (3) use visuals (e.g., shapes, images, buttons) to quickly make use of these behaviors. In our preliminary exploration we found that programmers could use our toolkit to create haptic tabletop applications in a short amount of time.
The 34th Annual ACM Symposium on User Interface Software and Technology, 2021
Wearable vibrotactile devices have many potential applications, including sensory substitution for accessibility and notifications. Currently, vibrotactile experimentation is done using large lab setups. However, most practical applications require standalone on-body devices and integration into small form factors. Such integration is time-consuming and requires expertise. With a goal to democratize wearable haptics, we introduce VHP, a vibrotactile haptics platform. It includes a low-power miniature electronics board that can drive up to 12 independent channels of haptic signals with arbitrary waveforms at a 2 kHz sampling rate. The platform can drive vibrotactile actuators including linear resonant actuators and voice coils. The control hardware is battery-powered and programmable, and has multiple input options, including serial and Bluetooth, as well as the ability to synthesize haptic signals internally. We developed current-based load sensing, allowing for unique features such as actuator auto-classification and skin-contact quality sensing. Our technical evaluations showed that the system met all our initial design criteria and is an improvement over prior methods: it allows all-day wear, has low latency, has battery life between 3 and 25 hours, and can run 12 actuators simultaneously. We demonstrate unique applications that would be time-consuming to develop without the VHP platform. We show that VHP can be used in bracelet, sleeve, and phone-case form factors. The bracelet was programmed with an audio-to-tactile interface and was successfully worn for multiple days over months by developers. To facilitate more use of this platform, we open-source our design and plan to make the hardware widely available. We hope this work will motivate the use and study of vibrotactile all-day wearable devices. CCS CONCEPTS • Human-centered computing → Haptic devices. This work is licensed under a Creative Commons Attribution International 4.0 License.
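The abstract mentions an audio-to-tactile interface on the bracelet but does not specify its algorithm; a common approach is to track the short-time RMS envelope of the audio and use it as the vibration amplitude. A minimal sketch under that assumption (the function name and window sizing are illustrative; only the 2 kHz tactile rate comes from the abstract):

```python
import math

def audio_to_tactile_envelope(samples, audio_rate=16000, tactile_rate=2000):
    """Downsample an audio signal to a tactile amplitude envelope.

    Computes the RMS of non-overlapping audio windows, one window per
    tactile output sample. The 16 kHz audio rate is an assumption; the
    2 kHz tactile rate matches the VHP driver's sampling rate.
    """
    window = max(1, audio_rate // tactile_rate)  # audio samples per tactile sample
    envelope = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        rms = math.sqrt(sum(x * x for x in chunk) / len(chunk))
        envelope.append(min(1.0, rms))  # clamp to the actuator's [0, 1] drive range
    return envelope
```

A constant-amplitude input produces a flat envelope at the same level, while speech or music produces an envelope that rises and falls with loudness, which the actuator then reproduces as vibration strength.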
IEEE Transactions on Haptics, 2017
We present PhysVib: a software solution on the mobile platform extending an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine, using an exponentially-decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose, in terms of perceptual quality, than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods in terms of perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
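The exponentially-decaying sinusoidal model named in the abstract has the classic form a(t) = A·e^(−Bt)·sin(2πft). A minimal sketch of sampling such a waveform (all parameter values below are illustrative; in PhysVib they would be derived from the physics engine's collision results):

```python
import math

def decaying_sinusoid(amplitude, decay, freq_hz, duration_s, rate_hz=2000):
    """Sample a(t) = A * exp(-B*t) * sin(2*pi*f*t), the exponentially-
    decaying sinusoid used to render a collision transient.

    amplitude, decay, and freq_hz are assumed values here; a physics-
    driven renderer would map collision impulse and material properties
    onto them.
    """
    n = int(duration_s * rate_hz)
    return [amplitude * math.exp(-decay * t / rate_hz)
            * math.sin(2 * math.pi * freq_hz * t / rate_hz)
            for t in range(n)]
```

The decay constant controls how quickly the buzz dies out, which is what lets a single model family render both hard, clicky impacts (fast decay, high frequency) and softer, duller ones (slow decay, low frequency).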
Journal of Ambient Intelligence and Smart Environments, 2009
Haptic interaction has for a long time been a promise that has not fully been realized in everyday technology due to several reasons. Already for more than 20 years the research community in the field of human-technology interaction has identified multimodal interaction as a potential next mainstream interaction paradigm to replace graphical user interfaces. At the same time, both personal computers and mobile devices have developed rapidly allowing more computing power, more sophisticated feedback through different channels such as display and audio, and more ways of interaction to be used in everyday computing tasks. Within the past few years, haptic interaction has been under rapid research and development. In this article, we will give an introduction to the present state of the art in haptic interaction technology and its promises in mainstream information and communication technology.
Human Computer Interaction, 2008
Touchscreens have been invading mobile devices all over the world, replacing traditional user interfaces. Unlike mechanical buttons, however, a touchscreen gives the user no physical or mechanical feedback when it is pressed or when an event occurs. Haptic technology is the future of the touch interface, allowing the user not only to touch the screen but also to feel texture on their fingertips. In the proposed project, various piezo actuators were incorporated into mobile devices to generate more delicate haptic sensations on the touch panel. The piezo actuator generates haptic sensations for various UI (user interface) tools and triggered touch events. This paper describes the design and development of haptic solutions for touch-sensitive mobile devices at Motorola Solutions, involving the MPA 2.0 BigBoard, a piezo actuator, and DRV8662EVM haptic driver modules.
12th International Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2004. HAPTICS '04. Proceedings., 2004
This paper describes the development of a generic framework for implementing realistic cross-platform haptic virtual reality applications, using Sensable's PHANToM haptic device. Here the term cross-platform refers to two of the most popular operating systems: Windows and Linux. Currently available free-for-noncommercial-use Software Development Kits (SDKs) deal with a single Haptic Interaction Point (HIP), i.e. the tip of the haptic probe. However, many applications such as path planning, virtual assembly, medical or dental simulations, as well as scientific exploration require object-object interactions, meaning any part of the complex 3D object attached to the probe collides with the other objects in the virtual scene. Collision detections and penetration depths between 3D objects must be quickly computed to generate forces to be displayed by the haptic device. In these circumstances, implementation of haptic applications is very challenging when the number, stiffness, and/or complexity of objects in the scene are considerable, mainly because of the high update rates needed to avoid instabilities of the system. The proposed framework meets this high requirement and provides a high-performance test bed for further research in algorithms for collision detection and generation of haptic forces.
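The step from penetration depth to displayed force is commonly implemented as a penalty method: the deeper the probe penetrates an object, the larger the restoring spring force along the contact normal. A sketch of that standard model (the stiffness value is an assumption, not taken from the paper):

```python
def penalty_force(penetration_depth, contact_normal, stiffness=500.0):
    """Penalty-based haptic force: F = k * d * n_hat, the standard
    spring model that turns a penetration depth into a restoring force.

    stiffness (N/m) is an assumed value; in practice it is tuned per
    device and material, and the loop runs at roughly 1 kHz to keep the
    rendering stable.
    """
    if penetration_depth <= 0.0:          # no contact, no force
        return (0.0, 0.0, 0.0)
    return tuple(stiffness * penetration_depth * c for c in contact_normal)
```

This is also why the update rate matters: if collision detection cannot deliver a fresh penetration depth every millisecond, the spring force lags the probe motion and the device can start to oscillate.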
2004
We demonstrate a 30-element vibrotactile array that fits the palm of a large-handed user. The array is driven by input to a touchpad, thereby allowing one user to haptically "draw" on a remote user's hand. Pulse-width modulation is used to control tactor intensity, and the multiple intensity levels are used by an anti-aliasing procedure that allows the array to represent
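The combination of PWM intensity control and anti-aliasing described above can be sketched as follows: a continuous drawing position is split linearly between its two nearest tactors, and each weight is quantized to a PWM duty level. The row length and level count below are illustrative assumptions (the actual array has 30 elements):

```python
def antialias_point(pos, num_tactors=6, levels=16):
    """Split a continuous position in [0, num_tactors - 1] between its
    two neighbouring tactors (linear anti-aliasing), then quantize each
    weight to one of `levels` PWM duty-cycle steps.

    The 6-tactor row and 16 intensity levels are assumptions for
    illustration; the paper's array uses PWM to realize multiple
    intensity levels per tactor.
    """
    duties = [0] * num_tactors
    lo = min(int(pos), num_tactors - 2)   # left neighbour index
    frac = pos - lo                        # weight toward the right neighbour
    duties[lo] = round((1.0 - frac) * (levels - 1))
    duties[lo + 1] = round(frac * (levels - 1))
    return duties
```

A position exactly on a tactor drives only that tactor at full duty; a position halfway between two tactors drives both at half duty, so a stroke drawn on the touchpad appears to glide smoothly across the hand rather than jumping from tactor to tactor.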
2004
This paper presents work we have done on the design and implementation of an untethered system to deliver haptic cues for use in immersive virtual environments through a body-worn garment. Our system can control a large number of body-worn vibration units, each with individually controllable vibration intensity. Several design iterations have helped us to refine the system and improve such aspects as robustness, ease of donning and doffing, weight, power consumption, cable management, and support for many different types of feedback units, such as pager motors, solenoids, and muffin fans. In addition, experience integrating the system into an advanced virtual reality system has helped define some of the design constraints for creating wearable solutions, and to further refine our implementation.
While a standard approach is more or less established for rendering basic vibratory cues in consumer electronics, the implementation of advanced vibrotactile feedback still requires designers and engineers to solve a number of technical issues. Several off-the-shelf vibration actuators are currently available, having different characteristics and limitations that should be considered in the design process. We suggest an iterative approach to design in which vibrotactile interfaces are validated by testing their accuracy in rendering vibratory cues and in measuring input gestures. Several examples of prototype interfaces yielding audio-haptic feedback are described, ranging from open-ended devices to musical interfaces, addressing their design and the characterization of their vibratory output.
Smartwatches now allow information to be conveniently accessed directly from the user’s wrist. However, the smartwatches currently available in the market offer a limited number of applications. In this paper, we propose a new interaction technique named Harmonious Haptics, which provides users with enhanced tactile sensations by utilizing smartwatches as additional tactile displays for smartphones. When combined with typical mobile devices, our technique enables the design of a wide variety of tactile stimuli. To illustrate the potential of our approach, we developed a set of example applications that provide users with rich tactile feedback such as feeling textures in a graphical user interface, transferring a file between the tablet and the smartwatch device, and controlling UI components.
Lecture Notes in Electrical Engineering, 2017
This demo presents a mobile application using PhysVib: a software solution on the mobile platform extending an open-source physics engine for automatic vibrotactile feedback upon collision events in a multi-rate rendering architecture. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate based on the simulation results of the physics engine using an exponentially-decaying sinusoidal model. We demonstrate an application showing three wall-object pairs with different material properties, and a user interacts with internal objects to feel vibrotactile feedback from collision events.
Proceedings of the SIGCHI conference on Human Factors in computing systems - CHI '06, 2006
Mobile interaction can potentially be enhanced with well-designed haptic control and display. However, advances have been limited by a vicious cycle whereby inadequate haptic technology obstructs inception of vitalizing applications. We present the first stages of a systematic design effort to break that cycle, beginning with specific usage scenarios and a new handheld display platform based on lateral skin stretch. Results of a perceptual device characterization inform mappings between device capabilities and specific roles in mobile interaction, and the next step of hardware re-engineering.
Communications of the ACM, 2011
After more than 20 years of research and development, are haptic interfaces finally getting ready to enter the computing mainstream? Ever since the first silent-mode cell phones started buzzing in our pockets a few years ago, many of us have unwittingly developed a fumbling familiarity with haptics: technology that invokes our sense of touch. Video games now routinely employ force-feedback joysticks to jolt their players with a sense of impending onscreen doom, while more sophisticated haptic devices have helped doctors conduct surgeries from afar, allowed deskbound soldiers to operate robots in hazardous environments, and equipped musicians with virtual violins. Despite recent technological advances, haptic interfaces have made only modest inroads into the mass consumer market. Buzzing cell phones and shaking joysticks aside, developers have yet to create a breakthrough product: a device that would do for haptics what the iPhone has done for touch screens. The slow pace of market acceptance stems partly from typical new-technology growing pains: high production costs, the lack of standard application programming interfaces (APIs), and the absence of established user interface conventions. Those issues aside, however, a bigger question looms over this fledgling industry: What are haptics good for, exactly? Computer scientists have been exploring haptics for more than two decades. Early research focused largely on the problem of sensory substitution, converting imagery or speech information into electric or vibratory stimulation patterns on the skin. As the technology matured, haptics found new applications in teleoperator systems and virtual environments, useful for robotics and flight simulator applications. Today, some researchers think the big promise of haptics may involve
This chapter sets about to provide the background and orientation needed to set a novice designer on his or her way to bringing haptics successfully into an interactive product. To define appropriate roles for haptic interaction, it is necessary to integrate a basic awareness of human capabilities on one hand and current device technology on the other. Here, I explore this integration by first summarizing the most salient constraints imposed by both humans and hardware. I then proceed to relate perceptual, motor, and attentional capabilities to a selection of emerging application contexts chosen to be relevant to contemporary design trends and opportunities. These include abstract communication and notification, augmentation of graphical user interfaces, expressive control, affective communication, and mobile and handheld computing. Our touch (haptic) sense is such an integral part of our everyday experience that few of us really notice it. Notice it now, as you go about your business. Within and beneath our skin lie layers of ingenious and diverse tactile receptors comprising our tactile sensing subsystem. These receptors enable us to parse textures, assess temperature and material, guide dexterous manipulations, find a page's edge to turn it, and deduce a friend's mood from a touch of his hand. Intermingled with our muscle fibers and within our joints are load cells and position transducers making up our proprioceptive sense, which tell our nervous systems of a limb's position and motion and the resistance it encounters. Without these and their close integration with our body's motor control, it would be exceedingly difficult to break an egg neatly into a bowl, play a piano, walk without tripping, stroke a pet, write, draw, or even type. Touch is our earliest sense to develop (Montagu, 1986).
It has evolved to work in a tight partnership with vision and hearing in many ways we are only beginning to understand, as we study processes (such as hand-eye coordination) and how we process conflicting or competing information from different senses. In stark contrast to the importance of touch in our everyday experience, the use of touch is marginalized in contemporary computer interfaces, overlooked in the rush to accommodate graphical capability in desktop-based systems. The primary advances have been in feel-focused improvements in nonactuated pointing tools for both function and aesthetics. Scroll wheels have been designed for the user to click with just the right resistance and frequency; and most cell phones now come with vibrators that indicate incoming calls. Meanwhile, the use of haptic feedback in the consumer sphere is largely limited to gaming, and tactile feedback to simple cell phone alerts.
We propose HoliBraille, a system that enables Braille input and output on current mobile devices. We use vibrotactile motors combined with dampening materials in order to actuate directly on users' fingers. The prototype can be attached to current capacitive touchscreen devices enabling multipoint and localized feedback. HoliBraille can be leveraged in several applications including educational tools for learning Braille, as a communication device for deaf-blind people, and as a tactile feedback system for multitouch Braille input. We conducted a user study with 12 blind participants on Braille character discrimination. Results show that HoliBraille is effective in providing localized feedback; however, character discrimination performance is strongly related with number of simultaneous stimuli. We finish by discussing the obtained results and propose future research avenues to improve multipoint vibrotactile perception.
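The mapping from Braille characters to localized vibrotactile stimuli can be illustrated with the standard 6-dot Braille cell encoding; the motor wiring and 0-based indexing below are assumptions for illustration, not HoliBraille's actual implementation:

```python
# Raised dots of the standard 6-dot Braille cell for each letter
# (dot numbering: 1-2-3 down the left column, 4-5-6 down the right).
# Only the first ten letters are shown.
BRAILLE_DOTS = {
    'a': {1}, 'b': {1, 2}, 'c': {1, 4}, 'd': {1, 4, 5}, 'e': {1, 5},
    'f': {1, 2, 4}, 'g': {1, 2, 4, 5}, 'h': {1, 2, 5},
    'i': {2, 4}, 'j': {2, 4, 5},
}

def motors_for(letter):
    """Return the 0-based vibromotor indices to pulse for a letter,
    assuming one motor per Braille dot (hypothetical wiring)."""
    return sorted(d - 1 for d in BRAILLE_DOTS[letter.lower()])
```

With one motor per finger position, pulsing the motors for a character simultaneously is what creates the multipoint stimuli whose discriminability the paper's user study measures.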
We present HapTable: a multi-modal interactive tabletop that allows users to interact with digital images and objects through natural touch gestures, and receive visual and haptic feedback accordingly. In our system, hand pose is registered by an infrared camera and hand gestures are classified using a Support Vector Machine (SVM) classifier. To display a rich set of haptic effects for both static and dynamic gestures, we effectively integrated electromechanical and electrostatic actuation techniques on the tabletop surface of HapTable, which is a surface capacitive touch screen. We attached four piezo patches to the edges of the tabletop to display vibrotactile feedback for static gestures. For this purpose, the vibration response of the tabletop, in the form of frequency response functions (FRFs), was obtained by a laser Doppler vibrometer for 84 grid points on its surface. Using these FRFs, it is possible to display localized vibrotactile feedback on the surface for static gestures. For dynamic gestures, we utilize the electrostatic actuation technique to modulate the frictional forces between finger skin and tabletop surface by applying voltage to its conductive layer. To our knowledge, this hybrid haptic technology is one of a kind and has not been implemented or tested on a tabletop. It opens up new avenues for gesture-based haptic interaction not only on tabletop surfaces but also on touch surfaces used in mobile devices, with potential applications in data visualization, user interfaces, games, entertainment, and education. Here, we present two examples of such applications, one for static and one for dynamic gestures, along with detailed user studies. In the first one, the user detects the direction of a virtual flow, such as that of wind or water, by putting their hand on the tabletop surface and feeling a vibrotactile stimulus traveling underneath it.
In the second example, the user rotates a virtual knob on the tabletop surface to select an item from a menu while feeling the knob's detents and resistance to rotation in the form of frictional haptic feedback.
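Rendering a knob's detents as frictional feedback can be sketched by modulating a friction level with knob angle so that it peaks at each detent and drops between them. All numeric values below are illustrative, not taken from the HapTable paper:

```python
import math

def detent_friction(angle_rad, num_detents=8, base=0.2, depth=0.8):
    """Friction level in [0, 1] for a virtual knob with evenly spaced
    detents: highest friction at each detent, lowest halfway between.

    num_detents, base, and depth are assumed values; an electrostatic
    display would map this level onto the voltage applied to the
    surface's conductive layer.
    """
    ripple = 0.5 * (1.0 + math.cos(num_detents * angle_rad))  # 1 at detents
    return min(1.0, base + depth * ripple)
```

As the finger rotates the virtual knob, sweeping this profile under it produces the alternating grip-and-release sensation of passing over mechanical detents.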
2012 International Conference on Cyberworlds, 2012
The simulation of tactile sensation using haptic devices is increasingly investigated in conjunction with simulation and training. In this paper we explore the most popular haptic frameworks and APIs. We provide a comprehensive review and comparison of their features and capabilities, from the perspective of the need to develop a haptic simulator for medical training purposes. In order to compare the studied frameworks and APIs, we identified and applied a set of 11 criteria and we obtained a classification of platforms, from the perspective of our project. According to this classification, we used the best platform to develop a visuohaptic prototype for liver diagnostics.
NIME 2022
ForceHost is an open-source toolchain for generating firmware that hosts authoring and rendering of force-feedback and audio signals and that communicates through I2C with guest motor and sensor boards. With ForceHost, the stability of audio and haptic loops is no longer delegated to and dependent on operating systems and drivers, and devices remain discoverable beyond planned obsolescence. We modified Faust, a high-level language and compiler for real-time audio digital signal processing, to support haptics. Our toolchain compiles audio-haptic firmware applications with Faust and embeds web-based UIs exposing their parameters. We validate our toolchain by example applications and modifications of integrated development environments: script-based programming examples of haptic firmware applications with our haptic1D Faust library, visual programming by mapping input and output signals between audio and haptic devices in Webmapper, and visual programming with physically-inspired mass-interaction models in Synth-a-Modeler Designer. We distribute the documentation and source code of ForceHost and all of its components and forks.