2001
This paper argues for the joint development of an eye gaze-based, on-line communication aid running on a standard PC with a web camera. Tracking software is to be provided as open source to allow for improvements and individual integration with other aids. The interface design shall be defined by the achieved resolution of the tracking system. The design of a type-to-talk system with 12 large on-screen keys is described in the paper. In order for gaze tracking systems to become widely used, the pursuit of mouse-pointer precision should be replaced by a focus on the broad potential of low-resolution gaze-based interactive systems.
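As a minimal sketch of the low-resolution idea (the grid geometry here is assumed for illustration, not taken from the paper), a coarse gaze estimate only has to resolve one of 12 large cells, rather than a pixel-precise pointer position:

```python
# Minimal sketch (not the paper's code): snap a low-resolution gaze
# estimate onto a 4x3 grid of large on-screen keys, as a 12-key
# type-to-talk layout might. Coordinates are normalized to [0, 1).

GRID_COLS, GRID_ROWS = 4, 3  # 12 keys total

def gaze_to_key(x: float, y: float) -> int:
    """Map a normalized gaze point to a key index in [0, 12)."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return row * GRID_COLS + col

# Even a tracker with only ~100-pixel accuracy on a 1280x960 screen
# resolves these cells reliably, since each cell spans 320x320 pixels.
print(gaze_to_key(0.9, 0.1))  # -> 3 (top-right key)
```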
Procedia Technology, 2014
This work describes an eye tracking system for a natural user interface based only on non-intrusive devices such as a simple webcam. Through image processing, the system converts the user's focus of attention into the corresponding point on the screen. Experimental tests were performed in which users were shown a set of known points on the screen; these tests show that the application yields promising results.
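The abstract does not publish the pipeline, but a plausible webcam-only front end (a sketch using OpenCV's stock Haar cascade; the threshold value is an assumption) detects the eye region and locates the dark pupil as a thresholded centroid:

```python
# Sketch of one plausible webcam pipeline: find the eye region with
# OpenCV's bundled Haar cascade, then take the dark pupil as the
# centroid of an inverse-thresholded blob inside that region.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def pupil_center(frame_bgr):
    """Return the (x, y) pupil centroid in frame coordinates, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, 1.1, 5)
    if len(eyes) == 0:
        return None
    x, y, w, h = eyes[0]
    roi = gray[y:y + h, x:x + w]
    # The pupil is the darkest blob; invert-threshold and take the centroid.
    _, mask = cv2.threshold(roi, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (x + m["m10"] / m["m00"], y + m["m01"] / m["m00"])
```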
2001
Eye gaze interaction can provide a convenient and natural addition to user-computer dialogues. We have previously reported on our interaction techniques using eye gaze. In this paper, we assess the effectiveness of our approach in a controlled setting. We present two experiments that compare our eye gaze object selection technique with conventional selection by mouse. The results show that, for a simple task, it takes 60% less time to select an object with our eye gaze technique than with a mouse. We also use Fitts' Law to investigate the speed and quality differences between eye gaze selection and mouse selection. Our eye gaze selection technique is shown to produce a low slope, more like pure eye movement, which suggests that the technique preserves the inherent speed advantage of the eye over the hand. We find that eye gaze interaction is at least as fast as the mouse, and it is convenient in situations where it is important to use the hands for other tasks. It is particularly beneficial for the larger screens, workspaces, and virtual environments of the future, and will become increasingly practical as eye tracker technology matures.
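The Fitts' Law comparison above reduces to fitting MT = a + b·log2(D/W + 1) per device and comparing the slopes b; here is a sketch with made-up trial data (not the study's measurements):

```python
# Sketch of a Fitts' Law analysis: fit MT = a + b * log2(D/W + 1)
# by least squares and compare slopes (b) across input devices.
# The trial data below is invented purely for illustration.
import math
import statistics

def index_of_difficulty(distance: float, width: float) -> float:
    return math.log2(distance / width + 1)

def fit_fitts(trials):
    """Least-squares fit of (ID, MT) pairs; returns (intercept a, slope b)."""
    ids = [index_of_difficulty(d, w) for d, w, _ in trials]
    mts = [mt for _, _, mt in trials]
    b = statistics.covariance(ids, mts) / statistics.variance(ids)
    a = statistics.mean(mts) - b * statistics.mean(ids)
    return a, b

# (distance px, target width px, movement time s) per trial -- illustrative.
gaze_trials = [(400, 64, 0.45), (800, 64, 0.50), (800, 32, 0.58)]
a, b = fit_fitts(gaze_trials)
print(f"intercept={a:.3f}s slope={b:.3f}s/bit")  # low slope ~ eye-like
```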
Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA '09, 2009
Eye movements are the only means of communication for some severely disabled people. However, the high prices of commercial eye tracking systems limit the access to this technology. In this pilot study we compare the performance of a low-cost, webcam-based gaze tracker that we have developed with two commercial trackers in two different tasks: target acquisition and eye typing. From analyses on throughput, words per minute and error rates we conclude that a low-cost solution can be as efficient as expensive commercial systems.
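For context, the throughput/WPM/error-rate comparison rests on standard text-entry metrics; the sketch below uses the common 5-characters-per-word convention and a Levenshtein-based error rate, which are conventional choices rather than necessarily the study's exact definitions:

```python
# Sketch of standard text-entry metrics. WPM uses the convention of
# 5 characters per "word"; the error rate is the minimum string
# distance divided by the longer string's length.

def words_per_minute(transcribed: str, seconds: float) -> float:
    return (len(transcribed) / 5) / (seconds / 60)

def error_rate(presented: str, transcribed: str) -> float:
    """Levenshtein distance between the strings, normalized to [0, 1]."""
    m, n = len(presented), len(transcribed)
    d = [[i + j if i * j == 0 else 0 for j in range(n + 1)]
         for i in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            d[i][j] = min(
                d[i - 1][j] + 1,      # deletion
                d[i][j - 1] + 1,      # insertion
                d[i - 1][j - 1] + (presented[i - 1] != transcribed[j - 1]))
    return d[m][n] / max(m, n)

print(words_per_minute("the quick brown fox", 60.0))  # -> 3.8 WPM
print(error_rate("hello world", "helo world"))        # -> ~0.09
```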
Procedia Technology, 2012
The human gaze is a basic means of non-verbal interaction. However, in several situations, especially in the context of upper-limb motor impairment, gaze also represents an alternative means for human interaction with the environment (real or virtual). This interaction can be mastered through specific tools and newly learned skills; the technological tool is therefore a key to new interaction models. This paper presents a tool for gaze interaction: a new gaze tracker. The system specifications and the status of the gaze tracker design are presented; the dedicated algorithm for eye detection and tracking, as well as an improvement of Zelinsky's model for eye movement prediction during the search for a predefined object in an image, are outlined. Results of the first evaluation of the pre-prototype with end users are summarized.
2011
The development of cheaper eye trackers and open source software for eye tracking and gaze interaction brings the possibility of integrating eye tracking into everyday devices as well as highly specialized equipment. Apart from providing means for analyzing eye movements, eye tracking also offers a natural user interaction modality. Gaze control interfaces are already used within assistive applications for disabled users. However, this novel interaction possibility comes with its own set of limitations and challenges. The aim of this SIG is to provide a forum for designers, researchers, and usability professionals to discuss the future role of eye tracking as a user interaction method, as well as the technical and interaction challenges it brings.
2007
The eyes are a rich source of information for gathering context in our everyday lives. A user's gaze is postulated to be the best proxy for attention or intention. Using gaze information as a form of input can enable a computer system to gain more contextual information about the user's task, which in turn can be leveraged to design interfaces which are more intuitive and intelligent. Eye gaze tracking as a form of input was primarily developed for users who are unable to make normal use of a keyboard and pointing device. However, with the increasing accuracy and decreasing cost of eye gaze tracking systems it will soon be practical for able-bodied users to use gaze as a form of input in addition to keyboard and mouse.
Erica is a computer workstation with a unique user interface. The workstation is equipped with imaging hardware and software, which automatically record a digital portrait of the user's eye. From the features of the current portrait, the interface calculates the approximate location of the user's eye-gaze on the computer screen. The computer then executes commands associated with the menu option currently displayed at this screen location. In this way, the user can interact with the computer, run applications software, and manage peripheral devices, all simply by looking at an appropriate sequence of menu options displayed on the screen. The eye-gaze interface technology, its implementation in Erica, and its application as a prosthetic device are described.
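A minimal sketch of the look-to-select loop such an interface implies (the dwell threshold and the helper functions are assumptions for illustration, not Erica's actual implementation):

```python
# Sketch of a dwell-based menu loop: map the gaze point to the menu
# option under it and trigger that option after a dwell timeout.
import time

DWELL_SECONDS = 0.7  # assumed threshold; real systems tune this per user

def run_dwell_loop(get_gaze, hit_test, execute):
    """get_gaze() -> (x, y); hit_test(x, y) -> option id or None;
    execute(option) runs the command bound to a menu option."""
    current, since = None, None
    while True:
        x, y = get_gaze()
        option = hit_test(x, y)
        if option != current:
            current, since = option, time.monotonic()
        elif option is not None and time.monotonic() - since >= DWELL_SECONDS:
            execute(option)
            current, since = None, None  # reset so one look = one command
        time.sleep(1 / 30)  # poll at roughly the camera frame rate
```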
Proceedings of the …, 2008
Using gaze information as a form of input poses challenges based on the nature of eye movements and how we humans use our eyes in conjunction with other motor actions. In this paper, we present three techniques for improving the use of gaze as a form of input. We first present a saccade detection and smoothing algorithm that works on real-time streaming gaze information. We then present a study which explores some of the timing issues of using gaze in conjunction with a trigger (key press or other motor action) and propose a solution for resolving these issues. Finally, we present the concept of Focus Points, which makes it easier for users to focus their gaze when using gaze-based interaction techniques. Though these techniques were developed for improving the performance of gaze-based pointing, their use is applicable in general to using gaze as a practical form of input.
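As an illustration of the kind of streaming saccade detection and smoothing described above, here is a velocity-threshold (I-VT style) sketch; the authors' actual thresholds and filter are not reproduced:

```python
# Sketch of a velocity-threshold saccade detector with fixation
# smoothing on a real-time gaze stream. Units: degrees and seconds.

SACCADE_DEG_PER_S = 130.0  # assumed velocity threshold

class GazeSmoother:
    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha      # exponential smoothing factor
        self.x = self.y = None  # smoothed fixation estimate
        self.prev = None        # (t, x, y) of the last raw sample

    def feed(self, t, x, y):
        """Return the smoothed point; jump immediately on a saccade."""
        if self.prev is not None:
            pt, px, py = self.prev
            velocity = ((x - px) ** 2 + (y - py) ** 2) ** 0.5 / (t - pt)
            if velocity > SACCADE_DEG_PER_S:
                self.x, self.y = x, y  # saccade: reset, do not smooth
            else:
                self.x += self.alpha * (x - self.x)  # fixation: damp jitter
                self.y += self.alpha * (y - self.y)
        else:
            self.x, self.y = x, y
        self.prev = (t, x, y)
        return self.x, self.y
```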
2003
We introduce a novel gaze-tracking system called FreeGaze, which is designed to support gaze-based human-computer interaction (HCI). Existing gaze-tracking systems require complicated and burdensome calibration, which prevents gaze being used to operate computers. To simplify the calibration process, FreeGaze corrects for refraction at the surface of the cornea. Unlike existing systems, this calibration procedure requires each user to look at only two points on the display. After the initial calibration, our system needs no further calibration for later measurement sessions. A user study shows that its gaze detection accuracy is about 1.06° (view angle), which is sufficient for gaze-based HCI.
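To see why two points can suffice once the optics are modeled, note that what remains is a per-axis linear map (gain and offset), which two correspondences pin down exactly; this sketch illustrates the principle only, not FreeGaze's code:

```python
# Sketch of two-point calibration: with refraction handled by the
# eye model, the residual mapping is linear per axis, so two known
# (raw, screen) correspondences determine it exactly.

def two_point_calibration(raw1, screen1, raw2, screen2):
    """Return f(raw) -> screen from two (raw, screen) correspondences."""
    gains, offsets = [], []
    for axis in (0, 1):
        g = (screen2[axis] - screen1[axis]) / (raw2[axis] - raw1[axis])
        gains.append(g)
        offsets.append(screen1[axis] - g * raw1[axis])
    return lambda raw: tuple(g * r + o
                             for g, r, o in zip(gains, raw, offsets))

to_screen = two_point_calibration((0.21, 0.34), (100, 100),
                                  (0.78, 0.71), (1180, 620))
print(to_screen((0.50, 0.52)))  # interpolated screen point (~650, ~353)
```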
arXiv: Human-Computer Interaction, 2019
Outline: 1. Motivation; 2. Eye tracking and gaze interaction; 3. Explanation of design choices; 4. Gaze Interaction Markup Language (resources; regions and their states; scene navigation; varying the stimulus: lists and randomization; events); 5. The GIML interpreter; 6. Evaluation (usability study procedure; results and discussion); 7. Examples of use (cognitive experiments on language acquisition in infants; communication boards); 8. Summary and future plans.
From Section 2, Eye tracking and gaze interaction: The common saying "eyes are the only moveable part of the brain" clearly expresses the idea of gaze interaction as a method of communication used both in natural social interactions, starting from infant-parent communication (Fogel, Nwokah, Hsu, Dedo, & Walker, 1993), and by people with limited capacity for common communication methods, e.g. those with physical disabilities or cerebral palsy (Galante & Menezes, 2012), as well as young children (Borgestig et al., 2017). We often communicate our intentions using eye movements, e.g. by gazing at the person we want to talk to in a group (deixis) (Smith, Vertegaal, & Sohn, 2005). We look before we act, sometimes without conscious thought, and our behavior is also guided by gaze (Hayhoe & Ballard, 2014). Various low-tech systems are available for people with communication difficulties (see Zupan & Jenko, 2012), e.g. communication boards (Zawadzka & Murawski, 2015). Computer-based gaze interaction systems, which use eye tracking technology, are the main tools of augmentative and alternative communication (AAC) technology (Clarke & Bloch, 2013) and allow patients to decide autonomously when and how to express their needs and to begin to communicate with others (for a review see Nerișanu, Nerișanu, Maniu, & Neamțu, 2017). People can write messages with their eyes using a number of available applications (…)
Journal of Machine Vision and Applications, 2019
As modern assistive technology advances, eye-based text entry systems have been developed to help a subset of physically challenged people improve their communication ability. However, the speed of text entry in early eye-typing systems tends to be relatively slow due to dwell time. Recently, dwell-free methods have been proposed which outperform dwell-based systems in terms of speed and resilience, but an extra eye-tracking device is still indispensable. In this article, we propose a prototype eye-typing system using an off-the-shelf webcam without an extra eye tracker, in which an appearance-based method is proposed to estimate people's gaze coordinates on the screen from the frontal face images captured by the webcam. We also investigate some critical issues of the appearance-based method, which helps to improve estimation accuracy and reduce computing complexity in practice. The performance evaluation shows that eye typing with a webcam using the proposed method is comparable to an eye tracker under a small degree of head movement.
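Appearance-based estimation in this spirit regresses screen coordinates directly from normalized eye-image pixels; below is a sketch using closed-form ridge regression, which is one standard estimator and not necessarily the paper's:

```python
# Sketch of appearance-based gaze estimation: treat cropped, normalized
# eye images as feature vectors and ridge-regress screen coordinates.
import numpy as np

def train_gaze_regressor(eye_images, screen_points, lam=1e-3):
    """eye_images: (n, h, w) grayscale crops; screen_points: (n, 2).
    Returns predict(crop) -> (x, y) on screen."""
    X = eye_images.reshape(len(eye_images), -1).astype(np.float64)
    mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-8
    Xn = np.hstack([(X - mu) / sd, np.ones((len(X), 1))])  # add bias column
    # Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y
    W = np.linalg.solve(Xn.T @ Xn + lam * np.eye(Xn.shape[1]),
                        Xn.T @ np.asarray(screen_points, dtype=np.float64))

    def predict(crop):
        v = (crop.reshape(-1).astype(np.float64) - mu) / sd
        return np.append(v, 1.0) @ W

    return predict
```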
2015
With the rapid evolution of computer technology, there is a growing need to dedicate attention to computer-aided interaction, including the crucial aspects of designing, implementing, and evaluating the interfaces that provide this type of communication. Various techniques for human-computer interaction have been used, commencing with keyboards and printers and moving on to gesture interaction, speech interaction, touch screens, eye gaze tracking, and many more. Most of these techniques are still being analyzed and examined to determine whether they can ease the performance of given tasks, such as moving the mouse cursor, selecting menus, or moving and dragging objects on the computer screen, thus helping users with impairments to interact with the workstation. This paper describes a usability study of an existing eye-tracking system, evaluates its correctness, and calculates its error percentage, which can lead us to a conclusion as to whether this system presents an efficient interaction and provides facili...
2010
This paper presents a low-cost gaze tracking system that is based on a webcam mounted close to the user's eye. The performance of the gaze tracker was evaluated in an eye-typing task using two different typing applications. Participants could type between 3.56 and 6.78 words per minute, depending on the typing system used. A pilot study to assess the usability of the system was also carried out in the home of a user with severe motor impairments. The user successfully typed on a wall-projected interface using his eye movements.
3rd International Conference on Human System Interaction, 2010
We are currently witnessing increasing attention to issues related to accessibility, which should eliminate, or at least reduce, the distance between disabled people and technology. However, particularly for severely impaired persons, there are still many challenges that must be overcome. In this paper we present eye tracking as a valuable support for disabled people in the accomplishment of hands-free tasks. Moreover, we stress the potential of eye-based interfaces to enhance the user-machine interaction process in "traditional" activities based on keyboard and mouse. Through the description of some of the projects we have recently developed at the University of Pavia, we show how interfaces based on eye tracking can be genuinely helpful in different contexts of use.
Lecture Notes in Computer Science, 2007
This paper investigates novel ways to direct computers by eye gaze. Instead of using fixations and dwell times, this work focuses on eye motion, in particular gaze gestures. Gaze gestures are insensitive to accuracy problems and immune to calibration shift. A user study indicates that users are able to perform complex gaze gestures intentionally, and investigates which gestures occur unintentionally during normal interaction with the computer. Further experiments show how gaze gestures can be integrated into working with standard desktop applications and controlling media devices.
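A simple recognizer in the spirit of gaze gestures (the stroke alphabet, threshold, and gesture table below are illustrative assumptions, not the paper's) quantizes gaze displacements into compass strokes and matches the resulting string:

```python
# Sketch of a gaze-gesture recognizer: convert successive gaze
# displacements into 4-direction strokes, collapse repeats, and
# look the stroke string up in a gesture table.

MIN_STROKE_PX = 80  # assumed: ignore small fixational movements

def to_strokes(points):
    """Convert a gaze trail [(x, y), ...] into a stroke string like 'RDLU'."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if max(abs(dx), abs(dy)) < MIN_STROKE_PX:
            continue  # below threshold: treat as noise, not a stroke
        if abs(dx) > abs(dy):
            s = "R" if dx > 0 else "L"
        else:
            s = "D" if dy > 0 else "U"
        if not strokes or strokes[-1] != s:  # collapse repeated strokes
            strokes.append(s)
    return "".join(strokes)

GESTURES = {"RDLU": "close window", "RL": "page forward"}  # illustrative
print(GESTURES.get(to_strokes([(0, 0), (200, 10), (210, 180),
                               (20, 170), (10, 5)])))  # -> close window
```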
This paper introduces a new gaze-based graphical user interface (GUI) for Augmentative and Alternative Communication (AAC). In the state of the art, prediction methods that accelerate the production of textual, iconic, and pictorial communication by gaze control alone are still needed. The proposed GUI translates gaze inputs into words, phrases, or symbols through the following methods and techniques: (i) a gaze-based information visualization technique, (ii) a prediction technique combining concurrent and retrospective methods, and (iii) an alternative prediction method based on either the recognition or morphing of spatial features. The system is designed to extend the communication function of individuals with severe motor disabilities, with the aim of allowing end-users to hold a conversation independently, without needing a human interpreter.
2012
Considering the increasing diversity of display arrangements, including wall-sized screens and multi-display setups, our eye gaze offers particularly high potential for implicit, seamless, and fast interactions. However, gaze-based interaction is often regarded as error-prone and unnatural, especially when input is restricted to gaze as a single modality. For this reason, we have developed several interaction techniques that benefit from gaze as an additional, implicit and fast pointing modality for roughly indicating a user's visual attention, in combination with common smartphones for more explicit and precise specifications. In our demos, we showcase two examples of more natural yet effective ways of incorporating a user's gaze as a supporting input modality. The two application scenarios comprise (1) gaze-supported pan-and-zoom techniques using the example of Google Earth and (2) gaze-supported navigation and target selection in a virtual 3D scene.
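One common way gaze supports zooming (an assumed illustration, not necessarily the authors' exact technique) is to keep the gazed-at point stationary on screen by shifting the view center as the scale changes:

```python
# Sketch of gaze-pivoted zooming: choose the new view center so the
# world point under the user's gaze stays at the same screen position.

def zoom_about_gaze(center, gaze, old_scale, new_scale):
    """center/gaze in world coordinates; returns the new view center."""
    k = old_scale / new_scale
    return (gaze[0] + (center[0] - gaze[0]) * k,
            gaze[1] + (center[1] - gaze[1]) * k)

# Zooming in 2x while looking at (10, 5) pulls the view toward that point.
print(zoom_about_gaze(center=(0.0, 0.0), gaze=(10.0, 5.0),
                      old_scale=1.0, new_scale=2.0))  # -> (5.0, 2.5)
```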
2012
The goal of the Gaze Controlled Human Computer Interface project is to design and construct a non-invasive gaze-tracking system that will determine where a user is looking on a computer screen in real time. To accomplish this, a fixed illumination source consisting of Infrared (IR) Light Emitting Diodes (LEDs) is used to produce corneal reflections on the user’s eyes. These reflections are captured with a video camera and compared to the relative location of the user’s pupils. From this comparison, a correlation matrix can be created and the approximate location of the screen that the user is looking at can be determined. The final objective is to allow the user to manipulate a cursor on the computer screen simply by looking at different boxes in a grid on the monitor. The project includes design of the hardware setup to provide a suitable environment for glint detection, image processing of the user’s eyes to determine pupil location, the implementation of a probabilistic algorithm...
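The pupil-to-glint comparison described above is typically realized as a calibrated polynomial mapping from the pupil-minus-glint vector to screen coordinates; this sketch shows that standard construction, not the project's own code:

```python
# Sketch of the standard pupil-glint mapping: the vector from the
# corneal reflection (glint) to the pupil center is pushed through a
# second-order polynomial fitted on a calibration grid.
import numpy as np

def poly_features(vx, vy):
    """Second-order polynomial terms of the pupil-glint vector."""
    return np.array([1.0, vx, vy, vx * vy, vx * vx, vy * vy])

def fit_mapping(vectors, screen_points):
    """Least-squares fit from calibration: vectors (n,2) -> screen (n,2)."""
    F = np.array([poly_features(vx, vy) for vx, vy in vectors])
    coeffs, *_ = np.linalg.lstsq(F, np.asarray(screen_points), rcond=None)
    return coeffs  # shape (6, 2)

def to_screen(coeffs, vx, vy):
    return poly_features(vx, vy) @ coeffs
```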