Erica is a computer workstation with a unique user interface. The workstation is equipped with imaging hardware and software, which automatically record a digital portrait of the user's eye. From the features of the current portrait, the interface calculates the approximate location of the user's eye-gaze on the computer screen. The computer then executes commands associated with the menu option currently displayed at this screen location. In this way, the user can interact with the computer, run applications software, and manage peripheral devices, all simply by looking at an appropriate sequence of menu options displayed on the screen. The eye-gaze interface technology, its implementation in Erica, and its application as a prosthetic device are described.
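Systems like the one described above typically select a menu option when the gaze rests on it for a fixed dwell time. The following is a minimal sketch of that idea; all names, the rectangle layout, and the 0.75 s dwell threshold are illustrative assumptions, not details taken from the Erica system.

```python
# Sketch of dwell-based menu selection for a gaze interface.
# All identifiers and the dwell threshold are illustrative assumptions.

DWELL_TIME = 0.75  # seconds the gaze must rest on an option to select it

def hit_test(menu, x, y):
    """Return the menu option whose on-screen rectangle contains (x, y)."""
    for option in menu:
        left, top, right, bottom = option["rect"]
        if left <= x <= right and top <= y <= bottom:
            return option
    return None

def dwell_select(menu, gaze_samples, sample_dt):
    """Scan a stream of (x, y) gaze samples; return an option's command
    once the gaze has dwelt on it continuously for DWELL_TIME seconds."""
    current, elapsed = None, 0.0
    for x, y in gaze_samples:
        option = hit_test(menu, x, y)
        if option is current and option is not None:
            elapsed += sample_dt
            if elapsed >= DWELL_TIME:
                return option["command"]  # e.g. launch an application
        else:
            current, elapsed = option, 0.0  # gaze moved: reset the timer
    return None
```

A real interface would also give visual feedback (e.g. a shrinking ring) during the dwell, but the selection logic reduces to this accumulate-and-reset loop.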
Eye tracking as an interface for operating a computer has been under research for some time, and new systems are still being developed that offer encouragement to those whose illnesses incapacitate them from using any other form of interaction with a computer. Although they use computer vision processing and a camera, such systems are usually based on head-mounted technology and are therefore considered contact-type systems. This paper describes the implementation of a human-computer interface based on a fully non-contact eye-tracking vision system that allows people with tetraplegia to interact with a computer. As an assistive technology, a graphical user interface with special features was developed, including a virtual keyboard for user communication, fast access to pre-stored phrases and multimedia, and even internet browsing. The system was developed with a focus on low cost, user-friendly functionality, and user independence and autonomy.
Procedia Technology, 2014
This work describes an eye-tracking system for a natural user interface based only on non-intrusive devices, such as a simple webcam. Through image processing, the system converts the user's focus of attention into the corresponding point on the screen. Experimental tests were performed by displaying a set of known points on the screen to the users; these tests show that the application yields promising results.
3rd International Conference on Human System Interaction, 2010
We are currently witnessing increasing attention to issues related to Accessibility, which should eliminate, or at least reduce, the distance between disabled people and technology. However, particularly for severely impaired persons, there are still many challenges that must be overcome. In this paper we present eye tracking as a valuable support for disability in the accomplishment of hands-free tasks. Moreover, we stress the potential of eye-based interfaces to enhance the user-machine interaction process in "traditional" activities based on keyboard and mouse. Through the description of some of the projects we have recently developed at the University of Pavia, we show how interfaces based on eye tracking can be genuinely helpful in different contexts of use.
2012
The goal of the Gaze Controlled Human Computer Interface project is to design and construct a non-invasive gaze-tracking system that will determine where a user is looking on a computer screen in real time. To accomplish this, a fixed illumination source consisting of Infrared (IR) Light Emitting Diodes (LEDs) is used to produce corneal reflections on the user’s eyes. These reflections are captured with a video camera and compared to the relative location of the user’s pupils. From this comparison, a correlation matrix can be created and the approximate location of the screen that the user is looking at can be determined. The final objective is to allow the user to manipulate a cursor on the computer screen simply by looking at different boxes in a grid on the monitor. The project includes design of the hardware setup to provide a suitable environment for glint detection, image processing of the user’s eyes to determine pupil location, the implementation of a probabilistic algorithm...
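The "correlation matrix" step described above is commonly realized by fitting a mapping from pupil-glint difference vectors to screen coordinates over a set of calibration targets. The sketch below uses a least-squares affine fit as an illustrative stand-in; the project's actual matrix formulation may differ, and all function names here are assumptions.

```python
import numpy as np

# Illustrative gaze mapping: fit an affine map from pupil-glint difference
# vectors to screen coordinates using known calibration targets. This is a
# common approach, not necessarily the one used in the project above.

def fit_gaze_map(pupil_glint, screen):
    """pupil_glint: (N, 2) pupil-minus-glint vectors at calibration targets.
    screen: (N, 2) known on-screen target positions (pixels).
    Returns a (3, 2) matrix M such that [dx, dy, 1] @ M ~ [sx, sy]."""
    X = np.hstack([pupil_glint, np.ones((len(pupil_glint), 1))])
    M, *_ = np.linalg.lstsq(X, screen, rcond=None)
    return M

def gaze_to_screen(M, dx, dy):
    """Map one pupil-glint vector to an estimated screen point."""
    return np.array([dx, dy, 1.0]) @ M
```

With at least three non-collinear calibration points the affine map is determined; more points (a 3x3 or denser grid is typical) average out measurement noise. Higher-order polynomial terms can be appended to the design matrix to absorb corneal curvature effects.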
2012
Abstract—Innovative systems for user-computer interaction based on the user's eye-gaze behavior have important implications for various applications. Examples include user navigation in large images, typical of astronomy or medicine, and user selection and viewing of multiple video streams. Typically, a web environment is used for these applications. System latency must be negligible, while system obtrusiveness must be small. This paper describes the implementation and initial experimentation on such an innovative system.
2001
This paper argues for a joint development of an eye-gaze-based, on-line communication aid running on a standard PC with a web-camera. Tracking software is to be provided as open source to allow for improvements and individual integrations with other aids. The interface design shall be defined by the achieved resolution of the tracking system. The design of a type-to-talk system with 12 large on-screen keys is described in the paper. In order for gaze tracking systems to become widely used, the pursuit of mouse-pointer precision should be replaced by a focus on the broad potential of low-resolution gaze-based interactive systems.
New graphical user interfaces (GUIs) have been designed that challenge eye trackers to control WIMP interfaces and replace much of the work of other input devices. An eye-tracking system offers possibilities for independent use and improved quality of life via dedicated interface tools tailored to users with special needs (e.g., interaction, communication, e-mailing, web browsing, and entertainment). An experimental investigation of an eye tracker as a computer input device is presented, comparing user performance between eye gaze and a standard mouse. Ten circles, placed at random positions on both blank and complex backgrounds, were to be selected by participants. The results show that an eye tracker can be used as a fast selection device provided the target size is not too small. A comparison of eye-gaze typing with keyboard typing also showed a decline in typing speed and a rapid increase in errors. To enhance its functionality, the study found that the keyboard is the best candidate to combine with eye gaze, owing to its capability to complete all tasks in WIMP interaction.
Computers in Biology and Medicine, 2007
Human-computer interactions (HCI) have become an important area of research and development in computer science and psychology. Appropriate use of computers could be of primary importance for communication and education of subjects who cannot move, speak, see or hear properly. The aim of our study was to develop a reliable, low-cost and easy-to-use HCI based on electrooculography signal analysis, to allow physically impaired patients to control a computer as assisted communication. Twenty healthy subjects served as volunteers: eye movements were captured by means of four electrodes and a two-channel amplifier. The output signal was then transmitted to an "Analog to Digital" (AD) converter, which digitized the signal of the amplifier at a rate of 500 Hz, before being sent to a laptop. We designed and coded specific software, which analyzed the input signal to give an interpretation of eye movements. By means of a single ocular movement (up, down, left and right) the subjects were then able to move a cursor over a screen keyboard, passing from one letter to another; a double eye blink was then necessary to select and write the active letter. After a brief training session, all the subjects were able to confidently control the cursor and write words using only ocular movements and blinking. For each subject we presented three series of randomized words: the mean time required to enter a single character was about 8.5 s, while input errors were very limited (less than 1 per 250 characters).
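Interpreting two-channel EOG often comes down to amplitude thresholding on the horizontal and vertical channels. The sketch below shows one such rule for mapping a digitized sample pair to a cursor command; the 50 uV thresholds and the function name are made-up assumptions, not parameters from the study above.

```python
# Illustrative rule for turning one two-channel EOG sample (horizontal and
# vertical amplitudes, in microvolts) into a cursor command. The thresholds
# are invented for this example; a real system calibrates them per user.

H_THRESH = 50.0  # horizontal-channel amplitude threshold (uV), assumed
V_THRESH = 50.0  # vertical-channel amplitude threshold (uV), assumed

def classify_eog(h, v):
    """Classify one EOG sample pair into an eye-movement command.
    Returns 'left', 'right', 'up', 'down', or None for rest."""
    if abs(h) >= H_THRESH and abs(h) >= abs(v):
        return "right" if h > 0 else "left"   # horizontal channel dominates
    if abs(v) >= V_THRESH:
        return "up" if v > 0 else "down"      # vertical channel dominates
    return None                                # below threshold: no command
```

In practice the raw 500 Hz stream would first be band-pass filtered and debounced so that a single saccade produces one command rather than a burst, and blink detection (a sharp vertical-channel spike pair) would be handled as a separate event.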
International Journal of Psychophysiology, 1998
The control of computer functions by eye movements was demonstrated in 14 normal volunteers. Electrical potentials recorded by horizontal and vertical electrooculography (EOG) were transformed into a cursor that represented a moving fixation point on a computer display. Subjects were able to spell words and sentences by using eye movements to place the cursor on target letters in the display of an alphabet matrix. The successful demonstration of computer-controlled syntactic construction by eye movements offers a potentially useful technique for computer-assisted communication in special groups, such as developmentally-disabled individuals who have motor paralysis and who cannot speak.
Image and Vision Computing, 1995
Computer vision has a significant role to play in the human-computer interaction (HCI) devices of the future. All computer input devices serve one essential purpose: they transduce some motion or energy from a human agent into machine-useable signals. One may therefore think of input devices as the 'perceptual organs' by which computers sense the intents of their human users. We outline the role computer vision will play, highlight the impediments to the development of vision-based interfaces, and propose an approach for overcoming these impediments. Prospective vision research areas for HCI include human face recognition, facial expression interpretation, lip reading, head orientation detection, eye gaze tracking, three-dimensional finger pointing, hand tracking, hand gesture interpretation, and body pose tracking.
2014
This paper presents an implementation of the "Eye Tracking Mouse", an arrangement established to bridge the gap between persons with disabilities and the computer. Computers provide various potential applications, such as system monitoring, yet a person with a severe disability may be unable to access the computer or the benefits it provides. The aim of this system is to track the computer operator's activities with the help of a camera and translate them into movements of the mouse cursor on the display. Different body parts and signals, such as the tip of the user's nose, head movements, eye movements, and voice recognition, can be used to operate the system. The object of this paper is to present a set of methods integrated into a low-cost eye-tracking system. Here we study how to access the system with the help of the "eye mouse", which helps persons with disabilities operate the computer. This method consists of various step...
Lecture Notes in Computer Science, 2006
This paper presents a computer method to help people, typically those with limited mobility, operate ICT devices with eye gaze in their living/work environment. The user's eye gaze is recorded and analyzed in real time. Any ICT device in the environment that is looked at for a certain time period is identified, located, and assumed to be the object of interest that the user wants to utilise. Through a suitable interface, the user can then decide whether to operate the device. By using this state-of-the-art technology, people with impaired mobility, or able-bodied people whose movements are restricted, can attain a more independent lifestyle.
With the evolution of eye tracking from concept to reality, it is now being explored scientifically in human-computer interaction to record eye movements and determine the gaze direction, the position of a user on the screen at a given time, and the sequence of their movements. The threefold objective of this paper is to introduce the reader to the key aspects and issues of eye-movement technology, to give practical guidance for developing an eye-tracking application, and to survey the opportunities and underlying challenges of developing Man and Machine Interfacing (MAMI) systems using eye tracking. We have integrated The Eye Tribe with Unity 5.1.1 and, through an experiment, inferred that subjects with and without bifocal glasses show relatively similar fixation results if they have correct vision, while the results differ by a small error if the eye is corrected using lenses. Another experiment using The Eye Tribe shows that gaze input requires less time than mouse input.
The paper presents a novel idea for controlling computer mouse cursor movement with the human eyes. The paper describes how the product works and how it helps people with special needs share their knowledge with the world. A number of traditional techniques, such as head and eye movement tracking systems, exist for cursor control using image processing, in which light is the primary source. Electro-oculography (EOG) is a new technology for sensing eye signals with which the mouse cursor can be controlled. The signals captured by sensors are first amplified, then denoised and digitized, before being transferred to a PC for software interfacing.
IEEE Consumer Electronics Magazine, 2015
Proceedings of SPIE, 2003
We investigate the visual and vocal modalities of interaction with computer systems. We focus our attention on the integration of visual and vocal interfaces as possible replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as a modality of interface. While voice commands, as means of interaction with computers, have been around for a number of years, integration of both the vocal interface and the visual interface, in terms of detecting the user's eye movements through an eye-tracking device, is novel and promises to open the horizons for new applications where a hand-mouse interface provides little or no apparent support to the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.
2007
The eyes are a rich source of information for gathering context in our everyday lives. A user's gaze is postulated to be the best proxy for attention or intention. Using gaze information as a form of input can enable a computer system to gain more contextual information about the user's task, which in turn can be leveraged to design interfaces which are more intuitive and intelligent. Eye gaze tracking as a form of input was primarily developed for users who are unable to make normal use of a keyboard and pointing device. However, with the increasing accuracy and decreasing cost of eye gaze tracking systems it will soon be practical for able-bodied users to use gaze as a form of input in addition to keyboard and mouse.
International Journal of Advanced Computer Science and Applications, 2011
Eye-based Human-Computer Interaction: an HCI system that allows phoning, reading e-books/e-comics/e-learning content, internet browsing, and TV information extraction is proposed for handicapped students in an e-learning application. Conventional eye-based HCI applications face problems with accuracy and processing speed. We develop new interfaces for improving the key-in accuracy and processing speed of eye-based key-in, for e-learning applications in particular. We propose an eye-based HCI utilizing camera-mounted glasses for gaze estimation. We use the line of sight to control the user interface, for tasks such as navigation of e-comic/e-book/e-learning content, phoning, internet browsing, and TV information extraction. We develop interfaces including a standard interface navigator with five keys, a single-line moving keyboard, and a multi-line moving keyboard, in order to support the aforementioned functions without compromising accuracy. The experimental results show that the proposed system performs the aforementioned functions in real time.
User-computer dialogues are typically one-sided, with the bandwidth from computer to user far greater than that from user to computer. The movement of a user's eyes can provide a convenient, natural, and high-bandwidth source of additional user input, to help redress this imbalance. We therefore investigate the introduction of eye movements as a computer input medium. Our emphasis is on the study of interaction techniques that incorporate eye movements into the user-computer dialogue in a convenient and natural way. This chapter describes research at NRL on developing such interaction techniques and the broader issues raised by non-command-based interaction styles. It discusses some of the human factors and technical considerations that arise in trying to use eye movements as an input medium, describes our approach and the first eye movement-based interaction techniques that we have devised and implemented in our laboratory, reports our experiences and observations on them, and considers eye movement-based interaction as an exemplar of a new, more general class of non-command-based user-computer interaction.
2015
With the rapid evolution of computer technology, there is an increased need to dedicate attention to computer-aided interaction, including the crucial aspects of designing, implementing, and evaluating the interfaces that provide this type of communication. Various techniques for human-computer interaction have been used, commencing with keyboards and printers and moving on to gesture interaction, speech interaction, touch screens, eye-gaze tracking, and many more. Most of these techniques are still being analyzed and examined to determine whether they can ensure ease in performing given tasks, such as moving the mouse cursor, selecting menus, or moving or dragging objects on the computer screen, thus helping users with disabilities to interact with the workstation. This paper describes a usability study of an existing eye-tracking system, evaluates its correctness, and calculates its error percentage, which can lead us to a deduction as to whether this system would provide efficient interaction and facili...