2002, Proceedings of the First International Conference on …
In this paper, we introduce a new pointing device for ubiquitous computing environments. The user's eye is an integral part of the system. This relatively simple system makes it possible to realize novel features such as the "tele-click".
2014 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops, 2014
While the new generation of eyewear computers has raised expectations of wearable computing, providing input to these devices is still challenging. Hand-held devices, voice commands, and hand gestures have already been explored as input methods for wearable devices. In this paper, we examine using head and eye movements to point at a graphical user interface of a wearable computer. Users' performance in head and eye pointing was compared with mouse pointing as a baseline. The results of our experiment show that eye pointing is significantly faster than head or mouse pointing; however, our participants felt that head pointing is more accurate and convenient.
International Journal for Research in Applied Science and Engineering Technology
Numerous individuals who have neuro-locomotor deficits or are disabled by injury are unable to use PCs for basic tasks such as sending or receiving messages, browsing the web, or watching their TV programmes or movies. A previous study found the eyes to be a strong candidate for ubiquitous computing, since they move anyway whenever the user engages with computing equipment. Using this underlying information from eye movements, such patients could be brought back to using computers. For this aim, we offer a mouse gesture control mechanism operated by human eyes alone. The goal of this study is to provide an open-source, generic eye-gesture control system that can efficiently capture eye movements and allow the user to perform actions mapped to specific eye-movement gestures via a computer camera. After detecting the pupil on the user's face, the system tracks its motion. It must be accurate in real time so that the user can operate it comfortably, just like any other everyday device.
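As a rough illustration of the webcam pipeline this abstract describes (detecting the pupil on the user's face, then tracking it), here is a minimal sketch using OpenCV's bundled Haar cascades; the cascade-plus-thresholding approach is an assumption for illustration, not the paper's documented method.

```python
# Minimal sketch: locate a face, then an eye, then approximate the pupil
# as the centroid of the darkest region in the eye patch.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eye = roi[ey:ey + eh, ex:ex + ew]
            # The darkest blob in the eye patch approximates the pupil.
            _, thresh = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
            m = cv2.moments(thresh)
            if m["m00"] > 0:
                cx = int(m["m10"] / m["m00"])
                cy = int(m["m01"] / m["m00"])
                cv2.circle(frame, (fx + ex + cx, fy + ey + cy),
                           3, (0, 255, 0), -1)
    cv2.imshow("pupil", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```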
Traditionally, the human-computer interface uses a keyboard and mouse as input devices, but this paper presents a hands-free interface between computer and human. We give a novel design for controlling the computer mouse cursor using human eye movement: the cursor is moved by detecting the position where the user's sight focuses, while mouse clicks are triggered by blinking. In this paper we describe the face detection and eye tracking technology together with the algorithm of the proposed framework. This technology is very useful for addressing the HMI problems of the disabled, giving them a way to communicate with the outside world, improve their quality of life, and help them regain confidence.
International Journal of Engineering Technology and Management Sciences
The system described here presents a hands-free interface between human and computer. It uses various image processing methods, such as face detection, eye extraction, and interpretation of sequences of eye blinks in real time, to control a non-intrusive human-computer interface. A typical webcam captures the input image. The mouse cursor is controlled by facial movement, moving the face left and right or up and down, while mouse events are controlled through eye blinks. A large number of people affected by neuro-locomotor disabilities can use this technology for basic tasks such as sending or receiving messages, browsing the internet, or watching their favorite TV shows or movies. A decision tree algorithm is used to give the best possible estimate of the eye position, so that eye movement is detected and the mouse moves accordingly. It also enables the user to open and close applications by blinking.
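The blink-to-click step described above can be illustrated with the eye-aspect-ratio (EAR) heuristic, a common blink test swapped in here in place of the paper's decision-tree classifier; the landmark source (e.g. dlib or MediaPipe) and the thresholds are assumptions, so the sketch only shows the decision logic.

```python
# Hedged sketch: classify a blink from six per-frame eye landmarks.
from math import dist

def eye_aspect_ratio(p):
    """p: six (x, y) eye landmarks, ordered as in the 68-point scheme."""
    return (dist(p[1], p[5]) + dist(p[2], p[4])) / (2.0 * dist(p[0], p[3]))

class BlinkDetector:
    EAR_THRESHOLD = 0.21   # eye counts as closed below this (assumed value)
    MIN_FRAMES = 2         # consecutive closed frames needed for one blink

    def __init__(self):
        self.closed_frames = 0
        self.blinks = 0

    def update(self, landmarks):
        if eye_aspect_ratio(landmarks) < self.EAR_THRESHOLD:
            self.closed_frames += 1
        else:
            if self.closed_frames >= self.MIN_FRAMES:
                self.blinks += 1   # e.g. map one blink to a left click
            self.closed_frames = 0
        return self.blinks
```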
3rd International Conference on Human System Interaction, 2010
We are currently witnessing increasing attention to issues related to accessibility, which should eliminate, or at least reduce, the distance between disabled people and technology. However, particularly for severely impaired persons, there are still many challenges to overcome. In this paper we present eye tracking as a valuable support for disability in the accomplishment of hands-free tasks. Moreover, we stress the potential of eye-based interfaces to enhance the user-machine interaction process in "traditional" activities based on keyboard and mouse. Through the description of some of the projects we have recently developed at the University of Pavia, we show how interfaces based on eye tracking can be genuinely helpful in different contexts of use.
The Journal of The Institute of Image Information and Television Engineers, 2012
We propose a method combining eye-gaze detection and PupilMouse for use in communicators for severely physically handicapped people. Both methods are based on remote pupil detection with a near-infrared light source, and we had already implemented them in similar systems. With PupilMouse, the cursor on the PC display is moved smoothly and accurately based on pupil movements in the camera image caused by the user's head movement. In contrast, with the eye-gaze detection method, the cursor moves quickly by shifting the eye-gaze point on the screen, but arrives at an inaccurate gaze position. The proposed method exploits the advantages of both by switching between them according to the speed of the eye-gaze shift. The experimental results show that the proposed method enables characters to be input more quickly, accurately, and comfortably than with either of the two conventional methods alone.
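A minimal sketch of the switching rule this abstract describes, choosing between a fast gaze-driven jump and fine head-driven (PupilMouse-style) movement based on gaze-shift speed; the threshold value and the data layout are assumptions.

```python
# Hedged sketch of speed-based switching between two cursor modes.
SPEED_THRESHOLD = 200.0  # px/s of gaze-point motion; tune per setup

def choose_cursor(gaze_point, prev_gaze_point, dt, head_delta, cursor):
    gx, gy = gaze_point
    px, py = prev_gaze_point
    speed = ((gx - px) ** 2 + (gy - py) ** 2) ** 0.5 / dt
    if speed > SPEED_THRESHOLD:
        # Fast gaze shift: jump near the (less accurate) gaze position.
        return gaze_point
    # Slow phase: nudge the cursor by head movement for fine accuracy.
    cx, cy = cursor
    dx, dy = head_delta
    return (cx + dx, cy + dy)
```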
2001
This paper argues for a joint development of an eye gaze-based, on-line communication aid running on a standard PC with a web camera. Tracking software is to be provided as open source to allow for improvements and individual integrations with other aids. The interface design shall be defined by the achieved resolution of the tracking system. The design of a type-to-talk system with 12 large on-screen keys is described in the paper. In order for gaze tracking systems to become widely used, striving for mouse-pointer precision should be replaced by a focus on the broad potential of low-resolution gaze-based interactive systems.
Lecture Notes in Computer Science, 2006
This paper presents a computer method to help people, typically those with limited mobility, operate ICT devices with eye gaze in their living/work environment. The user's eye gaze is recorded and analyzed in real time. Any ICT device in the environment that is looked at for a certain period is identified, located, and assumed to be the object of interest that the user wants to utilise. Through a suitable interface, the user can then decide whether to operate the device. By using this state-of-the-art technology, people with impaired mobility, or able-bodied people whose movements are restricted, can attain a more independent lifestyle.
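The dwell-based identification step might look like the following sketch, in which an ICT device fixated beyond a threshold becomes the candidate object of interest; the region representation and the one-second dwell time are illustrative assumptions.

```python
# Hedged sketch: report a device once the gaze dwells on it long enough.
import time

DWELL_SECONDS = 1.0

class DwellSelector:
    def __init__(self, device_regions):
        # device_regions: {"tv": (x0, y0, x1, y1), ...} in gaze coordinates
        self.regions = device_regions
        self.current = None
        self.since = None

    def update(self, gaze_xy):
        hit = next((name for name, (x0, y0, x1, y1) in self.regions.items()
                    if x0 <= gaze_xy[0] <= x1 and y0 <= gaze_xy[1] <= y1),
                   None)
        now = time.monotonic()
        if hit != self.current:
            self.current, self.since = hit, now
            return None
        if hit and now - self.since >= DWELL_SECONDS:
            return hit  # ask the user whether to operate this device
        return None
```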
Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications - ETRA '10, 2010
Eye gaze interaction for disabled people is often dealt with by designing ad-hoc interfaces, in which the big size of their elements compensates for both the inaccuracy of eye trackers and the instability of the human eye. Unless solutions for reliable eye cursor control are employed, gaze pointing in ordinary graphical operating environments is a very difficult task. In this paper we present an eye-driven cursor for MS Windows which behaves differently according to the "context". When the user's gaze is perceived within the desktop or a folder, the cursor can be discretely shifted from one icon to another. Within an application window or where there are no icons, on the contrary, the cursor can be continuously and precisely moved. Shifts in the four directions (up, down, left, right) occur through dedicated buttons. To increase user awareness of the currently pointed spot on the screen while continuously moving the cursor, a replica of the spot is provided within the active direction button, resulting in improved pointing performance.
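A sketch of the context-dependent behaviour described above: discrete icon-to-icon jumps on the desktop or in folders, continuous stepping elsewhere, with the direction coming from one of the four dedicated buttons; the icon layout and step size are illustrative assumptions.

```python
# Hedged sketch of a two-mode, direction-button-driven cursor.

def move_cursor(context, direction, cursor, icons, step=5):
    """direction: unit (dx, dy) from one of the four direction buttons."""
    dx, dy = direction
    if context in ("desktop", "folder") and icons:
        # Discrete mode: snap to the nearest icon in the chosen direction.
        ahead = [p for p in icons
                 if (p[0] - cursor[0]) * dx + (p[1] - cursor[1]) * dy > 0]
        if ahead:
            return min(ahead, key=lambda p: (p[0] - cursor[0]) ** 2
                                            + (p[1] - cursor[1]) ** 2)
        return cursor
    # Continuous mode: small precise steps inside application windows.
    return (cursor[0] + dx * step, cursor[1] + dy * step)
```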
International Journal of Advanced Computer Science and Applications, 2018
A large number of people affected by neuro-locomotor disabilities, or those paralyzed by injury, cannot use computers for basic tasks such as sending or receiving messages, browsing the internet, or watching their favorite TV shows or movies. A previous research study concluded that eyes are an excellent candidate for ubiquitous computing, since they move anyway during interaction with computing machinery. Using this underlying information from eye movements could bring the use of computers back to such patients. For this purpose, we propose an imouse gesture control system which is operated entirely by human eyes. The purpose of this work is to design an open-source, generic eye-gesture control system that can effectively track eye movements and enable the user to perform actions mapped to specific eye movements/gestures using a computer webcam. It detects the pupil on the user's face and then tracks its movements. It needs to be accurate in real time so that the user can use it as comfortably as other everyday devices.
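The gesture-to-action mapping mentioned here could be organised as in the following sketch; the gesture labels and the action table are hypothetical, not taken from the paper.

```python
# Hedged sketch: dispatch recognized eye gestures to mapped actions.
GESTURE_ACTIONS = {
    ("look_left",): "previous_page",
    ("look_right",): "next_page",
    ("blink", "blink"): "left_click",   # double blink
}

def dispatch(gesture_buffer):
    """gesture_buffer: most recent gesture labels, oldest first."""
    # Try longer patterns first so a double blink beats a single one.
    for pattern, action in sorted(GESTURE_ACTIONS.items(),
                                  key=lambda kv: -len(kv[0])):
        if tuple(gesture_buffer[-len(pattern):]) == pattern:
            return action
    return None
```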
Procedia Technology, 2014
This work describes an eye tracking system for a natural user interface, based only on non-intrusive devices such as a simple webcam. Through image processing the system is able to convert the focus of attention of the user to the corresponding point on the screen. Experimental tests were performed displaying to the users a set of known points on the screen. These tests show that the application has promising results.
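The calibration the described tests imply, mapping measured eye features to the known on-screen points, can be sketched as a least-squares affine fit; NumPy and the affine model are assumptions, not the paper's stated method.

```python
# Hedged sketch: fit eye-feature -> screen mapping from calibration samples.
import numpy as np

def fit_affine(eye_points, screen_points):
    """eye_points, screen_points: (N, 2) arrays of matched samples."""
    e = np.asarray(eye_points, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    A = np.hstack([e, np.ones((len(e), 1))])        # rows are [x, y, 1]
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)  # (3, 2) affine matrix
    return coeffs

def eye_to_screen(coeffs, eye_xy):
    x, y = eye_xy
    return tuple(np.array([x, y, 1.0]) @ coeffs)
```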
Proceedings of the SIGCHI …, 2007
We present a practical technique for pointing and selection using a combination of eye gaze and keyboard triggers. EyePoint uses a two-step progressive refinement process fluidly stitched together in a look-press-look-release action, which makes it possible to compensate for the accuracy limitations of the current state-of-the-art eye gaze trackers. While research in gaze-based pointing has traditionally focused on disabled users, EyePoint makes gaze-based pointing effective and simple enough for even able-bodied users to use for their everyday computing tasks. As the cost of eye gaze tracking devices decreases, it will become possible for such gaze-based techniques to be used as a viable alternative for users who choose not to use a mouse depending on their abilities, tasks and preferences.
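A sketch of the look-press-look-release flow as described: the key press captures a coarse gaze point and magnifies the region, and the release takes the refined gaze and maps it back to screen space for the click. The magnification factor is an assumption.

```python
# Hedged sketch of EyePoint's two-step progressive refinement.
class EyePoint:
    def __init__(self, magnification=4):
        self.mag = magnification
        self.anchor = None  # gaze position captured at key press

    def on_key_press(self, gaze_xy):
        # Step 1: "look" then "press" - magnify around the coarse gaze.
        self.anchor = gaze_xy
        return ("zoom", gaze_xy, self.mag)

    def on_key_release(self, gaze_xy):
        # Step 2: "look" inside the zoomed view, then "release" to click.
        ax, ay = self.anchor
        # Map the refined gaze in the magnified view back to screen space.
        target = (ax + (gaze_xy[0] - ax) / self.mag,
                  ay + (gaze_xy[1] - ay) / self.mag)
        self.anchor = None
        return ("click", target)
```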
Human-Computer Interfaces (HCI) allow effective interactions between human beings and computers, which is of particular significance to people with disabilities or temporary mobility impairment. In this paper, we propose the EyePhone framework, a mobile HCI that allows users to control mobile phones through intentional eye or facial movements. A proof-of-concept prototype is developed based on a wearable electroencephalograph (EEG) headset and an Android smartphone. Specifically, a graphical window can receive and display continuous EEG data acquired from the headset; a mouse emulator can allow users to move a cursor around the smartphone screen by moving their heads and eyes; and an emergency dialer can allow users to make an emergency phone call through a certain pattern of eye/facial movements. The launching and switching of each functional module are also implemented through predefined head movement patterns, in order to achieve a true "hands-free" environment. The efficacy and efficiency of the proposed EyePhone system are evaluated based on experiments in a variety of scenarios (e.g., sitting, standing, and walking).
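The emergency dialer's trigger, a predefined pattern of eye/facial movements, could be detected as in this sketch; the event labels, the pattern, and the time window are assumptions rather than the paper's specification.

```python
# Hedged sketch: fire the dialer when three long blinks arrive within
# a short window of the headset-derived event stream.
from collections import deque
import time

EMERGENCY_PATTERN = ["long_blink"] * 3
WINDOW_SECONDS = 5.0

events = deque()  # (timestamp, label)

def on_event(label):
    now = time.monotonic()
    events.append((now, label))
    # Drop events that fell out of the detection window.
    while events and now - events[0][0] > WINDOW_SECONDS:
        events.popleft()
    recent = [lbl for _, lbl in events]
    if recent[-len(EMERGENCY_PATTERN):] == EMERGENCY_PATTERN:
        events.clear()
        return "dial_emergency"
    return None
```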
2020
This paper proposes a technique for using the movement of the eyes to control the movement of the cursor on monitor screens, thereby creating new ways of Human Computer Interaction (HCI) and helping physically handicapped people interact with computer devices more efficiently. Earlier eye-gaze optical mice consisted of headgear with an attached eye-motion sensor and were largely hardware-based; the input gathered through these sensors drove cursor movement on screen. With advances in image processing techniques and artificial intelligence, a simple web camera attached to the computer can perform this task. In this paper, the pupil of the eye is detected, the coordinates gathered by tracking pupil movement are mapped to the coordinates of the display monitor, and based on this mapping the mouse cursor is moved on the screen.
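The coordinate mapping described here can be sketched as a linear rescaling of the pupil position to monitor coordinates, with the cursor moved via a mouse-automation library; pyautogui and the calibration range are assumed choices, not the paper's.

```python
# Hedged sketch: rescale pupil coordinates to the screen, then move
# the cursor there.
import pyautogui

SCREEN_W, SCREEN_H = pyautogui.size()
# Observed pupil-coordinate range from a short calibration phase
# (illustrative values).
PUPIL_MIN = (180, 120)
PUPIL_MAX = (460, 360)

def pupil_to_screen(px, py):
    nx = (px - PUPIL_MIN[0]) / (PUPIL_MAX[0] - PUPIL_MIN[0])
    ny = (py - PUPIL_MIN[1]) / (PUPIL_MAX[1] - PUPIL_MIN[1])
    # Clamp to [0, 1] so noisy detections stay on screen.
    return (min(max(nx, 0.0), 1.0) * SCREEN_W,
            min(max(ny, 0.0), 1.0) * SCREEN_H)

x, y = pupil_to_screen(320, 250)
pyautogui.moveTo(x, y)
```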
2014
This paper presents an implementation of the "Eye Tracking Mouse", an arrangement established to bridge the gap between a disabled person and the system. As we know, the computer provides various potential applications, such as system monitoring, but a person with a severe disability is unable to access the computer or the benefits it provides. The aim of this system is to track the computer operator's activities with the help of a camera and translate them into movements of the mouse cursor on the display. Different body parts, such as the tip of the user's nose, head movements, and eye movement, along with voice recognition, can be used to operate the system. The object of this paper is to present a set of methods integrated into a low-cost eye tracking system. Here we study how to access the system with the help of the "eye mouse", which helps a disabled person access the system. This method consists of various step...
2019
Eyewear displays allow users to interact with virtual content displayed over real-world vision in active situations like standing and walking. Pointing techniques for eyewear displays have been proposed, but their social acceptability, efficiency, and situation awareness remain to be assessed. Using a novel street-walking simulator, we conducted an empirical study of target acquisition while standing and walking under different levels of street crowdedness. We evaluated three phone-based eyewear pointing techniques: indirect touch on a touchscreen, and two in-air techniques using relative device rotations around a forward and a downward axis. Direct touch on a phone, without eyewear, was used as a control condition. Results showed that indirect touch was the most efficient and socially acceptable technique, and that in-air pointing was inefficient when walking. Interestingly, the eyewear displays did not improve situation awareness compared to the control condition. We discuss impl...
International Journal of Advanced Computer Science and Applications, 2011
Eye-based Human-Computer Interaction: an HCI system which allows phoning, reading e-books/e-comics/e-learning content, internet browsing, and TV information extraction is proposed for handicapped students in an e-learning application. Conventional eye-based HCI applications face problems with accuracy and processing speed. We develop new interfaces for improving the key-in accuracy and processing speed of eye-based key-in, for e-learning applications in particular. We propose an eye-based HCI that utilizes camera-mounted glasses for gaze estimation, and use the line of sight to control the user interface, including navigation of e-comic/e-book/e-learning content, phoning, internet browsing, and TV information extraction. We develop interfaces including a standard interface navigator with five keys, a single-line moving keyboard, and a multi-line moving keyboard, in order to provide the aforementioned functions without compromising accuracy. The experimental results show that the proposed system performs the aforementioned functions in real time.
International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2022
With recent advances in technology, modern computer systems are becoming more flexible, capable of processing millions of pieces of information per second. In such cases, traditional input devices such as a mouse or keyboard are relatively slow; this paper uses a system of human-computer interaction to overcome them. With innovation and development in technology, motion sensors are able to capture the position and natural movements of the human body, making possible a new way of communicating with computers. Keeping all this in mind, we propose a touchless and fast communication system able to capture the movements of the eyeball, which drive cursor control. The system processes the data in the camera feed and calibrates the interface parameters according to the user, then runs computer-vision algorithms to determine the location of the pupil and uses the eyes to implement natural eye-computer interaction.
Eye tracking as an interface to operate a computer has been under research for a while, and new systems are still being developed that provide some encouragement to those bound to illnesses that prevent them from using any other form of interaction with a computer. Although they use computer vision processing and a camera, these systems are usually based on head-mounted technology and are thus considered contact-type systems. This paper describes the implementation of a human-computer interface based on a fully non-contact eye tracking vision system that allows people with tetraplegia to interface with a computer. As an assistive technology, a graphical user interface with special features was developed, including a virtual keyboard to allow user communication, fast access to pre-stored phrases and multimedia, and even internet browsing. This system was developed with a focus on low cost, user-friendly functionality, and user independence and autonomy.