International Journal of Advanced Computer Science and Applications
The inability to control the limbs is a major factor limiting the daily activities of people with disabilities, causing social restriction and isolation. Many studies have been conducted to help people with disabilities communicate easily with the outside world, and various techniques have been designed to help them carry out daily activities; among these technologies is the smart wheelchair. This research aims to develop a smart eye-controlled wheelchair whose movement depends on eye-movement tracking. The proposed wheelchair is simple in design, easy to use, and low in cost compared with previous wheelchairs. Eye movement is detected through a camera fixed on the chair, and the user's gaze direction is obtained from the captured image after processing and analysis. The resulting command is sent to an Arduino Uno board, which controls the wheelchair's movement. The wheelchair's performance was evaluated with different volunteers; its accuracy reached 94.4%, with a very short response time compared with other existing chairs.
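The camera-to-command pipeline described above can be sketched as a simple mapping from pupil position to a drive command. This is a minimal illustration, not the paper's implementation: the function name, the single-byte command codes, and the dead-zone threshold are all assumptions.

```python
def gaze_to_command(pupil_x, frame_width, dead_zone=0.15):
    """Map the detected pupil's horizontal position to a drive command.

    pupil_x     -- pupil centre x-coordinate in pixels
    frame_width -- width of the captured frame in pixels
    dead_zone   -- fraction of half-width treated as 'looking straight'
                   (hypothetical value, not from the paper)
    """
    # Normalised offset in [-1, 1]: negative = gaze left, positive = gaze right.
    offset = (pupil_x - frame_width / 2) / (frame_width / 2)
    if offset < -dead_zone:
        return 'L'   # turn left
    if offset > dead_zone:
        return 'R'   # turn right
    return 'F'       # move forward
```

In a setup like the one described, the returned byte could be written over a serial link (e.g. with pyserial) to the Arduino Uno, which translates it into motor signals.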
international journal for research in applied science and engineering technology ijraset, 2020
The eye-directed wheelchair is a mobility device for persons with moderate to severe physical disabilities or chronic diseases, as well as for the elderly. Various wheelchair interfaces are available on the market, yet they remain under-utilized because of the strength, stamina, and concentration required to operate them. The proposed model is a possible alternative. In this model, we use an optical eye-tracking system to control a powered wheelchair: the user's eye movements are translated to a screen position without any direct contact. When the user looks at a suitable angle, the computer input system sends a command to the software based on the angle of rotation of the pupil, i.e., when the user moves the eyes left the chair moves left, right moves right, and straight ahead moves forward; in all other cases the wheelchair stops. Obstacle-detection sensors are also connected to the Arduino to provide the feedback necessary for correct operation of the wheelchair and to ensure the user's safety. The motors attached to the wheelchair support differential steering, which avoids clumsy motion.
An electric wheelchair is an aid for disabled people who have lost the ability to move. A conventional wheelchair is manually driven and cannot be used by people with full-body impairment, so a model is needed that benefits them. Various motor-operated wheelchairs are available, but none of them is perfectly accurate, and inaccuracy can have disastrous results for the operator. A design is therefore required that navigates the wheelchair with high accuracy. This paper derives such a method by combining the electrooculography (E.O.G.) technique with a camera interfaced in front of the eye to obtain maximum accuracy while giving the highest priority to user safety.
In this paper, we use an optical eye-tracking system to control a powered wheelchair. The user's eye movements are translated to a screen position using this system. Pupil-tracking goggles with a video CCD camera and a frame grabber analyze a series of human pupil images while the user gazes at the screen. A new calibration algorithm then determines the direction of eye gaze in real time. We design an interface with nine command zones to control the powered wheelchair; the command at the calculated position on the gazed screen is then sent to move it.
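The nine-command-zone interface can be pictured as a 3x3 grid over the screen: the calibrated gaze point falls into one of nine cells, each bound to a command. A minimal sketch of that lookup, with hypothetical row-major zone numbering not taken from the paper:

```python
def screen_zone(x, y, width, height):
    """Return the index (0-8) of the 3x3 screen zone containing point (x, y).

    Zones are numbered row-major: 0, 1, 2 across the top row,
    3, 4, 5 across the middle, and 6, 7, 8 across the bottom.
    """
    col = min(int(3 * x / width), 2)    # clamp so x == width maps to column 2
    row = min(int(3 * y / height), 2)   # clamp so y == height maps to row 2
    return 3 * row + col
```

Each returned index would then be looked up in a command table (forward, left, right, stop, and so on); the binding of zones to commands is the interface designer's choice.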
IAEME PUBLICATION, 2020
According to the latest report prepared by the World Health Organization and the World Bank, 15 percent of the world's population is disabled [2]. The use of power-driven wheelchairs with high navigational intelligence is a great step toward integrating severely handicapped and mentally ill people. Different systems are being developed that allow the end user to perform safe movements and accomplish some important tasks of daily life [4]. The notion is to create an eye-monitored system that allows wheelchair navigation depending on the movement of the eyes. We have built a device in which a patient sitting in a wheelchair and looking directly at the camera can move in a direction simply by looking in that direction. Our robotic wheelchair uses an image-processing system: the user's eye movements are mapped to a screen position using a camera, without any direct contact. In addition, we give the disabled person more independence to communicate with devices in a room, for example a light or a fan. This communication is done using a MEMS accelerometer, with which the person can operate various devices easily.
2023 International Conference on Applied Intelligence and Sustainable Computing (ICAISC), 2023
This article proposes a system that aids people with disabilities: an electric eye-controlled wheelchair that lets disabled people move effortlessly without support from others. The system acquires an image of the eye and processes it to find the gaze direction using Haar-cascade and gaze-estimation algorithms, so the wheelchair moves according to the direction of eyeball movement. The gaze-estimation algorithm is precise enough that a single algorithm performs the task that would otherwise require two (Canny edge detection and the Hough transform). With this technique, a disabled person can steer the wheelchair with eye movement alone. A webcam placed in front of the person captures live movement, and an image-processing technique tracks the position of the pupil in both eyes with the help of a Raspberry Pi processor. The technique used here is gaze tracking with OpenCV, which follows pupil movement and reports its coordinates; according to the pupil's motion, the motor driver is instructed to go forward, left, or right. A blink instruction stops the wheelchair when the person blinks. For safety, a front-mounted ultrasonic sensor detects obstructions and automatically halts wheelchair movement. The system is controlled by a Raspberry Pi, which lowers the cost.
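A blink-to-stop command like the one described needs debouncing, since involuntary blinks close the eyes for only a frame or two. A minimal sketch of that logic, assuming a per-frame open/closed classification is available; the class name and frame threshold are hypothetical:

```python
class BlinkStop:
    """Issue a stop command after N consecutive closed-eye frames.

    Requiring several consecutive closed frames distinguishes a deliberate
    stop blink from an ordinary involuntary blink. The threshold value is
    an assumption, not taken from the paper.
    """

    def __init__(self, frames_required=5):
        self.frames_required = frames_required
        self.closed_frames = 0

    def update(self, eye_closed):
        """Feed one frame's open/closed state; return True when the chair should stop."""
        if eye_closed:
            self.closed_frames += 1
        else:
            self.closed_frames = 0   # any open frame resets the counter
        return self.closed_frames >= self.frames_required
```

At a typical 30 fps webcam rate, five frames corresponds to roughly 170 ms of sustained closure, comfortably longer than most involuntary blinks.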
2018
People suffering from quadriplegia are unable to use either their hands or their legs and are therefore dependent on others to move them around, which results in a loss of self-confidence. The only movements they can achieve are of their heads and therefore their eyes. This paper leverages this eye movement and implements a method to track it in order to control a wheelchair automatically. A vision-based system is used in which the laptop's webcam acquires images of the patient. The patient's eyes are detected with the Viola-Jones algorithm; in MATLAB, these images undergo various morphological processes, and further analysis tracks eye movements to determine the direction in which the wheelchair should move. The resulting signals are sent to an Arduino, which forwards them to the DC motors via the L293D IC.
2015
Statistics suggest that there are around 40 new cases of quadriplegia per million people every year. Great people such as Stephen Hawking have suffered from this condition. Our project attempts to simplify the lives of people with quadriplegia by helping them move around on their own rather than being a burden on others. The idea is to create an eye-controlled system that moves the patient's wheelchair depending on the movements of the eyeball. A person with quadriplegia can move the eyes and partially tilt the head, giving us an opportunity to detect these movements. Various interfaces have been developed for powered wheelchairs, and new techniques continue to be invented, but these are costly and unaffordable for poor and needy people. In this paper, we propose a simpler and more cost-effective method of developing such a wheelchair. We have created a system wherein a person sitting on this automated wheelchair with a camera ...
V Congresso Brasileiro de Eletromiografia e Cinesiologia e o X Simpósio de Engenharia Biomédica (COBEC-SEB 2017), 2017
This work presents an intuitive and customizable assistive technology based on eye gaze, integrating a previously developed multimodal assistive domotics system into the UFES robotic wheelchair. Users with motor disabilities can control home devices, communicate with family or caregivers through short phrases, and navigate the wheelchair by means of eye gaze. The interface is easy to use: a computer and screen monitor on board the wheelchair display options for the user, and the selection is made using an eye tracker. Experimental results with volunteers showed good system usability. The main goal of the system is to improve users' quality of life by providing augmentative and alternative communication, mobility assistance, and enhanced activities of daily living.
Multimodal Technologies and Interaction, 2021
This paper presents a practical human-computer interaction system for wheelchair motion based on eye tracking and eye-blink detection. In this system, the pupil is extracted from the eye image after binarization, and its center is localized to capture the trajectory of eye movement and determine the direction of gaze. In parallel, convolutional neural networks for feature extraction and classification of open-eye and closed-eye images are built, trained on features extracted from many individual images of both states. As an application of this human-computer interaction control system, experimental validation was carried out on a modified wheelchair, and the experimental results show the proposed method to be effective and reliable.
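The binarize-then-localize step described above can be sketched in a few lines: treat the darkest pixels as the pupil and take the centroid of that region. This is an illustrative simplification under the assumption that the pupil is the darkest blob in the crop; the function name and threshold are hypothetical, and a real system would add noise filtering.

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Locate the pupil centre in a grayscale eye image.

    Pixels darker than `threshold` are treated as pupil after binarization,
    and the centroid of that region is returned as (x, y) in pixel
    coordinates, or None if no pixel is dark enough.
    """
    mask = gray < threshold          # binarization: keep only dark pixels
    ys, xs = np.nonzero(mask)        # coordinates of the pupil region
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

Tracking the returned centre across frames gives the eye-movement trajectory; the gaze direction is then read off from the centre's displacement relative to a calibrated rest position.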