Proceedings of the 4th International Conference on Agents and Artificial Intelligence, 2012
The Intelligent Wheelchair (IW) is a new concept aimed at giving greater autonomy to people with reduced mobility, such as disabled or elderly individuals. Some of the more recent IWs have a multimodal interface, enabling multiple command modes such as joystick, voice commands, head movements, or even facial expressions. In these IWs it can be very useful to offer each user the best way of driving through an adaptive interface. This paper describes the foundations of a simple methodology for extracting user profiles, which can be used to select the most suitable IW command mode for each user. The methodology is based on an interactive wizard composed of a flexible set of simple tasks presented to the user, together with a method for extracting and analyzing the user's execution of those tasks. The results show that simple user profiles can be extracted with the proposed method. The approach may therefore be extended to richer user profiles simply by enlarging the set of tasks, enabling the adaptation of the IW interface to each user's characteristics.
Journal of Intelligent & Robotic Systems, 2015
Intelligent wheelchairs (IWs) are technologies that can increase the autonomy and independence of elderly people and of patients with some kind of disability. Intelligent wheelchairs and human-machine interaction are currently very active research areas. This paper presents a methodology and a Data Analysis System (DAS) that provide a command language adapted to the user of an IW. This command language is a set of input sequences, created from the inputs of a single device or from a combination of the inputs available in a multimodal interface. With a sample of 11 users with cerebral palsy, the results show statistical evidence that the mean evaluation of the DAS-generated command language is higher than that of the command language recommended by the health specialist (p-value = 0.002). This work demonstrates that it is possible to adapt an intelligent wheelchair interface to the user even when users present heterogeneous and severe physical constraints.
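The abstract's notion of a command language — input sequences bound to wheelchair commands — can be pictured as a simple lookup table. This is an illustrative sketch only: the sequence and command names below are invented, not taken from the paper or the DAS.

```python
# Hypothetical command language: each key is a sequence of raw inputs
# (from one device, or a multimodal mix) mapped to a wheelchair command.
COMMAND_LANGUAGE = {
    ("blink", "blink"): "stop",
    ("head_left",): "turn_left",
    ("head_right",): "turn_right",
    ("puff", "head_forward"): "go_forward",
}

def interpret(sequence):
    """Translate an input sequence into a wheelchair command, if defined."""
    return COMMAND_LANGUAGE.get(tuple(sequence), "no_op")

print(interpret(["blink", "blink"]))        # stop
print(interpret(["puff", "head_forward"]))  # go_forward
```

An adapted command language, in this picture, is simply a different table tuned to the inputs a particular user can produce reliably.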
IRJET, 2022
As the population of elderly and disabled people grows, so does the demand for care and support equipment to enhance their quality of life. For the past 20 years, the most popular mobility aid for people with limited mobility has been the electric-powered wheelchair (EPW); more recently, the intelligent EPW, also known as an intelligent wheelchair (IW), has attracted significant attention as a technology able to meet users' varied needs. Elderly and disabled people face many difficulties in performing the simplest day-to-day tasks, and many rely on others or on conventional aids such as wheelchairs. With modern technology and the advent of voice-enabled applications and devices, we can build tools that help them interact with society and ease their mobility during everyday activities. A major problem they face is reaching the wheelchair itself; to address this, we propose a mobile application that lets the user locate the wheelchair and navigate it towards themselves whenever needed. The primary goal of the interactive user-operated wheelchair system is to provide a user-friendly interface by employing two ways of interacting with the wheelchair: entering the desired direction through a touch screen (haptic input) and giving voice commands through a speech-recognition module.
2005
We describe the development and assessment of a computer-controlled wheelchair called the SMARTCHAIR. A shared-control framework with different levels of autonomy allows the human operator to stay in complete control of the chair at each level while ensuring her safety. The framework incorporates deliberative motion plans or controllers, reactive behaviors, and human user inputs. At every instant, control inputs from these three sources are blended continuously to provide a safe trajectory to the destination, while allowing the human to maintain control and safely override the autonomous behavior. In this paper, we present usability experiments with 50 participants and demonstrate quantitatively the benefits of human-robot augmentation.
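The continuous blending of the three control sources can be illustrated as a convex combination of velocity commands. This is a minimal sketch under assumed fixed weights, not the SMARTCHAIR's actual blending law.

```python
def blend(planner, reactive, human, w_p=0.3, w_r=0.3, w_h=0.4):
    """Blend (linear, angular) velocity pairs from a deliberative planner,
    a reactive behavior, and the human user into one command.
    The weights are illustrative; giving the human the largest weight is
    one way to let her override the autonomous inputs."""
    total = w_p + w_r + w_h
    lin = (w_p * planner[0] + w_r * reactive[0] + w_h * human[0]) / total
    ang = (w_p * planner[1] + w_r * reactive[1] + w_h * human[1]) / total
    return lin, ang

# Planner drives forward, reactive behavior steers around an obstacle,
# human pushes straight ahead; the result is a compromise of all three.
v = blend((0.5, 0.0), (0.2, 0.4), (1.0, 0.0))
```

In a real shared-control system the weights would vary with context (e.g. proximity to obstacles) rather than stay fixed.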
This work presents the development of a robotic wheelchair that can be commanded by users in a supervised way or by a fully automatic, unsupervised navigation system. It offers the flexibility to choose different command modalities, in addition to being suitable for people with different levels of disability. Users can command the wheelchair with eye blinks, eye movements, head movements, sip-and-puff, or brain signals. The wheelchair can also operate like an auto-guided vehicle, following metallic tapes, or navigate autonomously. The system provides an easy-to-use, flexible graphical user interface running on a personal digital assistant, through which users choose the commands to be sent to the robotic wheelchair. Several experiments were carried out with people with disabilities, and the results validate the developed system as an assistive tool for people with distinct levels of disability.
2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2013
In this paper, a method to perform semi-autonomous navigation on a wheelchair is presented. The wheelchair can be controlled either in semi-autonomous mode, which estimates the user's intention with a face-pose recognition system, or in manual mode; a speech interface is used to switch between the two modes. The intention estimator is implemented as a Bayesian network, with the user's intention modeled as a set of typical destinations visited by the user. The algorithm was implemented on an experimental wheelchair robot. One of the main contributions is the more natural, easy-to-use human-machine interface: user habits and points of interest are employed to infer the user's desired destination on a map. Erroneous steering signals coming from the user-machine interface are filtered out, improving the overall performance of the system. Human-aware navigation, path planning, and obstacle avoidance are performed by the robotic wheelchair, while the user is only concerned with "looking where he wants to go".
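The core of such an intention estimator is a Bayes-rule update over the set of typical destinations. The sketch below shows a single update step; the destinations, priors, and likelihoods are invented for the example and are not taken from the paper.

```python
def posterior(prior, likelihood):
    """Bayes rule over a discrete set of destinations.
    prior:      {destination: P(d)}   -- e.g. learned from user habits
    likelihood: {destination: P(obs | d)} -- e.g. from face-pose recognition
    Returns the normalized posterior {destination: P(d | obs)}."""
    unnorm = {d: prior[d] * likelihood[d] for d in prior}
    z = sum(unnorm.values())
    return {d: p / z for d, p in unnorm.items()}

prior = {"kitchen": 0.5, "bedroom": 0.3, "door": 0.2}   # habit-based prior
obs   = {"kitchen": 0.7, "bedroom": 0.2, "door": 0.1}   # P(pose | destination)
post  = posterior(prior, obs)                            # kitchen dominates
```

Repeating this update as pose observations stream in is what lets the system commit to a destination while filtering out spurious steering signals.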
Assistive Technology, 2013
This paper evaluates the usability of an Intelligent Wheelchair (IW) in both real and simulated environments. The wheelchair is controlled at a high level by a flexible multimodal interface, using voice commands, facial expressions, head movements, and a joystick as its main inputs. A quasi-experimental design was applied, including a deterministic sample with a questionnaire based on the System Usability Scale. The subjects were divided into two independent samples: 46 individuals performed the experiment with an Intelligent Wheelchair in a simulated environment (28 using the different commands in a fixed sequence and 18 free to choose the command), and 12 individuals performed the experiment with a real IW. The main conclusion of this study is that the usability of the Intelligent Wheelchair is higher in the real environment than in the simulated one. However, there was no statistical evidence of differences between the real and simulated wheelchairs in terms of safety and control. Moreover, most users considered the multimodal way of driving the wheelchair very practical and satisfactory. It may thus be concluded that the multimodal interface enables very easy and safe control of the IW in both simulated and real environments.
ACM Sigcaph Computers and The Physically Handicapped, 1999
TetraNauta is an ongoing R&D project aimed at developing a controller for standard electric-powered wheelchairs that allows users with very severe mobility restrictions (such as people with tetraplegia) to navigate easily in closed environments (home, hospital, school, etc.). The project intends to design an inexpensive guidance system that helps these users drive the wheelchair with minimum effort while keeping them as active as possible. For this reason, the design of the user interface is a key factor. Some characteristics of this interface can serve as a workbench for the design of more complex, security-critical mobile systems.
Lecture Notes in Computer Science, 2002
Smart wheelchairs are designed for severely motor-impaired people who have difficulty driving standard (manual or electric-powered) wheelchairs. Their goal is to automate driving tasks as much as possible in order to minimize user intervention. Nevertheless, human involvement is still necessary to maintain high-level task control. The interface design must therefore take into account the restrictions imposed by the system (mobile and small), by the type of users (people with severe motor restrictions), and by the task (selecting a destination among a number of choices in a structured environment). This paper describes the structure of a context-driven adaptive mobile interface for smart wheelchairs.
Lecture Notes in Computer Science, 2010
With the rising concern about the needs of people with physical disabilities and the aging of the population, there is major interest in creating electronic devices that improve the lives of physically handicapped and elderly people. One of these new solutions involves adapting electric wheelchairs to give them environmental perception, more intelligent capabilities, and more adequate human-machine interaction. This paper focuses on the development of a user-friendly multimodal interface, integrated into the Intellwheels project. This simple multimodal human-robot interface allows the connection of several input modules, enabling control of the wheelchair through flexible input sequences of distinct types of inputs (voice, facial expressions, head movements, keyboard, and joystick). The system can store user-defined associations between input sequences and their corresponding output commands. The tests performed demonstrated the efficiency of the system and the capabilities of this multimodal interface.
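The user-defined associations between input sequences and output commands described above might be kept in a small run-time registry, so bindings can be created per user rather than fixed at design time. The class, method names, and bindings below are assumptions for illustration, not Intellwheels code.

```python
class MultimodalMapper:
    """Hypothetical registry binding multimodal input sequences to commands."""

    def __init__(self):
        self.bindings = {}

    def associate(self, sequence, command):
        """Bind an input sequence (any mix of modalities) to an output command."""
        self.bindings[tuple(sequence)] = command

    def resolve(self, sequence):
        """Return the bound command, or None if the sequence is undefined."""
        return self.bindings.get(tuple(sequence))

# Each user defines the sequences they can produce reliably.
m = MultimodalMapper()
m.associate(["voice:go", "joystick:forward"], "move_forward")
m.associate(["face:smile"], "stop")
```

Because unknown sequences resolve to None rather than a motion command, accidental or unrecognized inputs leave the wheelchair unaffected.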