International Journal of Materials, Mechanics and Manufacturing
He is also a member of Bahrain's National Higher Education Skills-Innovation Steering Committee. In 1986 he received a B.Sc. in electrical engineering from the University of Bahrain, followed by an M.Sc. in electronics from the University of Southampton in 1988, and in 1994 he was awarded a Ph.D. in cybernetics and robotics control from the University of Reading under the supervision of Prof. K. Warwick, professor of cybernetics. He has worked on 13 research projects, including the King Saud University Robotics Project, KSA. He has supervised a number of Ph.D., M.Sc., and undergraduate students, and is currently working on brainwave decoding and learning for robotic hand control. He has received numerous awards, including
An EEG-based brain-computer interface (BCI) can be classified as a new communication channel between humans and computers that relies solely on biological signals, avoiding the use of muscular activity when executing the applications involved. Many available technologies and interfaces facilitate reading the bio-electrical signals from the human brain and associating them with explicit commands for controlling various devices. In this work, a technology-based application is developed as an engineering solution, in the form of a conceptual framework, to enhance the remote-controlled communication of a robot through brain (EEG) signals produced by end users.
2017
Paralysis is among the major neural disorders, causing loss of motion in one or more muscles of the body; depending on the cause, it may affect a specific muscle group or region of the body, or a larger area may be involved. In pursuit of rehabilitation, the eye can be regarded as one organ that can help a paralyzed person communicate. The brain signals of such patients can be used to help them communicate with others and to perform various tasks, given the necessary infrastructure and training. This project describes the acquisition and analysis of brain signals for operating a robot with a robotic arm mounted on top of it. The proposed method uses a minimum number of electrodes to obtain brain signals via commercially available EEG headsets and then controls a robot based on the levels of these signals, which vary with the user's state of mind. The EEG headset detects the signals and generates a discrete value, which is sent over Bluetooth to a PC/laptop for further processing and plotting in MATLAB. After processing, the actions to be performed are sent over ZigBee to the ARM microcontroller that controls both the robot and the robotic arm mounted on it.
This paper reviews the latest trends in brainwave technology and discusses how a brainwave-controlled mechanism works based on brain-computer interfaces (BCIs). BCIs are systems that can bypass conventional channels of communication (i.e., muscles and speech) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a mechanism that can assist disabled individuals in their everyday lives to do some work independently of others. Here, we analyze the brainwave signals. The human brain consists of many interconnected neurons.
IOP Conference Series: Materials Science and Engineering, 2019
The purpose of this study is to discuss a brainwave system that can move a prosthetic arm based on brainwave activity. EEG brainwave activity is detected with a NeuroSky MindWave mobile sensor, and movement and detection of the brainwave signals are carried out in a LabVIEW application program. The robot is designed to make movements based on brainwave activity, utilizing eye blinks and attention. There are two modes: the first mode selects movements with a blink of an eye, and the second mode uses attention to move the prosthetic arm. Based on the research results, the prosthetic arm can perform the designed movements: extension, flexion, supination or pronation, and elevation or depression. The prosthetic arm executes movements based on the subject's commands by utilizing brainwave activity, with a response time of 9.54 seconds to perform all the movements. In addition, the artificial arm can grasp objects with diameters from 2.2 cm to 6 cm, with an average success rate of 86.67% over six experiments.
arXiv, 2022
This paper presents Open-source software and a developed shield board for the Raspberry Pi family of single-board computers that can be used to read EEG signals. We have described the mechanism for reading EEG signals and decomposing them into a Fourier series and provided examples of controlling LEDs and a toy robot by blinking. Finally, we discussed the prospects of the brain-computer interface for the near future and considered various methods for controlling external mechanical objects using real-time EEG signals.
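The Fourier decomposition mentioned above is the standard first step in EEG processing. The following is a minimal sketch, not the paper's software: it splits a raw EEG window into the conventional delta/theta/alpha/beta bands via an FFT; the function name, band edges, and the synthetic 10 Hz test signal are illustrative assumptions.

```python
import numpy as np

def band_powers(eeg, fs=256.0):
    """Decompose a raw EEG window into average power per frequency band."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)   # bin frequencies in Hz
    power = np.abs(np.fft.rfft(eeg)) ** 2           # power spectrum
    bands = {"delta": (0.5, 4), "theta": (4, 8),
             "alpha": (8, 13), "beta": (13, 30)}
    return {name: power[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# A pure 10 Hz sine lies in the alpha range, so the alpha band dominates.
t = np.arange(0, 2, 1 / 256.0)
powers = band_powers(np.sin(2 * np.pi * 10 * t))
```

In a real pipeline the window would be the most recent block of headset samples, and a blink would show up as a large low-frequency transient on top of these bands.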
This brain-controlled robot is based on brain-computer interfaces (BCIs). BCIs are systems that can bypass conventional channels of communication (i.e., muscles and speech) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled people in their daily lives to do some work independently of others.
Advances in Human-Computer Interaction, 2013
Humans have traditionally interacted with computers or machines by using their hands to manipulate computer components. This kind of human-computer interaction (HCI), however, considerably limits the human's freedom to communicate with machines. Over the years, many attempts have been made to develop technologies that include other modalities used for communication, for example, speech or gestures, to make HCI more intuitive. Recent advances in cognitive neuroscience and neuroimaging technologies in particular have allowed for the establishment of direct communication between the human brain and machines. This ability is made possible through invasive and noninvasive sensors that can monitor physiological processes reflected in brain waves, which are translated online into control signals for external devices or machines. Such brain-computer interfaces (BCIs) provide a direct communication method to convey brain messages to an external device independent from the brain's motor output. They are often directed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. In BCIs, users explicitly manipulate their brain activity instead of using motor movements in order to produce brain waves that can be used to control computers or machines. The development of efficient BCIs and their implementation in hybrid systems that combine well-established methods in HCI and brain control will not only transform the way we perform everyday tasks, but also improve the quality of life for individuals with physical disabilities. This is particularly important for those who suffer from devastating neuromuscular injuries and neurodegenerative diseases which may lead to paralysis and the inability to communicate through speech or gesture.
In a BCI system, brain activity is usually recorded using a noninvasive neuroimaging technology, such as electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), or near-infrared spectroscopy (NIRS). In some cases, invasive technologies are used, such as electrocorticography (ECoG). In the majority of BCI systems, scalp EEG data are recorded, with the type of BCI system categorized based on the measure of brain activity used for BCI control. Each system has its own shortcomings. For instance, the information transfer rates of currently available noninvasive BCI systems are still very limited and do not allow for versatile control of and interaction with assistive machines.
In this paper, feature extraction methods in the time, frequency, and spatial domains were investigated. The time-domain features used are Mean Absolute Value (MAV), Integrated Absolute Value (IAV), Zero Crossing (ZC), Root Mean Square (RMS), Waveform Length (WL), and Slope Sign Change (SSC).
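The listed time-domain features are all simple functions of the raw samples. A minimal sketch of their textbook definitions follows; the function name and the synthetic alternating test signal are illustrative, not taken from the paper.

```python
import numpy as np

def time_domain_features(x):
    """Textbook time-domain features for EEG/EMG pattern recognition."""
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)  # first difference, used by WL and SSC
    return {
        "MAV": np.mean(np.abs(x)),               # Mean Absolute Value
        "IAV": np.sum(np.abs(x)),                # Integrated Absolute Value
        "RMS": np.sqrt(np.mean(x ** 2)),         # Root Mean Square
        "WL":  np.sum(np.abs(dx)),               # Waveform Length
        "ZC":  int(np.sum(x[:-1] * x[1:] < 0)),  # Zero Crossings
        "SSC": int(np.sum(dx[:-1] * dx[1:] < 0)) # Slope Sign Changes
    }

# An alternating signal crosses zero between every pair of samples.
feats = time_domain_features([1.0, -1.0, 1.0, -1.0, 1.0])
```

Practical implementations often add a small amplitude threshold to ZC and SSC so that sensor noise around zero does not inflate the counts.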
2020
Students and Professor, Dept. of Computer Engineering, Terna Engineering College, Navi Mumbai, India. Abstract: The field of prosthetics has shown significant improvement over the last few years due to advances in technology. However, existing prostheses have certain problems: they are either very expensive, do not provide full motor function, may require a surgical approach, or do not look like an arm. This project describes how brain waves can be used to control a prosthetic arm using a brain-computer interface (BCI). The BCI system consists of electroencephalogram (EEG) sensors placed on a headset to capture the brain waves, which are extracted using the ThinkGear library in MATLAB. The brain signals act as command signals and are transmitted to a microcontroller. This comma...
International Journal for Research in Applied Science and Engineering Technology IJRASET, 2020
There are approximately 21 million disabled people in India, equivalent to 2.2% of the total population. These individuals are affected by numerous neuromuscular disorders. To enable them to express themselves, one can supply them with alternative and augmentative communication; for this, a brain-computer interface (BCI) system has been put together to address this particular need. The project reports the design, building, and testing of an imitation of a human arm that is designed to be dynamically and kinematically accurate. The device tries to resemble the motion of the biological human hand by analyzing the signals produced by brain waves. The brain waves are sensed by sensors in the NeuroSky headset, which generates alpha, beta, and gamma signals. This signal is then analyzed by the microcontroller and passed on to the synthetic hand via servo motors. A patient with a below-elbow amputation can benefit from this bio-robotic arm.
Electrooculography is a technique for measuring the resting potential of the retina; the resulting signal is called the electrooculogram. Bio-potential signals are one example of human-machine interfaces that use nonverbal information, such as electrooculography (EOG), electromyography (EMG), and electroencephalography (EEG) signals. The EOG and EMG signals reflect physiological changes; here we focus mainly on EOG signals for the human-machine interface. This paper investigates how different EOG signals obtained from four different places around the eye (right, left, up, and down) lead to different levels of distance and rotation of a wheelchair. These four signals correspond to different levels of right and left steering and forward and backward motion. Much research has concentrated on making use of eye-movement signals for tetraplegia, despite all the complexity that arises when analyzing them. In this case, the constraints assume that the eye movements are limited to straight-to-up, straight-to-down, straight-to-right, and straight-to-left; other eye-movement patterns are not addressed.
Keywords: brain-computer interface, electrooculogram, electrodes, robotic prototype model.
2013
The human brain works mainly through electrical signals transmitted throughout the body to carry the information needed to operate the body's parts. Even rotating the eyeball increases or decreases the electrical resistance near the eye area. This variation in electrical signals can be measured using electrodes or myoelectric sensors, and by feeding these signals to a processor we can interface with and control different devices on demand. Hence, the proposed system is designed to control computer and hardware systems using the brain's electrical signals. The proposed system detects variations in signal strength through the voltage level near the eye area and generates wireless radio-frequency signals to control a robotic prototype model. This system could be further extended to bio-enabled prosthetic body parts controlled through brain waves.
This paper describes a brain-controlled robot based on brain-computer interfaces (BCIs). BCIs are systems that can bypass conventional channels of communication (i.e., muscles and speech) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled; here, the robot also avoids obstacles on its own using an ultrasonic sensor. The intention of the project is to develop a robot that can assist disabled people in their daily lives to do some work independently of others. Here, we analyse the brainwave signals. The human brain consists of millions of interconnected neurons.
This project describes a brain-controlled robot based on brain-computer interfaces (BCIs). BCIs are systems that can bypass conventional channels of communication (i.e., muscles and speech) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled people in their daily lives to do some work independently of others. Here, we analyze the brainwave signals. The human brain consists of millions of interconnected neurons, and the pattern of interaction between these neurons is reflected in thoughts and emotional states. This pattern changes with the user's thoughts, producing different electrical waves; muscle contraction also generates unique electrical signals. All these electrical waves are sensed by the brainwave sensor, which converts the data into packets and transmits them over Bluetooth. The raw brainwave data are sent to the computer, which extracts and processes the signal on the MATLAB platform. The control commands are then transmitted to the robot module. With this system, we can move a robot based on human thoughts and turn it with an eye-blink muscle contraction.
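The mapping from sensed brainwave levels to robot motion described above can be sketched as a simple threshold rule. This is an illustrative reconstruction, not the project's code: the attention value (0-100, the range reported by NeuroSky-style headsets) drives forward motion, a strong blink triggers a turn, and the specific threshold values are assumptions.

```python
def command_from_signals(attention, blink_strength,
                         move_threshold=60, blink_threshold=55):
    """Map an attention level (0-100) and a blink-strength value to a
    robot command: a strong blink turns the robot, sustained attention
    drives it forward, and anything else stops it.
    Thresholds are illustrative, not taken from the paper."""
    if blink_strength >= blink_threshold:
        return "TURN"
    if attention >= move_threshold:
        return "FORWARD"
    return "STOP"

# Example: high attention with no blink drives the robot forward.
cmd = command_from_signals(attention=80, blink_strength=0)
```

In the described system this decision would run once per received Bluetooth packet, with the chosen command string forwarded to the robot module.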
Kalpa Publications in Engineering
This research paper presents the development of a bio-signal acquisition system and rehabilitation technique based on a cognitive-science application: a robot controlled by brain signals. We develop a data acquisition system for acquiring EEG signals from a BrainSense headband, design a new algorithm for detecting the attention and meditation waves, and implement it on a robotics platform using an embedded core.
This paper presents a proof-of-concept study on controlling an educational neuroprosthetic robotic hand using brain electrical signals. A slightly modified version of our previous brain-machine interface (BMI) was linked to a recently designed cost-effective robotic hand by sending real-time commands to the robot and simultaneously to a simulated hand in a virtual reality environment. The system was validated experimentally using electroencephalography (EEG) signals. Signals were recorded from seven positions over the motor cortex while the subjects performed cue-based imagination of hand grasp and relaxation and received biofeedback. We improved the signal processing algorithms to extract the information needed to classify the control command in a simulated study. A single-classifier technique (all features + one classifier) and an ensemble technique (one classifier per channel + voting) were compared using the Matthews correlation coefficient (MCC) in addition to the popular classification accuracy. The primary results show that the simulated online accuracy was significantly higher for the ensemble SVM (71.24%) than for the simple SVM (66.56%), t(9) = 4.78, p < 0.001, over all subjects.
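The ensemble scheme compared above (one classifier per channel, majority voting across channels, scored with MCC) can be illustrated on synthetic data. In this sketch a nearest-class-mean classifier stands in for the paper's per-channel SVMs to keep the example dependency-free; the data, effect size, and train/test split are invented for illustration.

```python
import numpy as np

def mcc(y_true, y_pred):
    """Matthews correlation coefficient for binary labels in {0, 1}."""
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return (tp * tn - fp * fn) / denom if denom else 0.0

rng = np.random.default_rng(0)
n_trials, n_channels, n_feats = 200, 7, 4

# Synthetic stand-in for per-channel EEG features (grasp vs. relax):
# class 1 trials are shifted by 0.8 sigma on every feature.
y = rng.integers(0, 2, n_trials)
X = rng.normal(size=(n_trials, n_channels, n_feats)) + y[:, None, None] * 0.8

train, test = np.arange(150), np.arange(150, 200)

# One nearest-class-mean classifier per channel, then majority vote.
votes = np.zeros((len(test), n_channels), dtype=int)
for ch in range(n_channels):
    mu0 = X[train][y[train] == 0, ch].mean(axis=0)
    mu1 = X[train][y[train] == 1, ch].mean(axis=0)
    d0 = np.linalg.norm(X[test, ch] - mu0, axis=1)
    d1 = np.linalg.norm(X[test, ch] - mu1, axis=1)
    votes[:, ch] = (d1 < d0).astype(int)
pred = (votes.sum(axis=1) > n_channels / 2).astype(int)

acc = (pred == y[test]).mean()
score = mcc(y[test], pred)
```

MCC is preferred alongside accuracy here because it remains informative when the two classes are imbalanced, which plain accuracy hides.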
Journal of Software, 2009
The brain-computer interface (BCI) has added new value to efforts in human-machine interfaces. It has not only introduced new dimensions in machine control, but researchers around the globe are still exploring the possible uses of such applications. BCIs offer hope that alternative communication channels can be created for people with severe motor disabilities. This work utilizes the brain signals of a human being, recorded via scalp electroencephalography (EEG), to control a robot's navigation, which can be visualized as controlling one's surrounding environment without physical strain: when a person thinks of a motor activity, it gets performed. The procedure includes acquisition and analysis of brain signals via EEG equipment, development of a classification system using AI techniques, and propagation of the resulting control signals to a Lego robot via the parallel port. This is depicted in [1] as a generic block diagram.
It should be noted that eye movements and breathing may cause considerable artifacts in slow potentials, while muscular tension in the face and neck can generate artifacts at higher frequencies. Also, electroencephalography rhythms have response latencies of about 0.5 seconds, whereas other electroencephalography components, e.g., slow potentials and event-related potentials such as the P300 and the steady-state visually evoked potential, have response latencies of two or more seconds. • Space applications: the development of non-invasive brain-controlled robotic devices is most relevant for space applications, where the environment is inherently hostile and dangerous for astronauts, who could greatly benefit from direct mental teleoperation of external semi-automatic manipulators. Furthermore, robotic aids would be highly useful to astronauts weakened by long stays in microgravity. Electroencephalography signals recorded in a microgravity environment may differ from those recorded in normal gravity for the same person. As a result, the feasibility of brain-computer interfaces for space applications should be tested in microgravity conditions and, depending on the results, new solutions better suited to space environments should be explored.
2010
The brain-machine interface is a technology that allows people to control devices using only the bioelectrical signals of the brain. The challenge has been around since 1973, and the first experimental proof of the technology's feasibility was given in 1988; however, real worldwide interest emerged in the 21st century. Currently, research laboratories and companies around the world offer research and products in the area. The technology allows various states of the human brain to be recognized through brain-signal processing. Applications so far have included cursor movement, a hands-free typewriter, wheelchair (robot) movement, and robot-arm (prosthesis) movement, among others. Here, an investigation is reported in which a brain-machine interface based on anticipatory brain potentials is used to control a robotic arm.