This paper describes a brain-controlled robot based on brain-computer interfaces (BCIs). BCIs are systems that bypass conventional channels of communication (i.e., peripheral nerves and muscles) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. Here the robot also navigates on its own using an ultrasonic sensor. The intention of the project is to develop a robot that can assist disabled people in their daily lives, allowing them to do some work independently of others. Here we analyse the brain-wave signals. The human brain consists of millions of interconnected neurons.
International Journal of Computer Applications, 2014
This paper describes a mind-controlled robot based on a brain-computer interface (BCI), using LabVIEW to analyze the brain waves. BCIs are systems that bypass conventional channels of communication (i.e., peripheral nerves and muscles) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled people in their everyday lives, allowing them to do some work independently of others. Here the authors analyze the brain-wave signals. The human brain consists of millions of interconnected neurons. The patterns of interaction between these neurons are expressed as thoughts and emotional states. As a person's thoughts change, this pattern changes, which in turn produces different electrical waves [1].
This paper reviews the latest trends in brain-wave technology and discusses how a brain-wave-controlled robot works on the basis of brain-computer interfaces (BCIs). BCIs are systems that bypass conventional channels of communication (i.e., peripheral nerves and muscles) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled individuals in their everyday lives, allowing them to do some work independently of others. Here we analyze the brain-wave signals. The human brain consists of millions of interconnected neurons.
This project describes a brain-controlled robot based on brain-computer interfaces (BCIs). BCIs are systems that bypass conventional channels of communication (i.e., peripheral nerves and muscles) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled people in their daily lives, allowing them to do some work independently of others. Here we analyze the brain-wave signals. The human brain consists of millions of interconnected neurons. The pattern of interaction between these neurons is expressed as thoughts and emotional states. As a person's thoughts change, this pattern changes, which in turn produces different electrical waves. Muscle contraction also generates unique electrical signals. The brain-wave sensor senses all of these electrical waves, converts the data into packets, and transmits them over Bluetooth. The raw brain-wave data are sent to a computer, which extracts and processes the signal on the MATLAB platform. The control commands are then transmitted to the robot module for execution. With this system, a robot can be driven by human thoughts and turned by eye-blink muscle contractions.
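The control flow this abstract describes — sustained attention drives the robot forward, a deliberate eye blink turns it — can be sketched as a simple decision rule. The thresholds, value scales, and command names below are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the described control logic: map an attention level and
# an eye-blink strength (as reported by a consumer brain-wave sensor) to
# robot commands. Thresholds and command names are assumed for illustration.

ATTENTION_THRESHOLD = 60   # 0-100 scale used by common headsets (assumed)
BLINK_THRESHOLD = 50       # blink strength needed to register a turn (assumed)

def command_from_signals(attention, blink_strength):
    """Return a drive command from one sample of processed brain data."""
    if blink_strength >= BLINK_THRESHOLD:
        return "TURN"        # a deliberate blink steers the robot
    if attention >= ATTENTION_THRESHOLD:
        return "FORWARD"     # sustained focus drives the robot forward
    return "STOP"            # otherwise hold position

if __name__ == "__main__":
    print(command_from_signals(attention=75, blink_strength=10))  # FORWARD
    print(command_from_signals(attention=75, blink_strength=80))  # TURN
    print(command_from_signals(attention=30, blink_strength=10))  # STOP
```

In a full system, a command string like this would then be serialized over Bluetooth to the robot module; the paper's actual packet format is not specified.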
An EEG-based brain-computer interface (BCI) can be regarded as a new communication channel between humans and computers that relies solely on biological signals, avoiding the use of muscular activity to execute the applications involved. Many technologies and interfaces are available that facilitate reading the bio-electrical signals from the human brain associated with explicit commands for controlling various devices. In this work, an application is developed that provides an engineering solution in the form of a conceptual framework for enhancing remote-controlled communication with a robot through brain (EEG) signals produced by end users.
This brain-controlled robot is based on brain-computer interfaces (BCIs). BCIs are systems that bypass conventional channels of communication (i.e., peripheral nerves and muscles) to provide direct communication and control between the human brain and physical devices by translating different patterns of brain activity into commands in real time. With these commands a mobile robot can be controlled. The intention of the project is to develop a robot that can assist disabled people in their daily lives, allowing them to do some work independently of others.
Journal of Software
Nowadays UX design has moved to a new level, with new modes of interaction such as finger and hand movement. Technology also offers a thought-driven approach through the so-called brain-computer interface (BCI). This possibility opens new challenges for users as well as for designers and researchers. Devices for intercepting brain signals, such as the Emotiv Epoc and NeuroSky headsets, have existed for more than 15 years. Yet reliably translating user commands to an application remains a challenge, with little leap in advancement over that lifetime, and it is still unsolved for modern scientists and software developers. Success requires the effective interaction of many adaptive components: the user's brain, which produces activity that encodes intent; the BCI system, which translates that activity into digital signals; and the computer algorithms that translate the brain signals into commands, on whose accuracy the whole system depends. To tackle this complex and monumental task, many teams are exploring a variety of signal-analysis techniques to improve the adaptation of the BCI system to the user. Publications that describe the methods, steps, and algorithms used to discern different commands, words, and signals are rare. This article describes one approach to the retrieval, analysis, and processing of the received signals. The data are the result of investigating the possibility of controlling an Arduino robot through brain signals received by a BCI.
A brain-computer interface (BCI), sometimes called a direct neural interface or a brain-machine interface, is a direct communication pathway between a human and an external device, and one of the most challenging concepts in the present-day scenario. Following up on last year's preliminary work on the SRM BCI, this project's main purpose is to drive a microcontroller with brain-wave patterns in order to trigger a specific course of action. Electroencephalography was chosen for measuring the brain waves, generating an EEG graph whose fast Fourier transform, presented as brain maps, gives us a reliable and effective way of controlling and understanding brain-wave generation patterns. The basic aim of this project is to trigger a specific course of action in a microcontroller circuit. Through experiments we found that simple activities such as blinking an eye or moving the legs or arms produce specific waves in the brain. Peaks in the waveform denote the activity and thereby trigger the course of action. In the long run, this triggering could be used in highly complicated circuits such as an advanced security system. It will also be highly beneficial for physically challenged subjects, who could move a wheelchair on their own just by blinking an eye or moving an arm or a leg. Since very limited existing systems are available, this project would be one of the pioneering works in the field of BCI (brain-computer interface), with a great social impact.
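The FFT-and-peak-triggering idea in this abstract can be sketched as follows: take a short EEG window, compute its spectrum, and fire an action when the peak magnitude crosses a threshold. The sampling rate, threshold, and synthetic test signals are assumptions for illustration, not values from the project.

```python
# Hedged sketch of FFT-based triggering: a spectral peak above a fixed
# magnitude fires the action. FS, THRESHOLD, and the demo signals are assumed.
import numpy as np

FS = 256          # samples per second (a typical EEG rate, assumed)
THRESHOLD = 5.0   # peak-magnitude trigger level (assumed)

def peak_frequency(window):
    """Return (dominant frequency in Hz, its magnitude) for one window."""
    spectrum = np.abs(np.fft.rfft(window)) / len(window)
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    k = int(np.argmax(spectrum[1:]) + 1)   # skip the DC bin
    return freqs[k], spectrum[k]

def should_trigger(window):
    """Fire the course of action when the spectral peak is strong enough."""
    _, magnitude = peak_frequency(window)
    return magnitude > THRESHOLD

if __name__ == "__main__":
    t = np.arange(FS) / FS
    blink_like = 20.0 * np.sin(2 * np.pi * 4 * t)   # strong low-frequency burst
    resting = 0.5 * np.sin(2 * np.pi * 10 * t)      # weak alpha-band tone
    print(should_trigger(blink_like), should_trigger(resting))
```

On a real microcontroller pipeline, `should_trigger` would gate the command sent to the circuit; the project's actual feature extraction from brain maps is not detailed in the abstract.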
A brain-computer interface (BCI) establishes a link between the human brain and external devices. BCIs measure brain activity to infer the user's intent and subsequently provide control signals to the supporting hardware. This technology has varied uses, ranging from assistive devices for disabled individuals to advanced simulator control. The main use of BCI is as an assistive technology for individuals suffering from loss of motor control caused by spinal cord injury, amyotrophic lateral sclerosis, or other conditions. BCIs take advantage of the brain's electrochemical signals. There are billions of neurons in the human brain, with trillions of interconnections known as synapses. These devices also make use of neuroplasticity, the brain's ability to change physically and functionally over time. The author discusses the basics of BCI in this paper and presents details regarding brain waves, the control centres of various organs in the brain, and invasive and non-invasive sensors. The paper also summarizes the research work going on in this area.
2021
The objective of this paper is to help patients achieve command-based movement of a wheelchair using electroencephalogram (EEG) signals. A wheelchair with a BCI system is developed to help people paralyzed below the neck, in whom the brain fails to interact with the external environment. A brain-controlled wheelchair provides mobility to locked-in patients safely and efficiently with the help of a BCI. In the proposed work, the EEG signals are detected from the brain through a connected headset. The patient decides on a movement and blinks his or her eyes accordingly. Once the decision is made, the eye blinks are detected and a signal corresponding to that particular direction is sent to the controller via Bluetooth. The received signals are analyzed and the wheelchair moves accordingly. The wheelchair prototype is constructed from DC motors fitted onto a platform using L-brackets, screws, and nuts. The microcontroller, Bluetooth module, and ultrasonic...
International Journal of Engineering Sciences & Research Technology, 2013
The brain is where we think and feel. When a particular part of the brain is active, the arteries supply more blood to that part. Medical scanners are used to observe these changes in blood flow in the brain, and with this technique we can actually study how the brain works. A single brain cell, a neuron, is a tiny building block of the brain. The brain has about 100 billion such neurons, each able to fire an electrical impulse that underlies our thoughts. These electrical signals can be captured and converted into actions to help disabled people deal with their troubles and obstacles in daily life. It is through the use of brain-computer interfaces that this can be achieved. This paper deals with how a human brain can control computational devices, a development that will help the robotics domain and allow disabled people to get timely assistance with minimal physical effort.
2017
Paralysis is one of the major neural disorders; it causes loss of motion in one or more muscles of the body and, depending on the cause, may affect a specific muscle group or region of the body or a larger area. In pursuit of rehabilitation, the eye can be regarded as one of the organs that can help a paralyzed person communicate. The brain signals of such patients can be used to help them communicate with others and also to perform various tasks, given the necessary infrastructure and training. This project describes the acquisition and analysis of brain signals for operating a robot with a robotic arm mounted on top of it. The proposed method uses a minimum number of electrodes to obtain the brain signals with commercially available EEG headsets, and then controls a robot based on the levels of these signals, which vary with the user's state of mind. The EEG headset detects the signals and generates a discrete value. This value is sent over Bluetooth to a PC or laptop for further processing and plotting in MATLAB. After processing, the actions to be performed are sent over ZigBee to the ARM microcontroller that controls both the robot and the robotic arm mounted on it.
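The level-based control this abstract describes — a single discrete headset value selects the robot or arm action — can be sketched as a lookup over non-overlapping bands. The band boundaries and action names below are illustrative assumptions; the paper does not specify them.

```python
# Sketch of level-based control: non-overlapping bands over the headset's
# discrete output value select the action. Band edges and names are assumed.

LEVEL_BANDS = [
    (0, 25, "STOP"),        # low signal level: halt robot and arm
    (25, 50, "MOVE"),       # moderate level: drive the robot base
    (50, 75, "ARM_UP"),     # higher level: raise the robotic arm
    (75, 101, "ARM_GRIP"),  # highest level: close the gripper
]

def action_for_level(level):
    """Map a discrete headset value (0-100, assumed) to a robot/arm action."""
    for low, high, action in LEVEL_BANDS:
        if low <= level < high:
            return action
    raise ValueError("level out of range: %r" % level)
```

The chosen action string would then be forwarded (over ZigBee, in the paper's setup) to the ARM microcontroller driving the robot.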
Kalpa Publications in Engineering
This research paper presents the development of a bio-signal acquisition system and rehabilitation technique, as a cognitive-science application of a robot controlled by brain signals. We develop a data-acquisition system for acquiring EEG signals from a BrainSense headband, design a new algorithm for detecting attention and meditation waves, and implement it on a robotics platform using an embedded core.
Journal of Robotics and Control (JRC), 2021
The development of the world of robotics is inevitable given the rapid development of supporting science and technology. There are various types and classifications of robots, although their basic construction does not differ much. One of the most popular and widely developed types is the wheeled robot. A robot generally consists of three parts: sensors, a processing component, and actuators. In this study the actuators are the wheels; as the sensor, the researchers use a brainwave-reading headset from NeuroSky; and the processing component is an Arduino Uno R3. The NeuroSky headset works wirelessly over a Bluetooth connection, and the data it sends take the form of a brain-wave power level (blink strength level). Before it can be translated into a "telepathic" brain command, this signal is first captured and processed on an Android handset using an application built on the Blynk IoT platform. The command is then sent to the Arduino, the robot's processing component, which is fitted with an HC-06 Bluetooth module to capture the wireless transmission from the Android device. Only then is the signal processed by the Arduino into forward, backward, left, and right commands for the wheeled robot via the L298N motor driver. Tests in an ideal environment showed an average system success rate of 85%, while tests in a non-ideal environment (with obstacles of space and distance) showed an average success rate of 40%, with each test carried out 10 times.
There are a number of physically handicapped people, some of whom use different technologies to move around. The proposed work implements a robot that is controlled using human brain attention. Brain signals are analyzed using an electrode sensor that monitors eye blinks and attention level. A brain-wave sensor detects these EEG signals and transmits them over Bluetooth. A level analyzer unit (LAU), i.e., a computer system, receives the raw signals and processes them on the MATLAB platform. The robot moves according to the human attention level. An ARM controller is used to build the robot.
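One practical detail behind attention-driven control like this is that raw attention readings jitter, so a level analyzer typically smooths them before deciding whether the robot should move. The sketch below assumes a short moving average and a fixed threshold; both the window size and threshold are illustrative, not from the paper.

```python
# Hedged sketch of a level analyzer unit: smooth the attention stream with
# a short moving average before thresholding, so momentary dips do not stop
# the robot. Window size and threshold are assumed values.
from collections import deque

class AttentionController:
    def __init__(self, window=5, threshold=55):
        self.samples = deque(maxlen=window)   # keep only the last N readings
        self.threshold = threshold

    def update(self, attention):
        """Feed one attention sample; return True when the robot should move."""
        self.samples.append(attention)
        mean = sum(self.samples) / len(self.samples)
        return mean >= self.threshold
```

A `True` result would correspond to the LAU issuing a "move" command to the ARM-based robot; the paper's actual decision rule is not specified.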
arXiv, 2022
This paper presents open-source software and a shield board developed for the Raspberry Pi family of single-board computers that can be used to read EEG signals. We describe the mechanism for reading EEG signals and decomposing them into a Fourier series, and provide examples of controlling LEDs and a toy robot by blinking. Finally, we discuss the prospects of the brain-computer interface for the near future and consider various methods for controlling external mechanical objects using real-time EEG signals.
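The spectral decomposition mentioned here — splitting a raw EEG window into the classic frequency bands — can be sketched by summing FFT power per band. The band edges are the conventional delta/theta/alpha/beta ranges and the sampling rate is assumed; neither is taken from the paper.

```python
# Hedged sketch of EEG band decomposition: sum FFT power within each of the
# conventional frequency bands. FS and the band edges are assumed values.
import numpy as np

FS = 250  # samples per second (assumed)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(window):
    """Return a dict of summed spectral power per EEG band for one window."""
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    return {name: float(power[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}
```

A blink produces a strong transient that shows up mostly in the low bands, which is what lets blink events drive the LEDs and toy robot described above.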
A brain-computer interface (BCI) system provides a communication channel between the human brain and external devices. The system processes and translates thought into control signals, enabling a user to navigate a robot from one place to another. In this context, we developed a system that enables a user to guide a robot by brain waves. The system consists of an Emotiv Epoc headset, a personal computer, and a mobile robot. The Emotiv Epoc headset is attached to the user's head and used to collect electroencephalogram (EEG) signals. The headset picks up brain activity from 14 locations on the scalp and sends it to the computer for processing; this activity can tell the system what the person intends to do in his virtual reality. Then, using a novel application designed for this purpose, the cognitive suite supplied by Emotiv generates the control actions needed to make the robot execute three different commands: turn right, turn left, and move forward. In this paper, the hardware and software architectures were designed and implemented. Experimental results indicate that the robot can be successfully controlled in real time based on the subject's physiological changes. Keywords: brain-computer interface (BCI), electroencephalogram (EEG), Emotiv Epoc neuroheadset.
2020
People with locomotor disabilities and elderly people often cannot control a wheelchair manually. The key objective of this paper is to help them manoeuvre easily, without any social aid, through a brainwave-controlled wheelchair. Various types of wheelchair are available on the market, such as voice-controlled, joystick-controlled, smartphone-controlled, eye-controlled, and mechanical wheelchairs. Each holds certain limitations; for example, a user who cannot speak cannot use a voice-controlled wheelchair. The brain-computer interface (BCI) is a new method of interfacing between the human mind and a digital signal processor. An electroencephalogram (EEG) based BCI is connected with an artificial-reality system to control the movement and direction of a wheelchair. This paper proposes a brainwave-controlled wheelchair that uses the EEG signals captured from the brain. These EEG signals are then passed ...
Reegan R
The aim of the project is to enhance the interactivity of a mind-controlled robotic arm using brain-computer interface (BCI) technology in an open-source environment, adding a virtual-world input that enables users to interact with the real world and makes the entire process more user-friendly. The project proposes a new solution for human-robot interaction: a smart chip implanted at the radial nerve, replacing the current wearable EEG helmet. The chip will contain EEG technology and a Bluetooth extension, allowing real-time control of the robotic arm through the EEG signals captured by the BCI technology. The Bluetooth extension will provide wireless communication, enabling physically challenged individuals to perform day-to-day activities with greater ease and independence. The authors argue that the smart chip at the radial nerve provides a more natural and intuitive method of control than the current wearable EEG helmet.
2016 International Conference on Informatics and Computing (ICIC), 2016
We introduce the design and preliminary implementation of a low-cost brain-computer interface (BCI) to enable movement of a rolling robot. The system accepts and executes basic commands generated from three brain conditions: normal, relaxed, or happy. The brain condition is determined from the brainwaves known as alpha, beta, and gamma waves. The brainwaves are detected using a Mindflex headset connected to an Arduino Uno, which transmits the data to a computer over a USB connection. An algorithm developed in Matlab analyzes the data and sends three simple commands to the rolling robot over a Wi-Fi connection. We conducted the experiment fifty times, using 100 samples for training and 50 for testing, and obtained 62% accuracy. This result shows that brainwave commands can be processed successfully for forward, turn-left, and turn-right movements. We conclude that the system can be developed further to assist people with disabilities in performing motions using their minds.
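The train/test protocol in this abstract (100 training samples, 50 test samples, accuracy measured on the held-out set) can be sketched with a deliberately simple nearest-centroid classifier over feature vectors such as band powers. The data below are synthetic and the classifier is an assumption; the paper does not specify its actual features or method.

```python
# Hedged sketch of a 100-train / 50-test evaluation using a nearest-centroid
# classifier. The synthetic 2-D "brain condition" features are illustrative.
import numpy as np

def train_centroids(features, labels):
    """Return {label: mean feature vector} from training data."""
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def classify(centroids, sample):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda lab: np.linalg.norm(sample - centroids[lab]))

def accuracy(centroids, features, labels):
    """Fraction of held-out samples classified correctly."""
    hits = sum(classify(centroids, x) == y for x, y in zip(features, labels))
    return hits / len(labels)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # three synthetic "brain conditions" as well-separated 2-D feature clusters
    means = [[0, 0], [5, 0], [0, 5]]
    X = np.vstack([rng.normal(m, 0.5, size=(50, 2)) for m in means])
    y = np.repeat([0, 1, 2], 50)
    perm = rng.permutation(len(y))          # shuffle before splitting
    X, y = X[perm], y[perm]
    model = train_centroids(X[:100], y[:100])   # 100 training samples
    print("test accuracy:", accuracy(model, X[100:], y[100:]))  # 50 test samples
```

With real EEG features the classes overlap far more than in this toy data, which is consistent with the 62% accuracy the authors report.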
Advances in Human-Computer Interaction, 2013
Humans have traditionally interacted with computers or machines by using their hands to manipulate computer components. This kind of human-computer interaction (HCI), however, considerably limits the human's freedom to communicate with machines. Over the years, many attempts have been made to develop technologies that include other modalities used for communication, for example, speech or gestures, to make HCI more intuitive. Recent advances in cognitive neuroscience and neuroimaging technologies in particular have allowed for the establishment of direct communication between the human brain and machines. This ability is made possible through invasive and noninvasive sensors that can monitor physiological processes reflected in brain waves, which are translated online into control signals for external devices or machines. Such brain-computer interfaces (BCIs) provide a direct communication method to convey brain messages to an external device independent from the brain's motor output. They are often directed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. In BCIs, users explicitly manipulate their brain activity instead of using motor movements in order to produce brain waves that can be used to control computers or machines. The development of efficient BCIs and their implementation in hybrid systems that combine well-established methods in HCI and brain control will not only transform the way we perform everyday tasks, but also improve the quality of life for individuals with physical disabilities. This is particularly important for those who suffer from devastating neuromuscular injuries and neurodegenerative diseases which may lead to paralysis and the inability to communicate through speech or gesture.
In a BCI system, brain activity is usually recorded using a noninvasive neuroimaging technology, such as electroencephalography (EEG), magnetoencephalography (MEG), functional magnetic resonance imaging (fMRI), or near-infrared spectroscopy (NIRS). In some cases, invasive technologies are used, such as electrocorticography (ECoG). In the majority of BCI systems, scalp EEG data are recorded, with the type of BCI system categorized based on the measure of brain activity used for control. Each system has its own shortcomings. For instance, the information transfer rates of currently available noninvasive BCI systems are still very limited and do not allow for versatile control of and interaction with assistive machines.