Dr. Vijayan Asari, Director
Phone: 937-229-4504
EEG: Brain Machine Interface
EEG Signal Analysis for Brain-Machine Interface
Imagine the possibilities of interacting with machines directly and intuitively with our minds. People with paralysis and coma patients would gain an outlet through which they could still communicate their thoughts. Amputees could be provided with better prosthetic limbs, new arms and legs that respond to the mind much as natural limbs do.
This project uses an electroencephalogram (EEG) to collect signals from the brain as a person thinks of actions. These signals are then compared and categorized into simple actions such as lift, turn, grab, pull, push, or drop, among others. This is done by training a program to recognize patterns in the EEG readings that can be reliably associated with one of the basic actions. The training builds a library of recognized actions, which can then be translated into commands for an external system such as a robot.
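The pipeline described above, extracting features from EEG epochs, training on labeled examples, and building a library of recognized actions, can be sketched in miniature. The sketch below is illustrative only, not the project's actual method: it assumes hypothetical band-power features (power in the classic delta, theta, alpha, and beta bands) and a simple nearest-centroid classifier, with all names (`band_power`, `ActionLibrary`, the sampling rate of 128 Hz) invented for this example.

```python
import numpy as np

# Hypothetical action labels drawn from the project description.
ACTIONS = ["lift", "turn", "grab", "pull", "push", "drop"]

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` in the [low, high) Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return power[mask].mean()

def features(epoch, fs=128):
    """Feature vector: power in the delta/theta/alpha/beta EEG bands."""
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([band_power(epoch, fs, lo, hi) for lo, hi in bands])

class ActionLibrary:
    """A 'library of recognized actions': one feature centroid per action."""

    def __init__(self):
        self.centroids = {}

    def train(self, action, epochs, fs=128):
        # Average the feature vectors of the training epochs for this action.
        feats = np.array([features(e, fs) for e in epochs])
        self.centroids[action] = feats.mean(axis=0)

    def classify(self, epoch, fs=128):
        # Return the trained action whose centroid is nearest in feature space.
        f = features(epoch, fs)
        return min(self.centroids,
                   key=lambda a: np.linalg.norm(f - self.centroids[a]))
```

Once an epoch is classified, the resulting action name could be mapped to a command for an external system, e.g. `commands = {"lift": robot.lift, ...}`. A real system would replace the nearest-centroid step with a properly trained classifier and far richer features.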
The central challenge of the project is reliably recognizing the EEG signal pattern behind a specific action. A further challenge is recognizing that same action from the readings of a different person. Another challenge that could have a significant impact is distinguishing the thoughts behind imagining lifting an object from the thoughts behind actually lifting one.
Once it is determined how these variables affect the action encoding, these challenges become new strengths and features. If the program must be trained for each individual, the technology could be used to verify the rightful owner of a device such as a car: if the thoughts commanding a vehicle equipped with a brain-machine interface belong to a stranger, the interface could prevent the stranger from driving the car away.
This research has numerous possible applications. If thoughts can be successfully encoded into actions, people with disabilities could control robots to complete tasks. Another application is communicating with coma patients: the ability to detect what they want to do, even if they cannot physically do it, would be an immensely valuable communication tool. Further research could even make it possible to recognize the complex thoughts behind speech and manifest them through tools like a voice synthesizer, and thought-controlled robots could take on jobs too dangerous for humans. Encoding the thoughts behind actions into reliable data would undeniably improve the ability of humans to interact with machines, making machines more efficient and effective tools.