Mind Reading Robots
|MIT's Baxter Controlled by Brain Signals & Hand Gestures|
A Robot More Obedient Than My Dog
Robotics is crossing another important threshold. Humans controlling robots with hand gestures and brainwaves is now a reality at MIT. The breakthrough is a significant tightening of human-robot interaction: by reading faint electrical brain signals and hand gestures, the system lets a human catch and correct a robot's errors in real time.
By monitoring human brain activity with electrodes in a cap (as seen above), the MIT system picks up whether the human user notices the robot making a mistake. Spotting a mistake produces a faint, characteristic electrical signal in the brain. An algorithm detects that signal and translates it into a command the robot understands; you might say the robot is reading the human mind. Using an interface that measures muscle activity, the human can then make hand gestures to scroll through options and select the correct one for the robot to perform. The robot carries out the corrected action through this new human-partnership system. Researchers say the communication is seamless, natural and virtually automatic.
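The two-channel supervision loop described above can be sketched in pseudo-form: watch the brain signal for an error response, and if one appears, let hand gestures pick the correction. This is only an illustrative sketch; the function names, the fixed threshold (standing in for MIT's trained EEG classifier), and the gesture labels are all assumptions, not the actual CSAIL implementation.

```python
# Hypothetical sketch of the brain-plus-gesture supervision loop.
# Names, thresholds, and gesture labels are illustrative assumptions;
# the real system uses learned classifiers, not fixed cutoffs.

ERRP_THRESHOLD = 0.5  # stand-in for a trained error-signal classifier


def detect_error(eeg_window):
    """Return True if the EEG window looks like an error response."""
    peak = max(abs(v) for v in eeg_window)
    return peak > ERRP_THRESHOLD


def select_target(emg_gestures, options, current=0):
    """Scroll through options with 'left'/'right' gestures; 'clench' confirms."""
    for gesture in emg_gestures:
        if gesture == "right":
            current = (current + 1) % len(options)
        elif gesture == "left":
            current = (current - 1) % len(options)
        elif gesture == "clench":
            return options[current]
    return options[current]


def supervise(robot_choice, eeg_window, emg_gestures, options):
    """If the human's brain flags an error, let hand gestures pick the fix."""
    if detect_error(eeg_window):
        return select_target(emg_gestures, options)
    return robot_choice  # no error signal detected: let the robot proceed
```

For example, `supervise("A", [0.1, 0.9], ["right", "clench"], ["A", "B", "C"])` would detect the simulated error spike and return the gesture-selected target `"B"`, while a quiet EEG window leaves the robot's original choice untouched.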
MIT Demonstration that's a First
In a demonstration, a robot moved a power drill to one of three possible targets on a mock plane. With the system supervising it, the robot's accuracy in choosing the right target improved from about 70% to 97%. At present the system handles only binary right-or-wrong feedback, but as the technology matures, the range of communication will be greatly expanded.
The Name is Baxter
The robot used for the project is Baxter, a humanoid robot from Rethink Robotics. What's exciting about the new approach is that the machine adapts to the user, so there is no need to train the user to think in a prescribed way, as earlier brain-controlled systems required. The real-world applications are significant and potentially span many industries.
No Need to Train
The system was developed by MIT's CSAIL (Computer Science and Artificial Intelligence Laboratory). Its importance is that organizations can use it in real-world settings with no user training: the system proved it could work with people it had never interacted with before.
Highly Advanced Research
To create the system, researchers used electroencephalography (EEG) to monitor brain activity and electromyography (EMG) to monitor muscle activity, through a series of electrodes on the users' scalps and forearms. Highly advanced, and it works.
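As a rough illustration of how a forearm EMG channel can be turned into a gesture decision, the sketch below classifies "left" versus "right" from root-mean-square signal amplitude. The channel layout, threshold, and gesture labels are assumptions for illustration only; the actual MIT system uses far more sophisticated signal processing.

```python
import math

# Illustrative EMG gesture detection via root-mean-square (RMS) amplitude.
# Channel names and the threshold are hypothetical, not MIT's configuration.


def rms(samples):
    """Root-mean-square amplitude of one electrode channel."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))


def classify_gesture(left_forearm, right_forearm, threshold=0.3):
    """Crude left/right/none decision from two forearm EMG channels."""
    left_power = rms(left_forearm)
    right_power = rms(right_forearm)
    if max(left_power, right_power) < threshold:
        return "none"  # neither arm is active enough to count as a gesture
    return "left" if left_power > right_power else "right"
```

The design choice here is deliberate simplicity: RMS amplitude is a standard first feature for muscle activity, and comparing two channels gives a direction without any trained model.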