Using brain signals to remotely control objects is no longer the stuff of science fiction. In preliminary steps towards this outcome, AUT University's KEDRI (the Knowledge Engineering Discovery Research Institute) has demonstrated the remote control of a robot by facial expression.
The demonstrator wears a commercially available EEG (electroencephalograph) headpiece, the Emotiv EPOC, whose 14 sensors measure brain signals and send them wirelessly to a computer. Software then interprets the signals and translates them into commands, which are wirelessly transmitted to a small robot called WITH ("Wheel type mobile robot platform for intelligent behaviour", developed by the Kyushu Institute of Technology, Japan). For example, looking left moves the robot left, looking right moves it right, blinking makes it stop, and raising the eyebrows makes it move forward.
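The expression-to-command translation step can be sketched roughly as follows. This is an illustrative sketch only: the event names and the mapping function are hypothetical, not the actual Emotiv EPOC or WITH robot APIs.

```python
# Hypothetical sketch of the expression-to-command mapping described
# in the article. Event names and command strings are illustrative,
# not the real Emotiv or WITH interfaces.

EXPRESSION_COMMANDS = {
    "look_left": "turn_left",
    "look_right": "turn_right",
    "blink": "stop",
    "raise_eyebrows": "move_forward",
}

def translate(expression_event: str) -> str:
    """Map a detected facial-expression event to a robot command.

    Unrecognised events default to "stop" as a safe fallback.
    """
    return EXPRESSION_COMMANDS.get(expression_event, "stop")
```

For instance, a detected blink would translate to a stop command, while an unrecognised event falls back to stopping the robot rather than issuing an arbitrary movement.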
Facial expressions are used because they generate strong signals that are quickly and easily picked up by the EEG headpiece and the translating software, says researcher Dr Stefan Schliebs. But brain signals emitted by thoughts about directing the robot could also be used. "The software and the person wearing the headpiece are continually learning and teaching each other so that the system improves with use," says Dr Schliebs.
KEDRI director Professor Nik Kasabov says while it is amazing to watch a robot controlled by facial expressions or thoughts, this is just a toy-like demonstration of a complex research project. "We are working with researchers from China and Europe on a large project with the idea of using KEDRI-developed artificial neural network models for personalised brain-computer interfaces and for the creation of brain-like artificial intelligence systems," says Professor Kasabov.
In the next few years the collaborative team will be using devices that capture brain signals more precisely than current headpieces, and will be developing more sophisticated methods and software to recognise the complex patterns in these signals and turn them into commands.
"Some of the most exciting possibilities are yet to come," says Professor Kasabov. "In the future people will be able to use this technology to control objects like wheelchairs, prosthetic limbs and personalised rehabilitation robots. Thought commands can be transmitted wirelessly or via networks to objects at locations which are geographically distant from the person using the technology, which opens up more possibilities than we can imagine right now."