Thoughtful technology: We can now control robots — with our minds

SYDNEY — Scientists at the University of Technology Sydney have developed new biosensor technology that brings a form of mind reading within reach. No, not like a fortune teller; the technology allows people to operate devices, such as robots and machines, through thought alone. You think, and the robot acts.

Researchers add that this breakthrough holds positive implications for healthcare, aerospace, and advanced manufacturing. The brain-computer interface was developed by Distinguished Professor Chin-Teng Lin and Professor Francesca Iacopi, from the UTS Faculty of Engineering and IT, in collaboration with the Australian Army and the Defence Innovation Hub.

The technology was recently demonstrated by the Australian Army, with soldiers operating a Ghost Robotics quadruped robot using the new brain-machine interface. The demonstration showed hands-free command of the robotic dog with up to 94 percent accuracy.

Besides its potential military uses, the technology holds serious promise in fields such as advanced manufacturing, aerospace, and healthcare. For example, it could help people with disabilities control a wheelchair or operate prosthetics.

“The hands-free, voice-free technology works outside laboratory settings, anytime, anywhere. It makes interfaces such as consoles, keyboards, touchscreens and hand-gesture recognition redundant,” Professor Iacopi says in a university release. “By using cutting edge graphene material, combined with silicon, we were able to overcome issues of corrosion, durability and skin contact resistance, to develop the wearable dry sensors.”

A report detailing the new tech was recently released, showing that the graphene sensors developed at UTS are easy to use, robust, and highly conductive. Hexagon-patterned sensors are positioned over the back of the scalp to detect brainwaves from the visual cortex. The sensors are resilient to harsh conditions and can even be used in extreme operating environments.

Users wear a head-mounted augmented reality lens that displays white flickering squares. When the operator concentrates on a specific square, their brainwaves are picked up by the biosensor, and a decoder translates the signal into commands.

“Our technology can issue at least nine commands in two seconds. This means we have nine different kinds of commands and the operator can select one from those nine within that time period,” Professor Lin explains. “We have also explored how to minimize noise from the body and environment to get a clearer signal from an operator’s brain.”
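The decoding step described above, matching an operator's brainwave response to the flickering square they are watching, can be sketched in code. This is a minimal illustration of frequency-tagged (SSVEP-style) decoding on simulated data, not the UTS team's actual method: the nine flicker frequencies, sampling rate, and noise level are all assumptions chosen for the example.

```python
# Minimal sketch: decide which of nine flickering squares an operator is
# watching by finding the flicker frequency with the most power in a
# two-second EEG window. All numeric values here are illustrative.
import numpy as np

FS = 256          # sampling rate in Hz (assumed)
WINDOW_S = 2.0    # two-second decision window, as in the article
# One assumed flicker frequency per command, nine commands total.
FREQS = [8.0, 9.0, 10.0, 11.0, 12.0, 13.0, 14.0, 15.0, 16.0]

def decode_command(eeg: np.ndarray) -> int:
    """Return the index of the flicker frequency with the most spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freq_axis = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    # Power at the FFT bin nearest each candidate flicker frequency.
    powers = [spectrum[np.argmin(np.abs(freq_axis - f))] for f in FREQS]
    return int(np.argmax(powers))

# Simulate an operator gazing at square 4: a sinusoid at FREQS[4]
# buried in noise standing in for a real EEG recording.
rng = np.random.default_rng(0)
t = np.arange(0, WINDOW_S, 1.0 / FS)
eeg = np.sin(2 * np.pi * FREQS[4] * t) + 0.5 * rng.standard_normal(t.size)

print(decode_command(eeg))
```

Real systems face far harder conditions, movement artifacts, muscle noise, and overlapping frequency responses, which is what the noise-minimization work Professor Lin mentions addresses.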

In conclusion, study authors posit that this technology will be of great interest to the scientific community, industry, and government. Moving forward, they hope to continue making advances in brain-computer interface systems.

The study is published in ACS Applied Nano Materials.
