
Robots that read our minds will actually exist sooner than you think

BGR logo BGR 6/20/2018 Chris Smith

Robots are here to stay, and they’re only getting smarter. But it’ll be a while before they respond to our every verbal command, so in the meantime we’ll have to use our minds to communicate with them.

That’s right, you’ll be able to control robots using your mind in the future, as MIT figured out a way to combine brainwaves and hand gestures to allow humans to interact with machines effortlessly.

The idea here is to let machines correlate and interpret a person’s brain signals and hand movements, and turn them into quick robotic actions. That way, humans would not need to master the coding skills required to preprogram robots to perform specific tasks in direct response to human interaction.

The technology is still in its infancy and requires a human to wear a couple of cumbersome devices that would measure the electrical activity of the brain and the muscular activity of the hand. That’s how the robot “reads” our minds and muscle movements.

But MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) was quite successful.

“This work combining [electroencephalograph (EEG) and electromyography (EMG)] feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” CSAIL director Daniela Rus said. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

The team, led by Ph.D. candidate Joseph DelPreto, used a humanoid robot called Baxter, from Rethink Robotics, during testing, and the project was funded in part by the Boeing Company.

The robot’s success rate at choosing the correct target rose from 70% to 97%, MIT says. While robots can interpret either EEG or EMG signals on their own to trigger actions, it’s the combination of the two that makes this level of accuracy possible.
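To make the fusion idea concrete, here is a minimal, purely illustrative sketch of how the two signal streams might be combined: an EEG channel flags that the human thinks the robot is making a mistake (an error-related potential), and an EMG gesture then tells it where to go instead. All thresholds, signal shapes, and function names here are assumptions for illustration, not MIT’s actual pipeline.

```python
# Hypothetical sketch of EEG + EMG fusion for supervising a robot's target
# choice. Signal shapes, thresholds, and names are illustrative assumptions.

import numpy as np

ERRP_THRESHOLD = 0.6  # assumed confidence cutoff for an "error" brain signal


def detect_error_potential(eeg_window: np.ndarray) -> bool:
    """Toy error detector: flags an error when mean EEG amplitude is high."""
    score = float(np.clip(np.abs(eeg_window).mean(), 0.0, 1.0))
    return score > ERRP_THRESHOLD


def classify_gesture(emg_window: np.ndarray) -> str:
    """Toy EMG classifier: picks the direction of the stronger muscle channel."""
    left, right = emg_window.mean(axis=1)  # rows = [left channel, right channel]
    return "left" if left > right else "right"


def supervise_robot(eeg_window: np.ndarray, emg_window: np.ndarray,
                    current_target: str) -> str:
    """If the EEG flags a mistake, use the EMG gesture to redirect the robot."""
    if detect_error_potential(eeg_window):
        return classify_gesture(emg_window)  # human's corrective gesture wins
    return current_target  # no error detected: the robot keeps its target


# Example: a high-amplitude "error" EEG window plus a strong left-arm EMG burst
eeg = np.full(250, 0.9)               # simulated error-related activity
emg = np.vstack([np.full(100, 1.0),   # left muscle channel active
                 np.full(100, 0.1)])  # right muscle channel quiet
print(supervise_robot(eeg, emg, current_target="right"))  # -> left
```

The point of the sketch is the division of labor the researchers describe: the brain signal is a coarse yes/no supervision channel, while the muscle signal supplies the spatial nuance.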

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” DelPreto said. “This helps make communicating with a robot more like communicating with another person.”

The system could be used in the future by various types of people, including workers with disabilities or limited mobility, but also the elderly, according to the team. And who knows, this type of robot control could one day be used to conquer space.

A paper detailing MIT CSAIL’s invention will be presented at the Robotics: Science and Systems (RSS) conference in Pittsburgh next week.
