Science

A Brain-Reading Robot Can Collaborate With Humans on Multiple-Choice Tests

Robot servants are one step closer.

Communicating with machines and getting them to do what we need is how a lot of us make a living, and yet despite a great deal of progress in robotics and A.I., interacting with digital worlds still requires a controller in hand.

That may be about to change thanks to the latest breakthrough from a robot named Baxter, who can now read gestures and minds well enough to collaborate with humans on multiple-choice tests. That’s according to a team of researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory, who are presenting their new paper at the Robotics: Science and Systems conference starting June 26.

“Right now when we want to work with robots or with our tech there’s sort of a language barrier, where we need to adapt to the robot,” first-author Joseph DelPreto tells Inverse. “We need to learn its language, we need to learn keywords, use controls, buttons, or learn programming but we’d like to be able to interact with robots more similarly with how we interact with other people.”


This study builds on previous MIT CSAIL research that taught Baxter to read our thoughts using an electroencephalography (EEG) monitor. DelPreto’s work adds electromyography, or EMG, allowing our smirky friend to pick up on hand gestures as well.

The EEG helmet detects brain signals called “error-related potentials,” or ErrPs, which occur when humans notice a mistake. Think of them as that voice in your head that says, “Don’t do that” when you see someone about to mess up.

If an ErrP is detected, the robot immediately stops what it’s doing and waits for a hand motion before proceeding. In this case, Baxter was trying to drill a hole in one of three positions; if he was stopped, he would wait for the human subject to point either left or right before resuming, a key step toward robots that can respond to complicated and nuanced forms of communication.
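To make that loop concrete, here is a minimal Python sketch of how such a supervision cycle might look. Everything in it is an illustrative placeholder, not MIT CSAIL’s actual code: detect_errp and read_gesture are random stubs standing in for the team’s trained EEG and EMG classifiers, and the three drill targets are just labels.

```python
import random
import time

TARGETS = ["left", "center", "right"]  # three candidate drill positions

def detect_errp(eeg_window):
    """Placeholder ErrP detector. A real system would run a trained
    classifier over a short EEG window; here we simulate its output."""
    return random.random() < 0.1  # pretend an ErrP fires ~10% of the time

def read_gesture(emg_window):
    """Placeholder EMG decoder. A real system would classify forearm
    muscle activity; here we simulate a left/right hand gesture."""
    return random.choice(["left", "right"])

def shift_target(current, direction):
    """Move the drill target one slot left or right, clamped at the ends."""
    i = TARGETS.index(current)
    i = max(0, i - 1) if direction == "left" else min(len(TARGETS) - 1, i + 1)
    return TARGETS[i]

def supervision_loop():
    target = random.choice(TARGETS)          # robot's initial (possibly wrong) pick
    print(f"Robot moving toward: {target}")
    for _ in range(20):                      # poll while the robot works
        if detect_errp(eeg_window=None):     # human's brain flagged a mistake
            print("ErrP detected: halting, waiting for a gesture")
            direction = read_gesture(emg_window=None)
            target = shift_target(target, direction)
            print(f"Gesture '{direction}': resuming toward {target}")
        time.sleep(0.05)                     # ~20 Hz supervision loop

if __name__ == "__main__":
    supervision_loop()
```

In the actual system, both detectors are learned classifiers running on live physiological signals; the random stubs here exist only to exercise the stop-then-redirect control flow described above.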


“We’re excited to be able to interact with robots in a more natural way and you can imagine a lot of different applications for that,” says DelPreto. “In this case, we were looking at supervisory roles. You can imagine developing that further to be more continuous with more fluid motions. Down the line, you could have robotic assistants with personal care or in construction and factories.”

This method for bending robots to our every whim can be used on any type of hardware, and anyone could theoretically slip on the EEG cap to control their very own Baxter. The ultimate goal is to keep improving the system so it can pick up on more subtle brain activity or interpret more detailed hand gestures. That would open the door to a plethora of use cases.

Robot factory workers could be controlled with just a thought. Mechanical home assistants could bring you a glass of wine when you point at the bottle. Or better yet, you could play an epic game of fetch with a pack of Boston Dynamics dogs. The possibilities are limitless.
