
DARPA is Working on Neural Interfaces to Merge Humans and A.I.


The United States military is working on its own version of Elon Musk’s neural lace, a technology that might one day allow implanted chips as powerful as laptops to receive neural data and transmit device-controlling signals. The Defense Advanced Research Projects Agency (DARPA), the Department of Defense agency developing the technology, also wants to use brain interface systems to help people with injuries recover their memory abilities.

“We absolutely have people working on that now,” Justin Sanchez, director of the Biological Technologies Office at the agency, said in an article published Tuesday. “Direct neural interfaces are being developed.”

Sanchez envisions a future where, instead of getting their phone out, pulling up Uber, and tapping in a home address, a user could simply think the command “open up the Uber app and take me home.” Such a system could use artificial intelligence to interpret complex commands. Because DARPA’s mandate is military technology, the team is also considering how such interfaces could, for example, help a soldier learn a new language faster than before.

Merging man and machine may sound like science fiction, but in some ways it’s already happening. Wearables already measure people’s vital signs and adjust apps in response. The Apple Watch, for example, disables contactless payments if it can’t detect a pulse. In the future, it’s not hard to imagine a wearable that controls your thermostat based on body temperature.

“That’s where this merging of humans and machines is heading,” said Sanchez. “Having the environment, the thermostat, change as the function of our physiology - that’s near term.”
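If that kind of physiology-driven automation sounds abstract, a minimal sketch makes it concrete. The snippet below is purely illustrative, not anything DARPA or Apple has described: the setpoints are made up and there is no real wearable or thermostat API behind it, just the basic idea of mapping a skin-temperature reading to a room temperature.

```python
# Purely hypothetical sketch: nudge a thermostat setpoint based on a wearable's
# skin-temperature reading. The values and the "interface" here are stand-ins,
# not a real smart-home or wearable API.

COMFORT_SETPOINT_C = 21.0   # assumed baseline room temperature
SKIN_TEMP_NORMAL_C = 33.5   # assumed typical wrist skin temperature

def room_setpoint(skin_temp_c: float) -> float:
    """Cool the room slightly when the wearer runs warm, and vice versa."""
    offset = (skin_temp_c - SKIN_TEMP_NORMAL_C) * 0.5  # 0.5 degrees of room change per degree of skin change
    return round(COMFORT_SETPOINT_C - offset, 1)

if __name__ == "__main__":
    for reading in (33.5, 34.8, 32.6):  # simulated wearable readings in Celsius
        print(f"skin {reading:.1f} C -> room setpoint {room_setpoint(reading):.1f} C")
```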

Researchers have already experimented with using electroencephalogram (EEG) brain signals to control machines. The Virtual Embodiment and Robotic Re-Embodiment project, an Italy-based robotics effort, said in October that it had created a robot that responds to brain waves: patients could move the robot’s arms simply by thinking about how they wanted them to move. One patient reported a “feeling of control and increased embodiment.”

The same technology has been built into a Netflix-controlling system called MindFlix, which lets a user pick out a film by thinking about basic navigation commands. Brain signals have also been used to fly drones: a July project at Arizona State University lets users control swarms of drones with their minds.

DARPA’s end goals may sound ambitious, but in many ways they are a natural progression of existing technologies. Sanchez claims that neural interfaces to help people with brain injuries could arrive within three years. A few years from now, the idea of controlling computers with our minds may not seem so weird after all.
