
Next-Generation A.I. Could Develop Its Own 'Human Intuition'

Artificial intelligence is pretty good at performing tasks like driving cars, translating languages, and identifying faces. The next step is getting A.I. to react to the world with intuitive knowledge.

A new article published in the Association for Computing Machinery's magazine, Communications of the ACM, finds that A.I. is starting to understand the world the way people do: by observing it, rather than by slogging through data sets that have been carefully labeled by hand. Research at Facebook and elsewhere opens the door to a more intuitive kind of A.I., which Yann LeCun, director of A.I. research at Facebook, says is the key to improving artificial intelligence. The next level of A.I. will learn by observing, similar to how an infant learns.

“We don’t yet quite know how to reproduce this in machines, and that’s a shame,” LeCun told the Association for Computing Machinery. “Until we learn how to do this, we will not go to the next level in AI.”

Currently, most A.I. relies on supervised learning: it trains on enormous data sets that have been meticulously labeled so the A.I. can identify faces or translate languages. Predictive learning is a newer approach that is entirely unsupervised, so it doesn’t need labeled data sets. Instead, it observes and uses its observations to figure out how the world works.
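To make the distinction concrete, here is a toy sketch in Python using NumPy. It is not code from Facebook or any lab mentioned in this article; the data and models are made up for illustration. The point is only that the supervised model needs a column of human-provided labels, while the predictive model's "labels" are just the data itself, shifted forward in time.

```python
# Toy contrast between supervised and predictive (self-supervised) learning.
import numpy as np

rng = np.random.default_rng(0)

# --- Supervised learning: every example comes with a human-provided label ---
# Toy task: classify 2-D points as above (1) or below (0) a line.
X = rng.normal(size=(200, 2))
y = (X[:, 1] > X[:, 0]).astype(float)          # the "meticulous labels"

w = np.zeros(2)
for _ in range(500):                           # plain logistic regression
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / len(y)

# --- Predictive learning: no labels at all ---
# Toy task: observe a sequence and learn to predict the next value from the
# previous two. The "target" is simply the data itself, shifted in time.
seq = np.sin(np.linspace(0, 20, 300)) + 0.05 * rng.normal(size=300)
inputs = np.stack([seq[:-2], seq[1:-1]], axis=1)   # (x_{t-1}, x_t)
targets = seq[2:]                                  # x_{t+1}, taken from the data

coef, *_ = np.linalg.lstsq(inputs, targets, rcond=None)
pred = inputs @ coef

print("supervised accuracy:", ((1 / (1 + np.exp(-X @ w)) > 0.5) == y).mean())
print("next-step prediction error:", np.mean((pred - targets) ** 2))
```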

Director of Facebook A.I. Research Yann LeCun

Getty Images / Brian Ach

LeCun had some choice words in response to a comment suggesting that Elon Musk has a better vision of the future of A.I.

“It’s easy to pass as a visionary if the only things you do is talk,” LeCun replied to the comment on his Facebook post on predictive learning. “But if you are a self-respecting scientist or engineer, there is only so much talking you can do before you have to build things to support your claims.”

Essentially, it’s time to teach A.I. as if it were a baby and see what happens. Researchers at Facebook are working to get A.I. to use vectors to connect the characteristics of things, in order to build up this organic picture of the world.
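A rough sense of what "vectors that connect characteristics" means is shown in the sketch below. The vectors here are hand-made for illustration, not learned by Facebook's systems; in practice they would be learned from observation, but the idea is the same: things with shared characteristics end up pointing in similar directions.

```python
# Toy illustration of representing things as vectors in a shared space.
import numpy as np

def cosine(a, b):
    """Similarity between two characteristic vectors (1.0 = same direction)."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Dimensions (purely illustrative): [is_animal, can_fly, is_vehicle, has_wheels]
embeddings = {
    "sparrow":  np.array([1.0, 1.0, 0.0, 0.0]),
    "dog":      np.array([1.0, 0.0, 0.0, 0.0]),
    "airplane": np.array([0.0, 1.0, 1.0, 1.0]),
    "car":      np.array([0.0, 0.0, 1.0, 1.0]),
}

# Nearby vectors share characteristics, which is how a system can relate
# concepts it was never explicitly told are connected.
print(cosine(embeddings["sparrow"], embeddings["dog"]))    # both animals: high
print(cosine(embeddings["airplane"], embeddings["car"]))   # both vehicles: high
print(cosine(embeddings["sparrow"], embeddings["car"]))    # unrelated: 0.0
```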

Meanwhile, other labs have had neural networks compete against each other, one generating images and another judging them, to produce the most photorealistic results. Networks trained this way were then able to watch a few frames of a video and predict what the next frame would look like. It’s a more intuitive kind of knowledge, a sense of what is coming, rather than the confined intelligence of today’s A.I. LeCun says figuring out how to use this kind of learning effectively is the way to get to the next level of artificial intelligence.
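The sketch below illustrates the next-frame prediction side of that idea on a deliberately tiny scale: a "video" of a dot sliding across a 1-D strip of pixels, and a simple linear model trained only by watching frames, with no labels beyond the video itself. It is a conceptual toy, not the adversarial networks used in the research LeCun describes.

```python
# Toy next-frame prediction: learn what comes next purely from watching.
import numpy as np

rng = np.random.default_rng(1)
WIDTH = 16

def render(pos):
    """Render one frame: a strip of pixels with a bright dot at `pos`."""
    frame = np.zeros(WIDTH)
    frame[int(pos) % WIDTH] = 1.0
    return frame

# Build training clips: the dot moves one pixel per frame.
clips = []
for _ in range(200):
    start = rng.integers(0, WIDTH)
    clips.append([render(start + t) for t in range(3)])

# Inputs: two consecutive frames; target: the frame that actually came next.
X = np.array([np.concatenate(c[:2]) for c in clips])   # shape (200, 32)
Y = np.array([c[2] for c in clips])                    # shape (200, 16)

# Fit a linear predictor frame_{t+1} ~ W applied to [frame_{t-1}, frame_t].
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# "Watch" two new frames and guess what comes next.
test = np.concatenate([render(5), render(6)])
predicted = test @ W
print("model expects the dot at pixel:", int(predicted.argmax()))  # 7
```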
