
Facebook Can Understand Our Trivial Status Updates "With Near-Human Accuracy"

Facebook today announced the development of DeepText, which is learning how to understand us.

(Photo: Getty Images / Chris Jackson)

Today, Facebook announced the development of DeepText, “a deep learning-based text understanding engine” that its engineers say can understand your feelings about the death of Harambe almost as well as your mom (who you’re friends with on Facebook) can.

In a lengthy post on its engineering blog, Facebook points out that when it comes to text, the primary form of user-generated content on Facebook, it’s all about context. If its A.I. can’t differentiate between your love of blackberry the fruit and your dedication to BlackBerry the phone, it can’t do much to tailor your Facebook experience so that it’s most useful to you:

“Understanding the various ways text is used on Facebook can help us improve people’s experiences with our products, whether we’re surfacing more of the content that people want to see or filtering out undesirable content like spam.”

Facebook shared a preview of how it’ll work: after the user types “I need a ride,” a suggestion comes up to request a ride (a feature Facebook debuted last year), but if you type “I like to ride donkeys,” no such suggestion appears.
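Under the hood, that kind of suggestion amounts to intent classification: score the text against a set of known intents and only surface a shortcut when the right intent clears a confidence threshold. Facebook hasn’t published DeepText’s code, so the sketch below is just a toy illustration of the idea using scikit-learn; the labels, training phrases, and threshold are invented for the example.

```python
# Toy illustration of intent detection, not Facebook's actual DeepText code.
# Assumes scikit-learn is installed; every label and phrase here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A tiny hand-labeled training set: does the message ask for a ride or not?
train_texts = [
    "I need a ride",
    "can someone give me a ride to the airport",
    "need a lift downtown tonight",
    "I like to ride donkeys",
    "we rode horses all weekend",
    "riding my bike to work today",
]
train_labels = ["ride_request"] * 3 + ["other"] * 3

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

def maybe_suggest_ride(message, threshold=0.5):
    """Surface the 'request a ride' shortcut only when the classifier is confident."""
    probs = dict(zip(model.classes_, model.predict_proba([message])[0]))
    return probs["ride_request"] >= threshold

print(maybe_suggest_ride("I need a ride"))           # expected: True
print(maybe_suggest_ride("I like to ride donkeys"))  # expected: False
```

Facebook says DeepText itself relies on deep neural networks rather than a simple bag-of-words model like this one, but the pattern of only surfacing a shortcut when the model is confident about your intent is the same.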

DeepText is being used across some twenty languages.

If any of this sounds familiar, that’s because it is! At Google’s recent I/O developer conference, the Facebook rival showed off Allo, a messaging service that, from a user’s point of view, might seem pretty similar. By default, Allo will suggest responses to images or messages received: If you get a message from your mom asking if you want dinner later, you’ll see some suggested responses based on Google’s massive trove of data (“I’m in” or “I’m busy”). However, there’s the pesky matter of encryption, which in Allo you have to turn on yourself.

By the way, Google has previewed how Allo will look.

If you’re using Facebook Messenger, you already know it isn’t end-to-end encrypted, though it’ll reportedly introduce end-to-end encryption this summer. It seems impossible that Facebook’s DeepText engine would have access to your texts with end-to-end encryption turned on, since only you and the person you’re messaging could read them. So, as with Google, if you want helpful suggestions, you’ll likely have to sacrifice your absolute privacy. (May we suggest you switch to WhatsApp?)

So what about images and videos? Can Facebook’s A.I. make sense of those? The DeepText team writes that it is working with Facebook’s “visual content understanding teams” to “build new deep learning architectures that learn intent jointly from textual and visual inputs.” That team’s research lead, Manohar Paluri, told Inverse in March that they were making progress on that front. In the meantime, you’ll have to caption your photos if you want DeepText to understand them.
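Facebook hasn’t detailed what those joint architectures look like, but a common pattern is to encode the text and the image separately and fuse the two embeddings before a shared classifier. The PyTorch sketch below is only a minimal illustration of that fusion idea; the layer sizes, vocabulary, and set of intents are assumptions, not anything Facebook has published.

```python
# Toy sketch of joint text+image intent classification, not Facebook's architecture.
# Assumes PyTorch; all sizes, the vocabulary, and the intents are invented.
import torch
import torch.nn as nn

class JointIntentModel(nn.Module):
    def __init__(self, vocab_size=10_000, text_dim=128, image_feat_dim=2048,
                 fused_dim=256, num_intents=5):
        super().__init__()
        # Text tower: averaged word embeddings (a stand-in for a real text encoder).
        self.embed = nn.EmbeddingBag(vocab_size, text_dim)
        # Image tower: project precomputed features from a pretrained vision model.
        self.image_proj = nn.Linear(image_feat_dim, text_dim)
        # Fusion: concatenate both embeddings and classify the joint intent.
        self.classifier = nn.Sequential(
            nn.Linear(2 * text_dim, fused_dim),
            nn.ReLU(),
            nn.Linear(fused_dim, num_intents),
        )

    def forward(self, token_ids, offsets, image_features):
        text_vec = self.embed(token_ids, offsets)        # (batch, text_dim)
        image_vec = self.image_proj(image_features)      # (batch, text_dim)
        fused = torch.cat([text_vec, image_vec], dim=1)  # (batch, 2 * text_dim)
        return self.classifier(fused)                    # intent logits

# One fake example: four token ids for a single caption plus a 2048-d image feature.
model = JointIntentModel()
tokens = torch.tensor([3, 17, 256, 42])
offsets = torch.tensor([0])              # one sequence starting at index 0
image_features = torch.randn(1, 2048)
logits = model(tokens, offsets, image_features)
print(logits.shape)                      # torch.Size([1, 5])
```

In a real system the image features would come from a pretrained vision model, with the text and vision towers trained together on shared intent labels, which is roughly the kind of collaboration the DeepText post describes.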

Facebook also announced it’s using public Facebook pages, like the one for Inverse, as a testing ground for generating large data sets. Any text on a public Facebook page would be used to help teach DeepText how language is used.

It’s going to be an interesting few months for Facebook as DeepText keeps learning how we communicate with text, and the 800 million people who use Messenger each month may soon have a machine monitoring their conversations and offering suggestions.
