Science

MIT's A.I. Can Figure Out a Recipe By Looking at a Picture of Food

Instagram food posts might not be so useless anymore.


Scientists at MIT fed an intelligent machine one million recipes and 800,000 images of food, giving the program enough culinary wisdom to deduce a recipe just by looking at a photo of a snack or meal.

Developed at MIT’s Computer Science and Artificial Intelligence Laboratory, the neural network system is called Recipe1M, and in a paper published this week, the researchers demonstrate how it makes complex judgments about pictures of diverse types of food, such as pizza, soup, and sugar cookies.

The researchers scraped their one million recipes and 800,000 food images from some two dozen cooking websites. When presented with a picture of a food, say a bowl of tomato soup, the machine matched the photo against its deep background knowledge of recipe and image pairings. It identified the correct ingredients 65 percent of the time and suggested similar recipes.
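The matching step described above is a form of cross-modal retrieval: images and recipes are mapped into a shared embedding space, and the recipe whose embedding lies closest to the photo's embedding is returned. Here is a minimal toy sketch of that idea, assuming a trained model has already produced the embeddings; every name and vector below is made up for illustration and is not the researchers' code.

```python
import numpy as np

def cosine_similarity(a, b):
    # Standard cosine similarity between two vectors.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def retrieve_recipe(photo_embedding, recipe_embeddings, recipe_names):
    # Score every stored recipe against the photo; highest similarity wins.
    scores = [cosine_similarity(photo_embedding, r) for r in recipe_embeddings]
    return recipe_names[int(np.argmax(scores))]

# Tiny made-up example with 3-dimensional embeddings.
recipes = ["tomato soup", "sugar cookies", "pizza"]
recipe_vecs = [np.array([0.9, 0.1, 0.0]),
               np.array([0.0, 0.8, 0.2]),
               np.array([0.1, 0.2, 0.9])]
photo = np.array([0.85, 0.15, 0.05])  # pretend embedding of a soup photo
print(retrieve_recipe(photo, recipe_vecs, recipes))  # prints "tomato soup"
```

In a real system the embeddings would be produced by neural networks trained so that a dish's photo and its recipe land near each other, and the search would run over hundreds of thousands of stored recipes rather than three.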

By giving Recipe1M such a massive infusion of data, the scientists showed that computers — if properly educated — can make accurate suggestions about food, which people can use to understand their eating habits or grasp how to prepare different foods.

“You can imagine people using this to track their daily nutrition, or to photograph their meal at a restaurant and know what’s needed to cook it at home later,” said Christoph Trattner, an assistant professor at MODUL University Vienna in the New Media Technology Department, who was not involved in the study. “The team’s approach works at a similar level to human judgment, which is remarkable.”

Earlier efforts at food-observing AI made some advancements, but produced nothing as practical or reliable. In 2014, Swiss researchers developed a mathematical model called “Food-101,” which could identify a type of food correctly about half the time. This model became more accurate with more information, hinting that a broader dataset could yield a machine capable of much more.

“In computer vision, food is mostly neglected because we don’t have the large-scale datasets needed to make predictions,” says Yusuf Aytar, a computer vision researcher and co-author of the study.

But Recipe1M, with its unprecedented dataset, made accurate predictions on nearly seven of ten tries. It identified simple ingredients in desserts best, like flour, eggs, and butter, and then suggested recipes based upon the similar images it had stored away in its digital mind.

The AI struggled with more “ambiguous” foods, such as sushi and smoothies. In the machine’s defense, it seems likely humans would find deducing the ingredients of such dishes more difficult, too.

Already, Recipe1M might be revealing things about our not-so-healthy eating habits. The number one ingredient identified in the images was salt, followed by butter. So for anyone looking to reduce these ingredients — or assess how much of them they’re gobbling down — AI technology like Recipe1M could help reveal the makeup of their meals.

Some Instagram users are notorious for posting somewhat obnoxious images of food, which, it could be argued, provide little to no societal benefit. These images, say the Recipe1M makers, could now be of use. “But seemingly useless photos on social media can actually provide valuable insight into health habits and dietary preferences,” said Aytar.
