Users can see whether a restaurant has the vibe they're looking for, or serves a specific type of pastry.
Google Search and Maps will use machine learning to become more food-friendly, helping people search for specific dishes or vibes, the company said at its Search On event on Wednesday.
When searching for shumai, shawarma or shakshuka in either the Google Search or Google Maps app, people can scroll down to find additional filters. These filters can include options such as vegan or spicy, eliminating the need to pull up a specific restaurant's menu. Google will start with vegetarian and vegan, with other dietary filters to follow.
Google says it’s using machine learning to find a restaurant’s vibe or what makes it special. A result might say a specific restaurant is “like walking into a friend’s living room.”
Google primarily makes money from ad sales, which are tied to Search, the company's most popular product. TikTok's rise poses a challenge: the short-form video app gives people quick glimpses into what an experience might actually be like, and among younger users it has become a search engine in itself. Last year, TikTok dethroned Google as the year's most popular domain. Google is paying attention, which is why the company launched YouTube Shorts, its own short-form video competitor, in 2020. Google will also begin integrating more imagery and video into search results.
Multisearch, the ability to search with images and text simultaneously, will also be extended to food. For example, a person can take a photo of a pastry they don't recognize, and Google will identify it. The person can then add "near me," and Maps will find bakeries that carry it.
Google says it's also tackling the fragmented-menu problem when searching for what to eat. Sometimes a menu is a photo another user has uploaded; other times it's a downloadable PDF. Google will use its Multitask Unified Model AI to create easy-to-read digital menus with up-to-date information.