Google announced its multimodal search AI model, MUM (Multitask Unified Model), at its developer conference Google I/O in May. Building on this, Google last night announced a batch of consumer-facing features, including visual search, to be rolled out to users in the coming months.
Currently, Google serves contextual information such as Wikipedia snippets, lyrics, or recipe videos to you based on your search phrase. The latest development aims at getting you results by understanding context beyond just the phrase you’ve used.
The new visual search feature was announced last night. To understand how it works, suppose you are looking at a picture of a shirt on your phone. You can tap the Lens icon and ask Google to find the same pattern, but on a pair of socks or a bag. If the search finds a match, you could, say, add a matching bag or pair of socks to your collection.
Lens can also be used to take a picture of a broken bike part or a piece of plumbing in your house. Simply search for ‘how to fix’ to get video guides on those topics. The feature comes in handy when you don’t know the name of the part, which can make a textual search very difficult.
Google is also working on a “Things to know” section, which will show more context for the phrases you search for. For example, if you search for “acrylic painting,” the results will include entries like “how to get started with acrylic painting” and “what household items you can use” for the activity. It’s Google’s attempt to read beyond just the search phrase. The section will not appear when you search for certain unspecified sensitive terms, though Google says it is not placing limits on any topics.
Google’s new AI model will also let you zoom in and out of search results in the coming months, as it refines and broadens the results shown.
Additionally, Google will show a visual, scrollable results page if you are searching for inspiration with a phrase like “pour painting ideas.”
The new AI will not only contextually supercharge search results but also use MUM to flesh out topics related to a video, even if the clip’s title doesn’t mention them.
The new AI brings features such as expanded use of Google Lens, machine learning that finds correlations between words, and more contextual information on the topics you search for.
As The Verge’s Dieter Bohn noted, Google still has to answer questions about bias in machine learning data and about its treatment of researchers who may not share the company’s vision in all areas.
Google may try to avoid these questions, but rest assured we will see more AI integrated into Search and other products in the near future.