
Google Lens multisearch lets you search with text and images at the same time

Updated Dec 20th, 2022 4:39PM EST
Image: Google's new multisearch feature. Credit: Google


There are few tools in the modern era more vital than search engines. They often serve as a gateway to the rest of the internet, whether you need to buy new clothes, make new friends, start a business, or schedule a doctor’s appointment. But search engines are not perfect. They can’t answer a question you don’t know how to ask. In other words, Google can’t read your mind, but a new feature called multisearch might be the next best thing.

Google introduces multisearch feature

As Google noted in a blog post on Thursday, users can’t always come up with the words to describe precisely what they are looking for. With multisearch, they can use text and images together to show Google Search what they want rather than trying to tell it.

To use multisearch, you’ll need the Google app, which is available for free on iOS and Android. After opening the app, tap the Lens camera icon. From there, you can either choose an image from your photo library or take a photo of something nearby. After selecting the photo you want to use, swipe up and tap the “+ Add to your search” button. This lets you add text to give Google the context it needs.

Here are some examples of how multisearch can outperform a standard Google search:

  • Screenshot a stylish orange dress and add the query “green” to find it in another color
  • Snap a photo of your dining set and add the query “coffee table” to find a matching table
  • Take a picture of your rosemary plant and add the query “care instructions”
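
Under the hood, a query like the first example above can be thought of as multimodal retrieval: the image and the added text are each turned into an embedding, fused into a single query vector, and matched against a catalog. Google hasn’t published how multisearch is implemented, so the sketch below is purely illustrative; the combine() fusion step, the toy three-dimensional embeddings, and the made-up catalog are all assumptions, not Google’s actual system.

import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def combine(image_vec, text_vec):
    # Hypothetical fusion step: average the image and text embeddings.
    return [(i + t) / 2 for i, t in zip(image_vec, text_vec)]

# Toy 3-dimensional embeddings; imagine the axes loosely encode
# (dress-ness, orange-ness, green-ness).
image_query = [0.9, 0.8, 0.1]  # screenshot of an orange dress
text_query = [0.0, 0.0, 1.0]   # the added word "green"

catalog = {
    "orange dress": [0.9, 0.9, 0.0],
    "green dress": [0.9, 0.1, 0.9],
    "green sofa": [0.1, 0.0, 0.9],
}

query = combine(image_query, text_query)
ranked = sorted(catalog, key=lambda name: cosine_similarity(query, catalog[name]), reverse=True)
print(ranked)  # ['green dress', 'orange dress', 'green sofa']

The added text nudges the image query along the “green” axis, so the green dress outranks the orange one. A real system would use learned, high-dimensional embeddings rather than hand-written vectors, but the fusion idea is the same.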

How does the feature work?

In the blog post on Thursday, Google Search product manager Belinda Zeng offered this short explanation of how multisearch works:

All this is made possible by our latest advancements in artificial intelligence, which is making it easier to understand the world around you in more natural and intuitive ways. We’re also exploring ways in which this feature might be enhanced by MUM – our latest AI model in Search – to improve results for all the questions you could imagine asking.

Google says that multisearch is now available in English as a beta feature in the US, so you can try it out in the Google app today.



Jacob Siegal, Associate Editor

Jacob Siegal is Associate Editor at BGR, having joined the news team in 2013. He has over a decade of professional writing and editing experience, and helps to lead our technology and entertainment product launch and movie release coverage.