
Google Lens will change the way your smartphone camera sees the world

Google Lens revealed

On Wednesday morning in Mountain View, California, Google took the stage for its annual I/O developers conference. The first big announcement of the day came in the form of Google Lens: a “set of vision-based computing capabilities” that will give Google Assistant the ability to analyze and identify the things it sees through your smartphone’s camera.

During the keynote address at Google I/O, CEO Sundar Pichai demonstrated how the new feature could be used to identify almost anything. Point the camera at a restaurant and Google Assistant will tell you information about that restaurant. Point the camera at a flower and the Assistant will identify it for you. Point the camera at the Wi-Fi details on the bottom of a router and you’ll be automatically connected to the network. These are just a few of the ways that Google Lens upgrades the Assistant.

As The Verge points out, this isn’t the first time that Google has added exciting new functionality to smartphone cameras. With the Google Translate app, users can point their smartphone cameras at text in another language and see a translation on their screen almost instantaneously. Google Lens appears to build on similar technology, but with far more possibilities than translating road signs.

According to Google, the Lens feature will come first to Google Assistant and Photos, but the company has yet to announce when it will actually be available for consumers to try out.

Jacob Siegal is Associate Editor at BGR, having joined the news team in 2013. He has over a decade of professional writing and editing experience, and helps to lead our technology and entertainment product launch and movie release coverage.