After you finish eating, Lens can help you with the bill too. It's one thing to see a 3D model on a screen; it's another to see it up close in relation to the things around you.
This means that when Android Go users point their cameras at text, Lens can now read it out loud.
There is no word yet on whether the company plans to add this feature to text ads. But those searching for more information about lions and tigers, for example, will get an option to view the animal as a 3D image in the Knowledge Panel, which appears to the right of the search results, with assistance from AR.
AR is being used for other things as well. If you see a dish you'd like to cook in an upcoming issue of Bon Appetit magazine, you'll be able to point your camera at the recipe and watch the page come to life, showing you exactly how to make it. Google is also working with partners such as NASA, New Balance, Samsung, Target, Visible Body, Volvo and Wayfair to add more AR content to search.

Google Maps already offers personalized restaurant recommendations, including a score in the Explore tab of how likely you are to enjoy any given restaurant. It then uses that information to highlight the most popular dishes right on the screen in real time.
In a short video played at the event in San Francisco, Urmila, who cannot read, used Google Lens on her smartphone to understand words written in Hindi by having the app read them out loud.