Google is officially rolling out its Lens feature to Apple iOS devices over the next few weeks. Lens is an on-demand object recognition tool, accessible through the Google Photos app. When users take a photo of a book or painting, for example, Lens recognizes the object and surfaces additional details about it.
Google Lens is Google’s AI image recognition technology at its best. Just point the camera at different objects with Google Lens turned on, and you’ll get additional information about what you’re looking at, as well as advanced integration with other apps.
It first launched on Google’s Pixel phones at the tail end of 2017, before arriving on all Android phones in March 2018. As of today, March 16, iOS users can also tap into Google Lens’s deep-learning smarts through the iOS Google Photos app.
If you’re looking at landmarks or photos in a museum through Lens, Google will tell you what it knows about them. That’s pretty neat if you want some quick information about the things around you.
Google Lens is powered by the company’s AI smarts, letting users scan business cards to create a contact, or point at landmarks and book covers to receive more information about them.
Because this is a Google product, Google Lens does collect data about your activity. That’s no surprise. However, you can delete your Lens history from your Google account, just as you would with everything else the company collects about you.