Google Lens comes to the Pixel 3 camera, can identify products – TechCrunch

Google Lens, the technology that combines the smartphone camera's view of the world around you with A.I., is coming to the Pixel 3 camera, Google announced this morning. That means you'll be able to point your phone's camera at something – a movie poster, say, to get local theater listings or look up an actor's bio, or a sign in another language to translate it – and see results right in the camera app itself.

The integration is a product of Google's investment in A.I. technologies, the common thread running through everything Google announced today at its hardware event.

Lens, in particular, was first shown off at Google I/O back in 2017, before rolling out to new products like Google Image Search just weeks ago. Now, it’s in the camera itself – the most obvious place for the technology.

With Lens, you can point your camera at a takeout menu, Google says, and it will highlight the number to call.

Another feature centers on shopping. With a long press, you can have Lens identify a product the camera sees in the viewfinder and match it to real products. Google calls this "Style Search."

As Google explained at the event, you can point your Pixel 3 camera at a friend’s cool new pair of sunglasses or some shoes you like in a magazine, and Lens will point you to where you can find them online and browse similar styles. The feature is similar to Pinterest’s visual search, which has been available for some time.

Style Search has been available in Lens since earlier this year, but Google took the time to call it out today at the event. Lens itself has also been built into the camera apps of older Pixel devices, as well as those from other manufacturers, including LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, and Asus.

Also of note: Lens will be able to perform some of its more common actions instantly in the camera, without a data connection.

Google says this is made possible by combining the Pixel's Visual Core chip with its years of work in search and computer vision.

"Being able to search the world around you is the next logical step in organizing the world's information and making it more useful for people," said Brian Rakowski, VP of Product Management at Google.
