Google may have teased us with exciting new AR features for the Maps app, but it’s not forgetting to make its Lens camera more useful, either. Since its launch last year, Lens has rolled out to iOS and gained a few skills, like identifying cat and dog breeds. At its I/O developer conference today, Google announced three new features for Lens — Smart Text Selection, Style Match and Real-time results. After checking it out here at the show, I’m most intrigued by the text-recognition tools, which actually seem useful.
What Smart Text Selection does is scan words in documents around you and let you interact with them on your phone. For example, point your camera at a menu, tap on it and see explanations for dishes you aren't familiar with. During our demo, a Google rep used Lens to snap a picture of a book's inside cover, then highlighted the text in the photo. Just like you'd expect with words in a messaging or editing app, a menu popped up to let you copy your selection and paste it elsewhere. I was already impressed that this worked at all, but was even more pleased to see that the words in the picture didn't have to be perfectly aligned for Lens to recognize them.
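To get a feel for the alignment problem that last part hints at, here's a minimal sketch of how recognized words might be grouped back into copyable lines. Everything here is a made-up illustration, not Google's actual pipeline: it assumes some OCR step has already produced word boxes as `(text, x, y)` tuples, and simply clusters words whose vertical positions fall within a tolerance, so slightly skewed text still reads as one line.

```python
# Hypothetical sketch: turning OCR word boxes into copyable text.
# Each box is (text, x, y), with (x, y) the word's top-left corner.
# Words land on the same line when their y values are within a
# tolerance, so imperfectly aligned words are still grouped together.

def words_to_lines(boxes, y_tolerance=10):
    lines = []  # each entry: (representative_y, [(x, text), ...])
    for text, x, y in boxes:
        for line in lines:
            if abs(line[0] - y) <= y_tolerance:
                line[1].append((x, text))
                break
        else:
            lines.append((y, [(x, text)]))
    # Sort lines top to bottom, and words within a line left to right.
    lines.sort(key=lambda line: line[0])
    return "\n".join(
        " ".join(text for _, text in sorted(words))
        for _, words in lines
    )
```

With a tolerance like this, the two words of a tilted heading (say, at y = 100 and y = 103) still come out as one line, which is roughly the forgiveness the demo showed.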
Style Match is a more familiar tool, similar to Amazon's Firefly, AR View and any number of other visual shopping features. Scan a blouse or a quirky lamp that you like, and a blue dot appears on top of it. Tap that, and Lens will display a list of items it's found that are similar in style. A couple of times during our demo, Lens even found the exact clothes or shoes we scanned, and showed them first in a list of other similar products. Lens didn't simply return clones of the floral shirts and brown shoes that Google placed in its booth, either. There was a nice variety of stuff that resembled the original but differed in small ways. This is great for people who are looking for more options but don't want to buy, say, five of the same shirt.
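That ranking behavior, where an exact match lands first and merely similar items follow, is the classic shape of a nearest-neighbor search over image feature vectors. As a hedged sketch only (the vectors, item names and `rank_matches` helper are all invented for illustration; real systems use learned embeddings from an image model):

```python
import math

# Hypothetical sketch: ranking catalog items by visual similarity.
# Assumes each item has already been reduced to a feature vector by
# some image model; the catalog and vectors here are made up.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_matches(query_vec, catalog):
    """Return catalog item names, most visually similar first."""
    return [name for name, vec in
            sorted(catalog.items(),
                   key=lambda kv: cosine(query_vec, kv[1]),
                   reverse=True)]
```

An item whose vector equals the query scores a perfect 1.0 and sorts to the top, which is the "exact shirt first, lookalikes after" ordering from the demo.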
Finally, there’s “real-time results,” which isn’t so much a new feature as it is a general interface improvement. Instead of having to press a shutter button and wait for Google Lens to process the image, the app will be able to overlay information on objects in the viewfinder in real time. So as a Google rep swept his camera across the room, I saw dots bubble up on the screen, indicating that Lens was finding items. In a couple of seconds, blue tags appeared on items like a blouse or a book, meaning more information for those things was available. On the LG G7 that our rep used for the demo, this all happened quickly, with only minor noticeable delays as Lens analyzed the scene.
Speaking of, the G7 is one of the first phones to get Google Lens built into its native camera app, alongside the Pixel phones. More handsets from brands like Sony will get this baked in later this year as well. That’s a nice convenience and saves users the hassle of having to install a new app that would take up space in an apps list.
All told, I’m most excited by the coming text integration feature. It seems to be the most useful, and I can see myself taking photos of leaflets, poems, press releases, spec sheets or contracts and pasting the words into my note-taking apps. We’ll have to wait until the updates roll out “in the next few weeks” to know how effective they’ll be outside of a demo situation, but until then I’m cautiously hopeful that the new features will make for a much more useful Lens.
Cherlynn is reviews editor of Engadget. She led a mostly unexciting life in Singapore, her home country, until she came to New York in 2012. Since then, she’s earned her master’s in journalism from Columbia University’s Graduate School of Journalism and covered smartphones and wearables for Laptop Mag and Tom’s Guide. Life is now like a Hollywood movie, with almost as many lights and much more Instagram. And also more selfies.