One of the big announcements on the first day of the annual Google I/O 2017 conference was the new Google Lens, a feature that promises to completely change the way smartphone cameras interact with the everyday situations users encounter. The feature is quite promising and will work in conjunction with Google Assistant, analyzing and detecting everything that the devices' optical sensors "see" in the real world.
During the portion of the keynote dedicated to Lens, the company's CEO, Sundar Pichai, demonstrated how the tool works in practice, identifying and interacting with just about anything around it. Want some practical examples? Point your smartphone's camera at a restaurant and Google's personal assistant will present you with various details about the place. Want to know which plant is growing in your yard? Just use Lens and Google Assistant will fetch that information for you.
Now comes one of the coolest features, one that caught the eye of the most attentive viewers. Want to connect to a Wi-Fi network? Just point your gadget's camera at the router's network label and the software will take care of setting up the connection for you. These are just a few practical examples of using Google Lens in your day-to-day, and it is undoubtedly a feature that will evolve considerably over time.
Check out a brief presentation of Lens in practice:
— Google (@Google) May 17, 2017
According to the Mountain View company, Lens will arrive first in apps like Google Assistant and Google Photos. However, there is still no specific date for the novelty to begin reaching end consumers. It also remains to be seen whether it will be restricted to devices in the Pixel line or reach everyone through the Google Play Store.
So, what do you think of this partnership between Google Lens and Assistant? Tell us in the comments.