Some New Google Lens Features Announced Today at Google I/O


Much like the Google Assistant, Google Lens launched as a Pixel exclusive and later made its way to other compatible devices. As of today, it's rolling out to the camera apps of several devices from manufacturers such as LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and more. The concept behind Lens is simple: you snap a picture, and Lens tries to tell you more about it. The initial results were lacklustre, and often hilarious, but today at Google I/O some exciting new features were announced. Let's take a look at what they are:

Smart Text Selection



Smart Text Selection identifies text in an image, and even lets you copy and paste that text straight to your phone. Lens also helps you make sense of a page of words by surfacing relevant information and photos. Say you're at a restaurant and see the name of a dish you don't recognize: Lens will show you a picture to give you a better idea. A real-time translation feature (like the one we've seen on the Huawei Mate 10) is also in the works.

Style Match with Lens


Style Match with Lens shows you objects similar to the one you've pointed your camera at. Let's say a piece of clothing piques your interest and you want to know more about it. Google Lens will get you the info on that specific item, and even show you things in a similar style that fit the look you like.

Real-time Lens integration with the camera app


On supported devices, Lens now works in real time within the camera app. Users no longer have to select a specific item and wait for results to load. On-device machine learning, combined with a cloud component, delivers immediate results as users move their camera.