
Google Lens revamp in development with ‘filters’ for Dining, Shopping, & Translate

At I/O 2018, Google revamped Lens with real-time recognition, smart text selection, and native camera app integration. A year later, another redesign of Google Lens, one that adds specific recognition “filters,” is in development.

In this ‘APK Insight’ post, we’ve decompiled the latest version of an application that Google uploaded to the Play Store. When we decompile these files (called APKs, in the case of Android apps), we’re able to see various lines of code within them that hint at possible future features. Keep in mind that Google may or may not ever ship these features, and our interpretation of what they are may be imperfect. We’ll try to enable those that are closer to being finished, however, to show you how they’ll look should they ship. With that in mind, read on.

For the past several beta releases, the Google app has been adding new features for Lens. Version 9.61 from earlier this month detailed various filters for Lens, including Translate, Dining, and Shopping.

<string name="translate_filter_name">Translate</string>
<string name="dining_filter_name">Dining</string>
<string name="shopping_filter_name">Shopping</string>

With Google app 9.72 this evening, we have enabled the new Google Lens interface. Still in development (and currently lacking any branding), this revamp places five filters at the bottom of the screen. Each filter presumably puts Lens into a specific mode or accesses a dedicated visual tool.

[Image: Google Lens filters revamp (9to5Google)]

In the middle is a magnifying glass icon that’s likely for general search. To the left is the aforementioned “Translate” filter that will likely focus Lens on translating foreign text into your designated language. Going by past strings, this translation feature is more advanced than the current implementation, which just recognizes text and offers to open the Google Translate app.

The new Lens could “Auto-detect” a language, with users having the option to “Change languages” from the “Source” text to the “Target.”

<string name="lens_translate_apply_button">Apply</string>
<string name="lens_translate_filter_params_auto_language">Auto-detect</string>
<string name="lens_translate_filter_params_auto_language_long">Auto-detect language</string>
<string name="lens_translate_filter_params_label">%1$s → %2$s</string>
<string name="lens_translate_filter_params_title">Translate options</string>
<string name="lens_translate_source_label">Source</string>
<string name="lens_translate_target_label">Target</string>
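These are standard Android string resources, and the `%1$s`/`%2$s` placeholders in `lens_translate_filter_params_label` are positional format arguments that would be filled with the source and target language names at runtime. As a minimal sketch of that behavior (our own illustration using plain `String.format` semantics, not Google’s actual code; the class and method names are ours):

```java
public class TranslateLabel {
    // The raw resource value from the APK, with positional placeholders.
    static final String LABEL = "%1$s \u2192 %2$s"; // "%1$s → %2$s"

    // Android's getString(resId, args...) ultimately delegates to
    // String.format, substituting arguments by position.
    static String label(String source, String target) {
        return String.format(LABEL, source, target);
    }

    public static void main(String[] args) {
        // "Auto-detect" is the default source per the strings above.
        System.out.println(label("Auto-detect", "English")); // Auto-detect → English
    }
}
```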

The next tool could be optimized for copying text. Google Lens has long featured smart text selection for quick optical character recognition (OCR), like taking action on addresses and phone numbers, and copying large swaths of text.

[Image: Google Lens (9to5Google)]

To the right of the main visual search mode — which would still be leveraged for identifying monuments, plants, animals, etc. — are “Shopping” and “Dining” icons. The former presumably works to identify clothing and furniture, with results specifically tuned to product search. AR Shopping is another feature in development.

The Dining filter is more mysterious, as no equivalent feature currently exists in Lens, but strings suggest something more akin to Google Maps AR. Users might be able to point their camera at the real world and see only restaurants highlighted in the viewfinder. One string in Google app 9.66 requested location permission to “Search restaurant nearby,” possibly in conjunction with this mode.

<string name="search_nearby">Search restaurant nearby</string>
<string name="permission_not_granted">To see popular dishes, turn on location</string>

The idea of adding filters (or specific modes) to Google Lens might seem counter to the idea of just pointing a camera at the world and getting results. However, this added layer of complexity can provide more refined results. Users can choose what help they receive rather than having Google attempt to figure it out automatically and possibly return an unrelated result.

Before Google Lens was announced, we spotted it in development as “Visual Search.” One of the final iterations before Lens officially launched had an interface similar to the one we enabled today. However, that UI was likely for testing, and helped refine the accuracy of Lens by requiring testers to manually specify the search category.

It’s not clear when this Google Lens filters revamp will launch, but I/O is a good bet given that the visual search feature debuted at the developer conference in 2017 and received a major revamp the year after. Be sure to check out our full APK Insight of Google app 9.72, where we also enabled a Sleep Timer for Google Podcasts and several other features.

Credits: 9to5Google

About the Author

John Manor

John Manor is an author and an advertiser working for TechDomes bringing you the latest Gaming news and leaks every day. You can message John over email or Instagram.


Categories: Technology

