
Google Search: Google introduces new features such as multisearch, Lens AR Translate, and more

At its ‘Search On’ event, Google introduced a host of new capabilities for its search engine, many of which will make results richer and more visually appealing. The company says it is moving well beyond the search box to build search experiences that work more like our minds and are as multidimensional as people, so that users can find exactly what they are looking for by combining images, sounds, text, and speech. “We call this making Search more natural and intuitive,” said Google SVP of Search Prabhakar Raghavan.

Multisearch feature

First, Google is bringing its multisearch feature, which debuted in beta in April this year, to English globally; it will roll out in 70 additional languages over the next few months. Multisearch lets users search with images and text at the same time, and it works through Google Lens. Google says people use Lens to search for what they see nearly eight billion times a month.

With the integration of Lens and multisearch, users will also be able to take a picture of an object and add the words “near me” to find it nearby. This “new method of searching will help people find and engage with local companies,” according to Google. “Multisearch near me” will begin rolling out in English in the US later this fall.

According to Raghavan, this is made possible by a thorough understanding of product inventories and geographical locations, and multisearch and Lens were “inspired by the millions of photos and reviews on the web.”

Updates in translation

Google is also improving how translations appear over images. The company says people use Google to translate text in images into more than 100 languages over 1 billion times a month. With the new technology, Google will be able to “blend translated text into complicated pictures, so it looks and feels a lot more natural”: the translated text will appear seamless and integrated rather than standing out from the original image. Google says it achieves this using “generative adversarial networks (also known as GAN models), which helps fuel the technology powering Magic Eraser on Pixel.” The feature will become available later this year.

Google is also updating its iOS app to place shortcuts right beneath the search box, letting users search for a song, translate text with their camera, and more.

Google Search updates

When people search for information about a location or subject, the search results on Google Search will also become more visually appealing. In the example Google provided, the initial set of results for a search for a city in Mexico also included videos, photographs, and other content related to the location. According to Google, this will prevent users from having to open numerous tabs in order to learn more about a location or a subject.

In the coming months, Search will also surface more relevant information as a user begins typing a question. To help users formulate their queries, Google will offer “keyword or topic possibilities.” For some of these topics, such as cities, it will also feature content from creators on the open web alongside travel tips and more. According to the company’s blog post, Search will show the “most relevant content, from a number of sources, regardless of the format the information arrives in — whether that’s text, images, or video.”

Menu information while searching for food

When looking for food—whether it is a specific dish or an item at a restaurant—Google will display visually richer results, including images of the sought-after dish. Additionally, it is “covering additional ground for digital menus and enhancing their visual quality and dependability.”


The company claims that in order to produce these novel results, it combines “menu information provided by people and merchants, and found on restaurant websites that use open standards for data sharing.” It also relies on its “image and language understanding technologies, including the Multitask Unified Model.”

Google stated in a blog post that the menus will highlight the most popular dishes and conveniently flag different dietary options, starting with vegetarian and vegan.

Google is also changing how shopping results appear in Search, making them more visual with links and letting users shop a “full look.” Search results will also offer 3D shopping, starting with sneakers, which users will be able to view in 3D directly from the results.

Google Maps

Google Maps is also getting new features that will provide more visual information, though most of these will only be available in select areas. Among other things, users will be able to check a “neighbourhood vibe” to figure out where to eat, what to see, and more in a given area.

Tourists will find this interesting because they can utilise the information to get to know a neighbourhood better. Google claims that in order to provide this information, it combines “AI with local knowledge from Google Maps users.” In the upcoming months, Neighbourhood Vibe will launch on Android and iOS worldwide.

Expansion of Maps

The immersive view feature is also being expanded to give users 250 photorealistic aerial views of famous landmarks around the world, from the Acropolis to the Tokyo Tower. In its blog post, Google said immersive view relies on “predictive modelling” to automatically learn historical trends for a location. In the coming months, immersive view will roll out on Android and iOS in Los Angeles, London, New York, San Francisco, and Tokyo.

The Live View feature will also let users see useful information. Users who are out and about can use Search with Live View to find a nearby market or store. In the coming months, Search with Live View will roll out on Android and iOS in London, Los Angeles, New York, San Francisco, Paris, and Tokyo.

Google is also bringing its eco-friendly routing feature, already available in the US, Canada, and Europe, to third-party apps via the Google Maps Platform. Google expects companies in other sectors, such as delivery and ride-hailing services, to give customers the option of eco-friendly routing in their apps and to track fuel usage.
