Google Lens: No more “Googling” – Searching with the mobile camera is the new trend

Search was first typed, then spoken, and now it is visual – “If you can see it, you can search it.”

With the new generation of technology based on artificial intelligence (AI), searching for information is becoming more visual and even simpler, as Google announced while unveiling new AI features for Google Search, Google Maps and Google Translate.

Over time, the company’s investment in artificial intelligence has grown, and now “we can understand information in its many different forms – from understanding language, to understanding images, to video – and even understanding the real world,” said Elizabeth Reid, Vice President and GM of Search.

If you can see it, you can search it

Cameras have become a powerful tool to help explore and understand the world around us.

In fact, Google Lens is now used more than 10 billion times a month as people search for what they see using their camera or photos.

In the coming months, Android users will be able to use Lens to “search their screen.” With this technology, they will be able to search any image or video they see on websites and in familiar apps, such as messaging and video apps – without leaving the app or interrupting their experience.

An example

For example, let’s say a friend sends you a message containing a video of him exploring Paris. If you want to know more about a monument you spot in the background, you can simply press and hold the power button or home button on your Android phone (which activates Google Assistant), then tap “search screen”.

Lens then recognizes the monument as the Palais du Luxembourg, and you can tap to learn more. Or maybe you notice an interesting chair in your friend’s video and want to find out where to buy it, or find other chairs that look like it – the same “search screen” gesture shows you the available options.

Ways to search with Multisearch

With the multisearch feature, users can search with an image and text at the same time – opening up new ways to express what they are looking for. Today, multisearch is available globally on mobile, in all languages and countries where Lens is available.

Recently, multisearch was extended with local search: users can take a photo and add the words “near me” to their query to find what they need nearby. This feature is currently available in English in the US and will roll out globally in the coming months. In addition, users who are already searching and spot something that catches their eye will, in the coming months, be able to use multisearch globally on any image they see on the mobile search results page.

Now, Google says, it is creating search experiences that are simpler and more visual than ever before. In the future, with the help of AI, the possibilities will be endless.

To recap the announcements: multisearch is available worldwide in all languages. In Google Maps, Immersive View is coming to Venice, Florence, Amsterdam and Dublin, and Search with Live View is expanding to Barcelona, Madrid and Dublin. Indoor Live View now covers 1,000 new airports, train stations and shopping malls. Maps also gains new EV-related features, such as finding fast or regular charging stations along your route.

RES-EMP
