Google announced today that it is revamping its image search functionality, making its image search capabilities even stronger and adding new features, including Google Lens and AI integration.
With each passing year, Google has improved its image-identification algorithm. Instead of relying on a written description of an image, Google can now identify images just by examining their pixels.
Google’s overall goal is to enhance visual search. The company wants to give people more accurate visual results and to assist them in work that involves a lot of visual material.
To that end, it is increasing the visual content in its results, including videos and images: visual recipes, sports clips, and illustrated weather reports, for example. But Google wants to go above and beyond this, so the Silicon Valley giant has come up with four new concepts.
The first major update is the development of AMP stories using Artificial Intelligence (AI). If a user searches for a celebrity or other famous person in Google Search, they will be shown AMP stories about that person.
These AMP stories will be brief biographies of the celebrity, including biographical details, career highlights, and a glimpse into their personal life, such as their marriage. The biography isn’t written by a human, or by anyone working at Google; rather, it is curated by AI using sophisticated learning algorithms.
Featured Videos in Google Search
Videos can be a quick and flexible way to research a topic: users can dive deep into a video and extract all the relevant information. But the videos have to be related to the user’s search query; otherwise, they are useless.
With the help of computer vision, Google will now be able to analyze videos, identify the most useful information in them, and present the most relevant ones in a new option called Featured Videos.
For example, if a person is planning a trip to New York and searches for places to visit there, the latest algorithm will understand this query and present videos on famous tourist destinations, such as Madison Square Garden, Central Park, and Broadway.
Websites with better content will be given priority
Google Images will now favor results for images that have a good website backing them. This means that websites that are updated frequently will be given priority. Google will also prioritize sites where the image is “central to the page, and higher up on the page.”
So if a person is looking to buy a specific pair of jeans, a product page dedicated to that specific pair will rank higher than, say, a product category page for jeans.
Google will also show the title of the image’s page as a caption to make searching easier, and it will suggest related images.
Google Lens will help you explore images within images
Lens’ AI technology examines an image and identifies sub-images that could be of interest. If you select one of these sub-images, Lens will produce other relevant images. Many of these sub-images will link to product pages where you can buy the items, or let you search for related images more extensively.
Lens will also let users “draw” on an image to create their own sub-images, which they can then search as well.
Do let us know what you think in the comments section below!