Visual search: an ecommerce revolution

Discover the visual search advancements in machine learning.

Published by Alexis Pratsides



Did you know that 90% of the information transmitted to our brains is visual? Or that we can identify images viewed for as little as 13 milliseconds? Research conducted by 3M even concluded that we process visual information 60,000 times faster than text, suggesting that we are wired to take in the world visually.

With this in mind, it isn’t surprising that the way we search online is changing. Thanks to advancements in technology we are no longer limited to text-based searches; we can now search the Internet by voice. And the way we search is changing further still as visual search grows in popularity. This makes sense: we use our eyes every day, so why not take advantage of that natural ability? In search, a picture really is worth 1,000 words.

What is Visual Search?

Visual search uses a photo as the query instead of text or voice, allowing users to discover information about, or products similar to, the photo they submit. It relies on machine learning and neural networks to fulfil searches when a user uploads or snaps a photo. The search analyses characteristics within the photo, such as colour, material and pattern, and finds related images that share them.

Machine learning provides similar results based on pixel-by-pixel comparison of images. Additionally, results draw on the metadata and keywords attached to each image.
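To make the idea concrete, here is a toy sketch of pixel-based matching using colour histograms and cosine similarity. This is a deliberately simplified illustration, not how Pinterest or Google Lens actually work (production systems use deep neural network embeddings), and the image and catalogue data are invented examples.

```python
from math import sqrt

def colour_histogram(image, bins=4):
    """Count pixels per quantised (r, g, b) bucket; image is a list of (r, g, b) tuples."""
    step = 256 // bins
    hist = [0] * (bins ** 3)
    for r, g, b in image:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    return hist

def cosine_similarity(a, b):
    """Similarity of two histograms: 1.0 means identical colour profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def most_similar(query, catalogue):
    """Return the catalogue key whose image histogram is closest to the query image's."""
    qh = colour_histogram(query)
    return max(catalogue, key=lambda k: cosine_similarity(qh, colour_histogram(catalogue[k])))
```

For example, a photo that is mostly red will match a "red dress" catalogue image ahead of a "blue jeans" one, because their pixels fall into the same colour buckets. Real systems learn far richer features (shape, texture, pattern) than raw colour, which is why they can tell a dress from a red sofa.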

As more people use visual search, the underlying machine learning models will get better at surfacing highly relevant image results, making visual search more accurate and efficient and further changing the way we search.

Big Players in Visual Search

The major players in visual search are Pinterest, eBay, Google Lens and Bing, along with retailers such as ASOS, Neiman Marcus and Target.


Pinterest

Pinterest is easily the most well-known visual search engine, handling 600+ million visual searches each month. The Pinterest tool lets users take a photo and returns other photos (known as pins) that are similar to the one submitted. Results are based on image-derived keywords, indicating that keywords are still relevant.



Pinterest also offers Shop the Look, which allows users to tap blue dots on an image to search for similar results. Also available is Pinterest Lens, which enables real-time image search and is embedded in the Pinterest app.

Google Lens

Google Lens is the successor to Google’s earlier visual search engine, Google Goggles, and is Pinterest’s biggest contender in the visual search market. The tool analyses a wide range of photos, from buildings to flowers and much more. An example of Google Lens’s ability to return information is shown below, as it identifies the breed of a dog.



Take a look at how Google Lens works in a number of different ways and how accurate the results can be.


Snapchat

Snapchat has also made use of visual search by partnering with Amazon. Users point their camera at a product and hold the screen until an Amazon product card pops up, which takes them to the product results on Amazon, as shown in the image below. In effect, this makes it a visual search tool for both Snapchat and Amazon.




ASOS

Online fashion retailer ASOS launched its own visual search tool, Style Match. In the ASOS app, customers tap the camera icon, which prompts them to take or upload a photo. Once a photo has been submitted, search results for similar products are returned. The process is shown in the photo below, and the results are highly accurate. The tool is extremely useful for ASOS and its customers, as it drastically narrows down the choices on offer.




Target

Target incorporated Pinterest Lens into its app through a partnership with the visual search giant. This allows customers to search for items by taking or uploading a photo in the Target app.

Information is provided for the product itself if it’s from Target, or for similar products if the photographed item was purchased elsewhere.



Benefits of Visual Search

Visual search will benefit not only the consumer but also e-commerce retailers, particularly in fashion and homeware.

Visual search helps users who can’t put their question into words. For example, if you want to know more about a landmark but don’t know its name, you can’t Google it. This is where visual search comes in: search with a photo of the landmark and information about it will appear.

Additionally, visual search helps with the ‘discovery problem’, improving the user experience. The problem arises because retailers offer so many options that consumers scroll endlessly through items and end up abandoning their shopping. Visual search solves this by instantly narrowing down the results, helping consumers find what they’re looking for more effectively and reducing the number who give up in the face of too many options. It makes purchasing much easier by saving consumers time and filtering out irrelevant items.

Furthermore, visual search gives retailers the opportunity to cross-sell: other items that feature in a searched image can be suggested to the consumer, tempting them to add those items and increasing their overall spend on the site.

What Does This Mean for Text-based Searches?

Although visual search provides many benefits to both consumers and retailers, it is important to note that this method of searching won’t replace traditional text-based searches or make the use of keywords obsolete. Rather, visual search will work in tandem with text-based search to effectively provide results related to the image. Therefore, text-based searches will continue to be important.

The Future of Visual Search

Although the major players have made great strides in visual search, there is still a lot to do. As the technology continues to advance, the popularity of visual search is likely to increase, which will in turn improve the accuracy and efficiency of results: as more people search visually, machine learning models and neural networks will become better at identifying similarities in photos.

In fact, the popularity of visual search on Pinterest has already increased in the space of a year. As shown in the figure below, visual searches increased from 250 million in February 2017 to 600 million in February 2018 – increasing by 140% in a year!

Figure: monthly visual searches on Pinterest, February 2017 vs February 2018

ViSenze found that 62% of millennials would like to be able to search for products visually, suggesting that the appetite for visual search lies with millennials and Gen Z.



How to Optimise for Visual Search

Kissmetrics concluded that visuals are the deciding factor in purchasing decisions for 93% of consumers, highlighting the importance of optimising your images. Below are our top tips for optimising for visual search.

  • Ensure your images are displayed clearly without clutter for easy processing
  • Add descriptive alt text to images – including image name, image ID and relevant keywords
  • Submit images to a sitemap
  • Optimise image titles and alt attributes using relevant keywords
  • Images should be mobile and desktop friendly
  • Use ideal image size and file type
  • Set up image badges
  • Conduct structured data tests
  • Add structured data markup for images, content pages and product information
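The sitemap tip above can be sketched in code. Below is a minimal generator for an image sitemap using Google’s image sitemap extension namespace; the page and image URLs are invented examples, and a real sitemap would be written to a file and referenced from robots.txt or submitted in Search Console.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def build_image_sitemap(pages):
    """pages: dict mapping a page URL to the list of image URLs on that page."""
    ET.register_namespace("", SITEMAP_NS)       # default namespace for <urlset>/<url>/<loc>
    ET.register_namespace("image", IMAGE_NS)    # "image:" prefix for the image extension
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url, image_urls in pages.items():
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
        for img in image_urls:
            image = ET.SubElement(url, f"{{{IMAGE_NS}}}image")
            ET.SubElement(image, f"{{{IMAGE_NS}}}loc").text = img
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical product page with one image:
sitemap_xml = build_image_sitemap(
    {"https://example.com/red-dress": ["https://example.com/images/red-dress.jpg"]}
)
```

Listing images this way tells crawlers exactly which images belong to which page, which supports the keyword and markup tips above.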

Concluding Remarks

It is extremely important to optimise your images now, before visual search properly takes off, so that you can take full advantage of its benefits. Leaving your images unoptimised will hinder your business once visual search becomes the norm.

If you’re looking for help optimising your product images from the heart of the capital, our ecommerce seo London team can get things rolling. Contact us today at MintTwist to find out more about how we can help you achieve meaningful results.
