eBay Visual Search

Shop eBay with pictures. Users discover and research products online through blogs, social networks, and other apps - often largely through pictures.

What is visual search and why?

Use your mobile camera or images from your device's photo library to shop on eBay. Visual search is part of a larger effort at eBay to make shopping on the platform simple. Part of that goal is attracting millennial users to eBay.

My role

I worked as the lead designer on this project and brought it from concept to a released, working feature. The project required close collaboration among two core search engineering teams, three project managers, and the lead designer from eBay's core search platform. It took approximately nine months to design, test, and release the feature.

Project kickoff

At the start of the project I needed to define goals before starting in on designs. I needed to understand who the target audience would be, how visual search would provide value to customers, and what services would power the feature. (It was determined that we would release this feature on native mobile first.) I was also curious about people's current behaviors when shopping. Do people take photos while shopping, do they share images, or do they do research online using photos (e.g., searching within Google Images)? The answers to these questions would have a large impact on the design, so the first step toward creating a user story was a competitive analysis.

Intro research / Competitive analysis

Visual search is a fairly common feature found on most of the top e-commerce shopping apps. I downloaded quite a few shopping and non-shopping apps that utilized some form of image recognition. Entry points were a big question for me: I needed to know how users discover the visual search feature.

Here's a small list of apps that have an image search feature.

Visual search entry points

The competitive analysis clearly pointed out that most apps surface a small icon inside their search input as the main visual search entry point. The global search input felt like an appropriate place to surface the entry point since visual search is a subset of the textual search UI.

Camera UI / Experience

Beyond the entry point into the app, I wanted to see what the overall experience was like to enter the feature, take a photo, and view visually similar results. I also wanted to learn how accurate the results returned from an image search were.

The competitive analysis showed that most of the apps were able to return a result set that matched closely to the items I took pictures of. I also took special note that I encountered very little friction in the experience of entering the feature, taking a picture, and viewing results.

I observed how the camera components looked as well. I wanted to see whether companies were using the standard device camera or had invested in a custom camera component.

Macy's, Home Depot, Amazon, and a few other apps all used a custom camera UI instead of the device's default camera. The custom UI gives the app a feeling of excitement. For first-time users, this is the wow moment an app needs to encourage engagement. A custom camera also helps give users the context that they have entered a visual search experience (not to be confused with simply taking a picture).

The camera UIs shown above are a few of the ones I liked best during research. Amazon lets users perform several smaller types of visual searches (scanning barcodes as well as using photos) from within its camera UI.

Target audience

After completing the competitive analysis I worked with a member of the eBay research team to define the specific target customers for this feature. We created the following buckets to define the characteristics we wanted to focus on.

  • eBay Buyers (not focused on eBay Sellers for the MVP app release)
  • eBay Experience: non-eBay users, moderate users, power users
  • Age Group: Millennials, Young Professionals (Secondary interest in older age groups)
  • Interests: Fashion, Home Decor / Furniture (Secondary interest in Collectibles / Antiques)
  • Other: Tech Savvy, Pinterest / Instagram users

Rough concepts

I felt comfortable with where I was in this phase of the project and started to sketch out a few user flows. I wanted to get a feel for how the user would discover the entry point, enter the feature, snap a photo, and view relevant search results.

Wireframes... time to MOVE FAST!

The team had an extremely short amount of time to get this feature to market. We needed to get designs validated, get a proof of concept feature built for testing backend services, and hash out the general experience of how this feature would return visually similar results based on an image.

After sketching a multitude of flows I narrowed down a few concepts that I could use to start wireframing a basic user flow. These rough wireframes helped communicate a basic high-level idea of how the feature would work.

At the core of the experience, a user would perform the following steps (sketched in code after the list):

  • 1. Enter the feature from the entry point in the global search input
  • 2. Agree to the app access dialog (appears during first-time use only)
  • 3. A user takes a picture
  • 4. Image recognition service parses the picture
  • 5. Search service returns visually similar results
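
To make these steps more concrete, here is a minimal client-side sketch of steps 3–5, assuming a hypothetical image recognition endpoint and response shape. The URL, request format, and VisualSearchResult type are illustrative assumptions, not eBay's actual API.

```swift
import UIKit

/// Illustrative result type; the real search service's response shape is not public.
struct VisualSearchResult: Decodable {
    let itemId: String
    let title: String
    let imageUrl: URL
}

enum VisualSearchError: Error {
    case badImage, badResponse
}

/// Sends a captured photo to a hypothetical image recognition endpoint and
/// returns visually similar listings (steps 3-5 of the flow above).
func findSimilarItems(for photo: UIImage,
                      completion: @escaping (Result<[VisualSearchResult], Error>) -> Void) {
    // Compress before upload; full-resolution camera images are unnecessary for recognition.
    guard let jpeg = photo.jpegData(compressionQuality: 0.7) else {
        return completion(.failure(VisualSearchError.badImage))
    }

    // Hypothetical endpoint for the image recognition + search service.
    var request = URLRequest(url: URL(string: "https://api.example.com/visual-search")!)
    request.httpMethod = "POST"
    request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
    request.httpBody = jpeg

    URLSession.shared.dataTask(with: request) { data, _, error in
        if let error = error { return completion(.failure(error)) }
        guard let data = data,
              let results = try? JSONDecoder().decode([VisualSearchResult].self, from: data) else {
            return completion(.failure(VisualSearchError.badResponse))
        }
        completion(.success(results))
    }.resume()
}
```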

Design Iterations / UI Refinements

Once I started to move away from low-fidelity designs and onto feasible approaches, I began looking at how existing patterns could support this feature. Shoehorning this experience into an existing component set would not be ideal, but time constraints forced me to explore that possibility.

The camera component was an area I explored heavily during the post-wireframing stage. It is the only part of the experience that is new to eBay, with the exception of the camera component customers see when listing items for sale on the platform.

I worked through several variations of the camera to find what would work best for the customer. I also looked for areas where we could push innovation, such as auto-detection of items, support for barcode scanning (which would tie into an existing feature called Red Laser), and a way for users to access a help screen.

Through frequent meetings with my project managers, engineering team, and design leadership, I got closer to locking down the user flow and camera UI. This was not without its fair share of hurdles.

Challenges

1. Custom camera component OR default native camera
Due to the time constraint, we were forced to use the existing camera component from the user flow for listing items for sale on eBay. We could only make small modifications to support the visual search use case. The custom camera component was not critical to the experience; however, it is a much cleaner interface that lets the customer focus on an item more easily than the existing component does. One piece of the component that I fought for is the information icon. When a user taps this icon, a help screen appears that gives guidance on how to use the feature and get optimal results.

2. Support for scanning barcodes
Prior to the development of visual search, eBay built a feature called Red Laser that allowed customers to search for items by scanning a barcode with their phone. The entry point for this feature was also surfaced inside the global search input on native mobile. However, engagement with this feature was less than 2%, which meant that very few customers used it.

During the design of visual search, there were discussions from leadership about possibly removing Red Laser. I did not want to remove this service because I had an assumption about why engagement with Red Laser was low: the icon used as the entry point was not recognizable enough for customers to tap on it, and even if customers did tap on it, there was little context about how to use it or what it would do.

So, instead of removing this barcode scanning feature, I decided to fold it into the visual search feature. Search with a picture or scan a barcode to retrieve results... why not.

The entry point for performing a visual search invokes an action sheet menu where customers can choose to perform an image search or scan a barcode label. Ideally, the feature would not need an action sheet menu at all. The best experience would be for the feature to be smart enough to distinguish an ordinary object from a barcode without the user having to make a menu selection. However, the engineering effort to build autodetection would have been too costly and time-consuming for the initial release.
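
As a rough illustration of that entry-point menu (a sketch, not eBay's production code), the choice between a photo search and a barcode scan could be presented with a standard iOS action sheet. The two handler functions are hypothetical placeholders.

```swift
import UIKit

// Placeholder hooks for the two flows; in a real app these would launch the
// visual search camera UI and the barcode scanner respectively.
func startPhotoSearch(from viewController: UIViewController) { /* ... */ }
func startBarcodeScan(from viewController: UIViewController) { /* ... */ }

/// Presents the visual search entry-point menu as a standard action sheet.
func presentVisualSearchOptions(from viewController: UIViewController) {
    let sheet = UIAlertController(title: nil,
                                  message: "Search eBay with your camera",
                                  preferredStyle: .actionSheet)

    sheet.addAction(UIAlertAction(title: "Take a photo", style: .default) { _ in
        // Hypothetical: push the visual search camera flow.
        startPhotoSearch(from: viewController)
    })
    sheet.addAction(UIAlertAction(title: "Scan a barcode", style: .default) { _ in
        // Hypothetical: reuse the Red Laser barcode scanning flow.
        startBarcodeScan(from: viewController)
    })
    sheet.addAction(UIAlertAction(title: "Cancel", style: .cancel))

    viewController.present(sheet, animated: true)
}
```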

3. Forcing users to crop their image
After a user takes a picture, a crop overlay automatically appears. It makes users crop their subject matter before tapping the done button to view results.

Why does this happen? For the MVP release, the search science team was concerned that the image recognition service might not be able to properly detect the main subject against background noise. So, to ensure the best quality results from captured images, we force users to crop their images. Ouch! (This is not a permanent solution.)
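
For illustration, here is a minimal sketch of the crop step, assuming the overlay reports a normalized (0–1) selection rectangle. That convention and the helper itself are assumptions for this sketch, not documented eBay behavior.

```swift
import UIKit

/// Crops a captured photo to the region the user selected in the crop overlay.
/// `normalizedRect` uses 0...1 coordinates relative to the image size (an assumption
/// made for this sketch). For simplicity this ignores EXIF orientation; a production
/// version would normalize the image orientation first.
func croppedImage(_ image: UIImage, to normalizedRect: CGRect) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }

    // Convert the normalized selection into pixel coordinates.
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let pixelRect = CGRect(x: normalizedRect.origin.x * width,
                           y: normalizedRect.origin.y * height,
                           width: normalizedRect.size.width * width,
                           height: normalizedRect.size.height * height).integral

    guard let croppedCG = cgImage.cropping(to: pixelRect) else { return nil }
    return UIImage(cgImage: croppedCG, scale: image.scale, orientation: image.imageOrientation)
}
```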

4. Visual searches on items that are in an unsupported category
Visual search initially supported only the Clothing, Shoes, and Accessories categories. This means that if a user performed a visual search on an item outside those supported categories, they would see a null result set, which would severely impact the user's trust in the feature. To combat this potentially disruptive scenario, we were able to at least direct the search results to a category relevant to the unsupported item.

The user has searched for an item that is listed inside an unsupported category. Instead of showing a null result page, we redirect the user to a category page relevant to the item that was captured.

We surface a call-to-action banner to give context for why they are on a category page instead of search results.
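
A simplified sketch of that fallback decision might look like the following; the supported-category list, types, and function here are assumptions for illustration only.

```swift
// Categories the image recognition service supported at MVP launch.
let supportedCategories: Set<String> = ["Clothing", "Shoes", "Accessories"]

enum VisualSearchDestination {
    case searchResults(items: [String])                    // normal visually similar results
    case categoryPage(category: String, showBanner: Bool)  // fallback with explanatory banner
}

/// Decides where to send the user after the recognition service labels the photo.
/// Rather than showing a null result set for an unsupported category, fall back to
/// a relevant category page and surface a banner explaining why.
func destination(forRecognizedCategory category: String,
                 matchingItems: [String]) -> VisualSearchDestination {
    if supportedCategories.contains(category), !matchingItems.isEmpty {
        return .searchResults(items: matchingItems)
    }
    return .categoryPage(category: category, showBanner: true)
}
```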

User Research

User research was performed with 6 participants. I worked directly with our researcher and helped create a script and criteria for her to use while moderating the test. For this user research session, we focused on:

Target users

  • eBay Buyers
  • eBay Experience: non-eBay users, moderate users, power users
  • Age Group: Millennials, Young Professionals (secondary interest in older age groups)
  • Interests: Fashion, Home Decor / Furniture (secondary interest in Collectibles / Antiques)
  • Other: Tech Savvy, Pinterest / Instagram users

Research goals

  • Gather data that will help the team map out next steps for the feature
  • Get feedback on the initial concept and design of the image search feature
  • Understand how an image search feature might provide value to the people who use our app
  • Understand people’s current behaviors when shopping, taking photos, capturing images online, and sharing images/links with others, and how these might impact product design for image search
  • Get feedback on competitor image search features

Questions to answer

  • Can the user easily find the new feature? I.e., when they open the eBay app, do they know they can now shop using images?
  • Is it easy to submit an Image Search query?
  • Is the user inclined to crop their image? Do they find the process intrusive?
  • Are the visual search results useful?
  • How do users compose their photos? Are the items nicely centered, zoomed in, and in focus?
  • Is the user’s intent different for an image search vs a standard search?
  • Do users see taking a picture of something to find similar items online as a valuable use case? Do they think they would more often take a photo or upload images they find online?
  • Would they use it in place of a text query because the feature is available? (I.e., even for things that are easy to describe.)

The Participants

*To protect the privacy of the participants, stock photos are used above.

We recruited 6 candidates to participate in this study. They were a mix of eBay and non-eBay users, all of whom are frequent online shoppers who use their mobile phones for shopping.

Learnings

Here are some of the more interesting things we learned from the study.

  • People are already taking a lot of photos in-store for reference, inspiration, and purchasing later.
  • People are more likely to take photos and run an image search later, versus a live image search by taking a photo in the moment.
  • Use cases
      ◦ Price comparison
      ◦ Finding aesthetically similar items
      ◦ Can’t find a link to buy an item
      ◦ Can’t describe the item easily (e.g., jelly shoes, playing cards)
  • Quality of results (definitions varied, but with some commonalities)
      ◦ People who cared about aesthetic items were OK with similar items
      ◦ Expected to see an exact match but were happy with similar items
      ◦ For commodity/utilitarian items, people want an exact match (not targeting this as much right now) (e.g., plates, books)
      ◦ One person expected to find aesthetically/stylistically similar items, but he got the exact chair and liked that
      ◦ Forgiving about results that don’t quite match because they know the technology is complex
  • Filter support
      ◦ People seem to want some sort of filtering or tags available post-search
      ◦ Keywords would be helpful to surface in the global navigation (context awareness)
  • Extend category support (beyond Clothing, Shoes, and Accessories)
      ◦ People seem more inclined to use it for aesthetic searches such as home decor
      ◦ Potential for exploration
  • Entry points (search results versus tooltip on search)
      ◦ People overall preferred the tooltip way of educating them about the feature
      ◦ The search box is a better placement for it - more noticeable
  • Build the guidance into the feature?
      ◦ People had a lot of questions, which probably means they are more inclined to read guidance
      ◦ Specific items will need specific guidance (e.g., shoes)
      ◦ Tips for better photos as they go along
      ◦ Category-based guidance
  • People who don’t use eBay
      ◦ The feature didn’t sway them to want to use eBay, but they liked it
      ◦ The concept is so new that someone who isn’t an eBay user may not be an early adopter
      ◦ People didn’t know the apps they already use had an image search feature

Interesting Discovery

Prior to the user research study, we did some initial research on the average customer's current behaviors around shopping and images. People get inspiration from all sorts of places: shopping online or in-store, someone’s house, social media, BuzzFeed, Pinterest, and blogs. These are all places that are off eBay.

There were a few comments from the participants during the study that validated an idea we had: we had been thinking about how we could enable users to share an external online image with the eBay app to trigger a visual search. I came up with a quick concept flow for how this could work and built a prototype that could be tested after the main objectives of the study were covered.

The goal of this project is to create a frictionless experience from the moment users find a product they are interested in to being able to run an image search on an e-commerce site like eBay.

Use Cases (Send external image to visual search)
Users leverage native iOS and Android 'share' functionality in apps, photos, and mobile websites to share content with friends via text message, email, social networks, etc. Instead of sharing the image to one of these communication platforms, users will be able to 'share' to the eBay app and easily run an image search on the image. This is the key difference: using 'share' to share into e-commerce rather than into messaging.

As a shopper, I’d like to easily run an image search on any image I come across while browsing the web or another app so that I can quickly see the relevant inventory on eBay.
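
To sketch how the 'share to eBay' idea could work on iOS (an assumption-laden illustration, not the shipped implementation), a share extension could pull the shared image out of the extension context and hand it off to the visual search flow. The handoff function is hypothetical; the NSItemProvider and share-extension APIs are standard Apple APIs.

```swift
import UIKit
import UniformTypeIdentifiers

// A minimal sketch of an iOS share extension view controller that receives an
// image shared from another app (Safari, Photos, etc.) and forwards it to the
// visual search flow. `forwardToVisualSearch` is a hypothetical handoff point.
class ShareToVisualSearchViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // The share sheet passes selected content in as NSExtensionItems.
        guard let item = extensionContext?.inputItems.first as? NSExtensionItem,
              let provider = item.attachments?.first(where: {
                  $0.hasItemConformingToTypeIdentifier(UTType.image.identifier)
              }) else {
            extensionContext?.completeRequest(returningItems: nil)
            return
        }

        provider.loadItem(forTypeIdentifier: UTType.image.identifier, options: nil) { [weak self] data, _ in
            var image: UIImage?
            if let url = data as? URL, let imageData = try? Data(contentsOf: url) {
                image = UIImage(data: imageData)   // image shared as a file URL
            } else if let direct = data as? UIImage {
                image = direct                     // image shared directly
            }

            if let image = image {
                self?.forwardToVisualSearch(image) // hypothetical handoff to the main app
            }
            self?.extensionContext?.completeRequest(returningItems: nil)
        }
    }

    private func forwardToVisualSearch(_ image: UIImage) {
        // In a real implementation this might write the image to a shared app group
        // container and open the main app via a URL scheme to run the search.
    }
}
```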

Key comments on the share-enabled search prototype:

  • Participants already take screenshots and share them with people via text
  • iOS 11 is building sharing into the OS, making it easier to share
  • The share flow was useful from outside eBay
  • Positive reaction to being able to do this
  • Need to teach people about features so they know they’re there
      ◦ Need robust product marketing
      ◦ Instagram ads, integrations, eBay callouts
  • The share button in the Safari browser felt more natural to people

Big Takeaways

Some of the biggest takeaways from this session were, unanimously, "Develop this feature now! We need this!!" The participants provided a wealth of feedback that will ultimately help smooth out a few rough edges in the experience. For the MVP release, nothing terribly negative emerged from this study.

We've captured a great amount of initial research data to use for improving the next release of Visual Search. In conclusion, it was great to validate that this visual search feature makes good business sense for the company. eBay has a HUGE inventory that the feature can pull from in order to serve up visually similar results.

This feels like a natural progression for searching with pictures when shopping. A picture is worth 1,000 words!!