Our last post talked about our qualitative analysis of the user tests on Yelp’s website. We found 7 key insights, such as learning that the Events tab wasn’t very helpful and that the filters could use improvement.
In this post, we’ll take a closer look at our quantitative analysis of the Yelp site: the insights from a closed card sort and a first click test.
Just like our UserTesting studies, the usability tests we ran were all remote. Unlike our previous tests, we recruited at least 35 people (instead of 5), since quantitative tests need a larger sample before the results are statistically meaningful.
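As a rough illustration of why a quantitative study needs more participants, here’s a minimal sketch (using the standard normal approximation, not our actual analysis) of the 95% margin of error around a sample proportion:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5): with 5 participants the estimate is almost meaningless,
# while 35 participants narrow it to roughly +/- 17 percentage points.
print(f"n=5:  +/-{margin_of_error(0.5, 5):.0%}")
print(f"n=35: +/-{margin_of_error(0.5, 35):.0%}")
```

With only five participants, percentages like the ones reported below could swing wildly from study to study; at 35 and up, the headline numbers are stable enough to act on.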
Closed card sort
Since we discovered previously that Yelp’s filters needed improvement, we gave users 47 cards (each representing a feature search filter) and asked them to sort the cards into four categories: Very important, Somewhat important, Not important at all, and I don’t know what this means. This would help us understand which filters should be prioritized in the redesign. Closed card sorting is straightforward for participants since the categories are already decided, which helps reduce the abandonment rate, and it gives us a clean view of raw decision-making: every card ends up in exactly one category.
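To make the mechanics concrete, here’s a minimal sketch (with made-up responses, not our data) of how closed-card-sort results roll up into the per-category percentages reported below:

```python
from collections import Counter, defaultdict

CATEGORIES = ["Very important", "Somewhat important",
              "Not important at all", "I don't know what this means"]

# One dict per participant, mapping each card to the category they chose.
responses = [
    {"Open Now": "Very important", "Coat Check": "Not important at all"},
    {"Open Now": "Very important", "Coat Check": "Somewhat important"},
    {"Open Now": "Somewhat important", "Coat Check": "Not important at all"},
]

def tally(responses):
    """Percentage of participants who placed each card in each category."""
    counts = defaultdict(Counter)
    for sort in responses:
        for card, category in sort.items():
            counts[card][category] += 1
    return {card: {cat: c[cat] / len(responses) for cat in CATEGORIES}
            for card, c in counts.items()}

results = tally(responses)
print(f"{results['Open Now']['Very important']:.0%}")  # 67%
```

The real study works the same way, just with 47 cards and 35+ participants per cell.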
The most popular feature search filters are widely applicable (such as “Best Match”) while the least important filters are based more on individual preferences (such as “Coat Check”).
1. The most important feature filters were “Open Now”, “Accepts Credit Cards”, and “Serves Dinner”.
90% of participants considered “Open Now” to be very important, and in second place, 85% placed “Accepts Credit Cards” into “Very important”. When we look at Yelp’s search filters, we see five categories people can filter by (Sort By, Areas, Distance, Price, Features). Luckily, the Features filter already surfaces the options users ranked highest.
However, Yelp could make it easier to filter by meal type. Right now, Yelp lets you filter by breakfast, brunch, lunch, dinner, and dessert, but while 82% of users said “Serves Dinner” was very important and 55% said “Serves Lunch” was very important, the interface doesn’t prioritize that information. It actually takes a couple of clicks to reach the meal filters.
As you can see below, there’s some visual clutter, so we could simplify this layout.
2. More than 88% of users thought that 7 feature filters were “Not important at all”.
We also needed to consider the other side of the question by checking which feature filters could be deprioritized or perhaps removed entirely, so we paid close attention to which cards were placed in the “Not important at all” category. We found that 92% of people thought “Dogs allowed”, “Has karaoke”, and “Has a DJ” weren’t important at all.
You can see the full results below:
This leads us to three considerations for the redesign:
- Make the three most popular feature filters more accessible
- Delete the least popular search filters for a cleaner interface
- Re-prioritize the IA so there is some hierarchy to search filters
First click test
Continuing our tests on the Yelp site, we also ran a first click test, which helps determine whether labels and designs make it easy for users to complete a given task.
1. People found the site somewhat confusing
Because the click test was quick, we had time to ask a few follow-up questions:
- What do you like about the Yelp website?
- What don’t you like about the Yelp website?
Most people liked how easy the search function was and found the reviews helpful. But 30% of people said the site felt difficult to navigate.
Here’s a snippet of what they said:
“The site feels busy and a bit dated”
“The site is a bit cluttered…a lot of it is very useful but it can feel overwhelming the first time you use it.”
You’ll notice that the feedback included softening language like “a bit” and “the first time you use it”, which indicates that even though people struggled initially, they didn’t find the site to be a huge roadblock.
2. Heatmaps showed a wide range of clicking areas
Interestingly enough, on every task the most popular click target drew only around half of the participants. We’ll describe one task as an example of how the clicks were distributed.
We asked users to find a reasonably priced restaurant for a group of 20. Here are the details of their first clicks:
- 55% clicked “Restaurants” in the upper right-hand corner
- 45% clicked in many other areas, including 16% who clicked “Food”
3. The Search bar was still the preferred backup option
Just like we had seen in the UserTesting studies, our click test also showed that people relied on the search bar when things weren’t clear. When we asked users to find a mechanic in their area, most people used the vertical menu of service categories on the left, and search came in second. Here are the full results:
- 53% clicked on the menu item ‘Automotive’
- 24% clicked on the search bar for this task, which suggests that the site structure wasn’t clear enough.
- 18% clicked on ‘Local Services’
This tells us that the IA can still be improved, and that we may not want to tweak the search bar (since it seems to be such a popular backup option).
4. Users are still confused by the “Events” tabs
We wanted to confirm our UserTesting findings, so we asked users to find more information about a nearby upcoming event. This time, we showed the logged-in account view (which includes an extra Events tab) to see whether this might affect task completion.
Here are the results:
- 37% ignored both Events tabs
- 37% clicked on the Events tab in the submenu
- 18% clicked on Events on the primary top navigation
This leads us to two possible design tweaks: we could make the Events tab on the main navigation clearer (so users find it more quickly), or we could clearly call out where each Events tab takes people. Currently, there’s no indication that the top Events tab takes you to a “main” events page, while the bottom tab takes you to the events you already plan to attend. It could be as simple as relabeling the bottom tab from “Events” to “Your Events”.
Quantitative & Qualitative Dimensions
As you’ve seen above, our quantitative testing complemented the insights uncovered during the qualitative tests. When you combine quantitative and qualitative analysis, you get a clearer idea of why and how to fix problems, as well as how widespread each usability issue really is.
To learn about our whole process from kickoff to redesign, click below to sign up for an early copy of the upcoming e-book.