User experience is subjective by nature.
We can measure its effects on product engagement (e.g. pageviews, time on site, etc.) and conversions (signups and sales). But how do we directly measure and benchmark UX?
According to a case study by Google researchers Kerry Rodden, Hilary Hutchinson, and Xin Fu, user-centered metrics are hard to come by: the parameters are either too generic to be useful or too specific to apply across the board.
In their paper, they propose a framework for genuinely useful UX analytics. Let’s explore the UX metrics every team should be tracking.
Low-Level “PULSE” Metrics
PULSE metrics belong to the old paradigm of analytics, which is geared toward measuring progress against business goals. While still useful, these metrics are lagging indicators of UX decisions.
Common metrics include:
- Pageviews – The number of pages viewed by a single user.
- Uptime – The percentage of time the website or application is accessible to users.
- Latency – The amount of time it takes data to travel from one location to another.
- Seven-day active users – The number of unique users who interact with the site or app within the last seven days.
- Earnings – Revenue generated by the site or product.
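As a concrete example of how one of these PULSE metrics is computed, here is a minimal sketch of seven-day active users, assuming a simplified event log of (user_id, timestamp) pairs; the function name and data shape are illustrative, not any particular analytics API:

```python
from datetime import datetime, timedelta

def seven_day_active_users(events, as_of):
    """Count unique users with at least one event in the 7 days up to `as_of`.

    `events` is a list of (user_id, timestamp) pairs -- a simplified
    stand-in for whatever event log your analytics pipeline produces.
    """
    window_start = as_of - timedelta(days=7)
    return len({user for user, ts in events if window_start < ts <= as_of})

# Illustrative data: three users, but only two were active in the last week.
now = datetime(2017, 6, 15)
events = [
    ("alice", datetime(2017, 6, 14)),
    ("bob", datetime(2017, 6, 10)),
    ("carol", datetime(2017, 5, 1)),  # outside the 7-day window
]
print(seven_day_active_users(events, now))  # → 2
```

Real pipelines would deduplicate at query time against an events table, but the counting logic is the same.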
While businesses should still track these metrics, they should remember that they lack context for measuring UX. For example, an average time on site of 5 minutes might mean users are highly engaged – or just aren’t finding the content they need.
For meaningful context, try examining HEART metrics.
High-Level “HEART” Metrics
HEART metrics provide a more comprehensive, user-centered framework. The five metrics are as follows:
- Happiness – The overall attitudinal satisfaction or delight a user feels in relation to your product. UXPin uses net promoter score (NPS) as a Happiness metric.
- Engagement – The degree to which a user responds to, or is involved with, the product. Examples include visits per week, actions per visit, etc. UXPin uses Kiss Metrics to track custom events triggered by different features (e.g. “Created prototype in first day of trial”).
- Adoption – The number of new users (or uses of a feature) within a set amount of time. After UXPin released its custom libraries feature, the product team measured usage over a six-month period.
- Retention – The number of users who continue using the product within a certain time frame. The product team and customer success team at UXPin both track churn on a monthly, quarterly, and annual basis in a shared Google doc.
- Task Success – How well a task is completed, how quickly, and the average number of errors committed while completing it. “When testing lo-fi prototypes, I find it more helpful to measure task success qualitatively by tracking patterns in user feedback,” says UXPin user researcher Ben Kim. “Once you’re testing a hi-fi prototype, however, you can accurately track task success with a usability checklist.”
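Two of the numbers mentioned above reduce to simple formulas: NPS (the Happiness metric UXPin uses) is the percentage of promoters (ratings 9–10) minus the percentage of detractors (ratings 0–6), and churn (the inverse of Retention) is the fraction of customers lost in a period. A minimal sketch with illustrative inputs:

```python
def net_promoter_score(ratings):
    """NPS from 0-10 survey ratings: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

def monthly_churn_rate(customers_at_start, customers_lost):
    """Fraction of customers lost during the month."""
    return customers_lost / customers_at_start

# Illustrative survey: 3 promoters, 2 passives, 2 detractors out of 7 responses.
print(net_promoter_score([10, 9, 9, 8, 7, 6, 3]))  # → 14
print(monthly_churn_rate(200, 10))                 # → 0.05
```

NPS can range from −100 (all detractors) to +100 (all promoters); passives (7–8) count in the denominator but neither bucket.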
Which Metrics are Important to You?
Obviously, it’s impossible to narrow down the analytics you track to five simple metrics.
For a comprehensive understanding, follow a three-step process:
First, articulate your goals. Next, identify the signals that point toward success. Finally, build metrics to track those signals. Let’s take a deeper look at each of these steps in turn.
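The goals → signals → metrics chain is easiest to keep honest as a simple worksheet. A sketch as plain data, with hypothetical entries (these are illustrative, not UXPin’s actual worksheet):

```python
# A goals -> signals -> metrics worksheet as plain data.
# All entries are hypothetical, for illustration only.
gsm = [
    {
        "goal": "Designers adopt the new design-tool integration",
        "signal": "A user installs the plugin and imports a file",
        "metric": "Weekly plugin downloads and imported files per active user",
    },
    {
        "goal": "Users complete prototypes quickly",
        "signal": "Prototype created during the first day of a trial",
        "metric": "Percentage of trials with a prototype created on day one",
    },
]

for row in gsm:
    print(f"{row['goal']} -> {row['signal']} -> {row['metric']}")
```

Keeping every metric traceable back to a goal this way makes it obvious when you are tracking a number nobody has a use for.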
Step 1: Articulate Your Goals
Set up a meeting and let every team member chime in on their vision of success in the sprint. Different people may have different ideas. This is okay, preferable even. With diverse viewpoints represented, you can better piece together what your product should accomplish.
Just make sure you break down goals into project goals and feature goals.
Finally, always back up your articulation with solid user research. Base your goals on data from surveys, interviews, user testing, case studies, etc.
Step 2: Identify Your Signals
Now that you’ve determined the goals for your product, think about what user actions will result in progress toward these goals.
These actions are your signals. Look for signals in both attitudes and behaviors: which trackable behaviors result in feelings of satisfaction? Which lead to frustration?
List a number of possible signals and be as specific as possible when connecting them to the goals you’ve articulated. Don’t ignore signals of possible missteps either. Identifying pain points can help you build a better product, sometimes much faster than concentrating only on the positives.
Before launching their Photoshop and Sketch integration, UXPin’s product team decided to measure the following signals:
- The number of plugin downloads from the Chrome store
- The number of imported files
Step 3: Build Your Metrics
The final step is drawing trackable metrics from your goals and signals.
Hopefully, you’ve already identified a few by the time you’ve finished your signals phase. If not, try briefly describing your site or app. Write down your description, no longer than a short paragraph.
Describe how the user should ideally interact with your product. After you’ve written your description, reread it and underline all of the verbs in your sentences. Each underlined verb is a candidate for measurement.
A week into the launch of their Photoshop and Sketch integration, UXPin’s product team checked whether:
- Actions around creating hi-fi static designs in UXPin decreased
- Actions around hi-fi interactive design in UXPin increased
- The number of people using the new integration plugins increased
If all three criteria were met, the team could reasonably conclude that users preferred creating static designs in a separate platform and then importing them into UXPin for prototyping.
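With the signal counts in hand, those three checks reduce to simple before/after comparisons. A sketch with hypothetical weekly numbers (the dictionary keys and counts are illustrative, not UXPin’s actual data):

```python
# Hypothetical weekly action counts before and after the integration launch.
before = {"static_design_actions": 1200, "interactive_design_actions": 800, "plugin_users": 0}
after = {"static_design_actions": 950, "interactive_design_actions": 1100, "plugin_users": 340}

criteria_met = (
    after["static_design_actions"] < before["static_design_actions"]        # static work moved elsewhere
    and after["interactive_design_actions"] > before["interactive_design_actions"]  # prototyping grew
    and after["plugin_users"] > before["plugin_users"]                      # integration adopted
)
print(criteria_met)  # → True with these illustrative numbers
```

Requiring all three to hold guards against misreading a single metric: a drop in static-design actions alone could just as easily signal users leaving.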
There is no one-size-fits-all set of UX analytics.
Every product is built for a different purpose and with different people in mind.
The best way to go about pleasing your users is to pay attention to how they interact, measure the key events specific to your product, and critically determine the areas where you can improve.
As always, remember to balance the quantitative metrics against qualitative user feedback.
For more UX advice, check out the free Guide to Agile UX Design Sprints. The 97-page playbook is based on 50 design sprints conducted by author and designer Alex Gamble.