Measuring user experience can produce objective metrics

You can't measure user experience with a ruler, so measuring user experience objectively is generally a matter of aggregating subjective opinions.

What are some good objective methods for measuring user experience on an application?

That's a tough question. Let's explore what a good objective measure of user experience might mean and what it could look like.

User experience is subjective by nature, so finding an objective measure is tough. Surveys can yield objective data if the questions are well designed and the results are reported carefully. Saying, "The majority of our customers rate this as an eight out of ten" is a lot different than, "I think it's an eight."
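As a toy illustration of the difference, survey responses can be aggregated into exactly that kind of defensible statement. The ratings below are invented for the sketch:

```python
# Hypothetical survey data: each rating is one customer's 1-10 score.
ratings = [8, 9, 7, 8, 10, 8, 6, 9, 8, 7]

mean = sum(ratings) / len(ratings)

# Share of customers rating 8 or higher -- the kind of claim that can be
# reported objectively: "70% of customers rated us 8 out of 10 or better."
share_high = sum(1 for r in ratings if r >= 8) / len(ratings)

print(f"mean rating: {mean:.1f}")   # 8.0
print(f"rated 8+:    {share_high:.0%}")  # 70%
```

The statement is objective because anyone with the same responses would compute the same numbers; the individual opinions feeding it remain subjective.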

It is important to recognize that user experience is not the average of what people feel. Each user has an individual experience.

However, my clients often find survey data unacceptable or otherwise undesirable. They want a single number pulled out of a computer -- something like click-through rate, time spent on the page or page load time. All of those can be aspects of user experience, but on their own those metrics paint a picture of application performance, not of the experience users actually have.

The next challenge is to come up with the right questions to ask. The questions should be precise because company performance will be measured against the questions as asked. Questions for measuring user experience should address a customer's likelihood to use the software again and to recommend it to others.

James Surowiecki, author of The Wisdom of Crowds, claims in his book that large groups tend to come to the right answer more often than individuals do. Imagine a company with tens of thousands of customers that could survey any block of one thousand of them at a time, and the answer always came out about the same -- somewhere between 8.3 and 8.4. The organization could reasonably say the objective measure of the application's user experience was about 8.35.
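That thought experiment -- polling separate blocks of one thousand customers and getting roughly the same answer each time -- can be simulated. This is a sketch over invented data (the 8.35 center and 1.2 spread are assumptions chosen to mirror the scenario), so the point is the consistency across blocks, not the specific numbers:

```python
import random

random.seed(42)

# Hypothetical population: 50,000 customer ratings clustered near 8.35,
# clipped to the 1-10 survey scale. Invented data, not real measurements.
population = [min(10, max(1, random.gauss(8.35, 1.2))) for _ in range(50_000)]

# Survey five non-overlapping blocks of 1,000 customers each and
# compare the block means.
for block in range(5):
    sample = population[block * 1000:(block + 1) * 1000]
    print(f"block {block + 1}: mean rating {sum(sample) / len(sample):.2f}")
```

If the block means cluster tightly, the average is a defensible single number; if they scatter, the population is telling you it is not uniform and a single number will mislead.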

Still, averages don't always tell the whole story. Imagine measuring user experience by analyzing response time for library software. The software has two broad categories of users: inner-city librarians and those in rural areas with slow connections. Use is split almost exactly evenly. If we only measure the average, the speedy performance of the urban libraries might offset and hide rural library performance issues.
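A minimal sketch of that failure mode, with invented load times for the two groups of libraries:

```python
# Hypothetical page-load times in seconds: urban libraries on fast
# connections, rural libraries on slow ones, split 50/50.
urban = [0.4, 0.5, 0.6, 0.5, 0.4]
rural = [6.0, 7.5, 5.5, 8.0, 6.5]

overall = urban + rural
average = sum(overall) / len(overall)

print(f"overall average: {average:.2f}s")                    # looks tolerable...
print(f"urban average:   {sum(urban) / len(urban):.2f}s")
print(f"rural average:   {sum(rural) / len(rural):.2f}s")    # ...but half the users wait this long
```

The overall average of about 3.6 seconds suggests a mediocre-but-usable application, while in reality no user ever sees a 3.6-second load: half see half a second, and half see nearly seven.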

Even worse, the slow performance could vary by screen, with signup and login taking so long the users abandon the process, while everything else is blazing fast. Again, it is important to recognize that user experience is not the average of what people feel. Each user has an individual experience. When looking at large populations, it might be better to look at, say, the percentage of people (or pages) at 0-1.99 seconds to serve, the percentage at 2.0-3.99 and so on. A histogram is a graph that can help with visualizing these numbers.
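Computing those buckets is straightforward. This sketch (again with invented response times) works out the percentage of requests in each two-second band -- exactly the data a histogram would plot:

```python
from collections import Counter

# Hypothetical response times in seconds.
times = [0.8, 1.2, 1.9, 2.4, 0.5, 3.1, 7.8, 1.1, 2.2, 9.4]

# Bucket each observation into a 2-second band: 0-1.99, 2.00-3.99, ...
buckets = Counter(int(t // 2) for t in times)

for b in sorted(buckets):
    low, high = 2 * b, 2 * b + 1.99
    pct = buckets[b] / len(times)
    print(f"{low:.2f}-{high:.2f}s: {pct:.0%}")
```

A distribution like this one -- most requests fast, but a visible tail of users waiting eight or nine seconds -- is precisely what a single average would hide.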

A histogram of hypothetical transaction response times.

It is tempting to look at a histogram, see that most of the users have good responses and throw out the outliers. Avoid this temptation. Those outliers are often the best source of new product ideas or potential extensions of the software.

We started off with an attempt to gather a single number for measuring user experience that stands up to scrutiny. It may be possible to look at the user base as a whole, to let the users tell you what matters to them. However, if those users care about different things or are trying to get different things done, then I suggest getting past the averages by looking at the data using a histogram.

When opinion surveys are not acceptable, response times might be your best bet from a numbers perspective. Where numbers aren't attainable, sometimes Goldilocks measurements ("too big," "too little" and "just right") are a better fit.


This was last published in July 2014

Matt Heusser asks:

How do you measure user experience? Share your tips and tricks in the comments.
