
Measuring user experience can produce objective metrics

You can't measure user experience with a ruler, so measuring it objectively is generally a matter of aggregating subjective opinions.

What are some good objective methods for measuring user experience on an application?

That's a tough question. Let's explore what a good objective measure of user experience might mean and what it could look like.

User experience is subjective by nature, so finding an objective measure is tough. Surveys can be objective if the questions and results are presented well. Saying, "The majority of our customers rate this as an eight out of ten" is a lot different than, "I think it's an eight."

It is important to recognize that user experience is not the average of what people feel. Each user has an individual experience.

However, my clients often find survey data unacceptable or otherwise undesirable. They want a single number pulled out of a computer, something like click-through rates, time spent on the page or page load times. All of those can be aspects of user experience; however, looking at those metrics actually provides a picture of application performance, not of what the user experiences.

The next challenge is to come up with the right questions to ask. The questions should be precise because company performance will be measured against the questions as asked. Questions for measuring user experience should address a customer's likelihood to use the software again and to recommend it to others.

James Surowiecki, author of The Wisdom of Crowds, claims in his book that large groups tend to reach the right answer more often than individuals do. Imagine a company with tens of thousands of customers that questions any block of one thousand at a time, and the answer always comes out about the same -- somewhere between 8.3 and 8.4. The organization could reasonably say the objective measure of the application's user experience was about 8.35.
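As a rough sketch of that scenario (all numbers invented for illustration), you can simulate a large pool of customer ratings and check how consistent the block-of-one-thousand means come out:

```python
import random
import statistics

random.seed(42)

# Hypothetical population: tens of thousands of customer ratings (1-10),
# centered near 8.35 as in the scenario above.
population = [min(10.0, max(1.0, random.gauss(8.35, 1.2)))
              for _ in range(50_000)]

# Question disjoint blocks of 1,000 customers at a time and record each
# block's mean rating.
block_means = [
    statistics.mean(population[i:i + 1000])
    for i in range(0, len(population), 1000)
]

print(f"lowest block mean:  {min(block_means):.2f}")
print(f"highest block mean: {max(block_means):.2f}")
```

With samples this large, the block means cluster tightly around the population mean, which is the stability the argument relies on.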

Still, averages don't always tell the whole story. Imagine measuring user experience by analyzing response time for library software. The software has two broad categories of users: inner-city librarians and those in rural areas with slow connections. Use is split almost exactly evenly. If we only measure the average, the speedy performance of the urban libraries might offset and hide rural library performance issues.
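A toy calculation (with made-up response times) shows how an evenly split user base can produce a deceptively comfortable average:

```python
import statistics

# Hypothetical response times in seconds; usage is split evenly between
# fast urban connections and slow rural ones, as in the scenario above.
urban = [0.4, 0.5, 0.6, 0.5, 0.4]   # inner-city libraries
rural = [4.8, 5.2, 5.0, 4.9, 5.1]   # rural libraries on slow connections

combined = urban + rural
print(f"overall average: {statistics.mean(combined):.2f}s")
print(f"urban average:   {statistics.mean(urban):.2f}s")
print(f"rural average:   {statistics.mean(rural):.2f}s")
```

The overall average of 2.74 seconds describes no actual user: the urban group sees roughly half a second, the rural group about five.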

Even worse, the slow performance could vary by screen, with signup and login taking so long the users abandon the process, while everything else is blazing fast. Again, it is important to recognize that user experience is not the average of what people feel. Each user has an individual experience. When looking at large populations, it might be better to look at, say, the percentage of people (or pages) at 0-1.99 seconds to serve, the percentage at 2.0-3.99 and so on. A histogram is a graph that can help with visualizing these numbers.

A histogram of hypothetical transaction response times.
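The bucketing described above can be computed directly. This sketch (with hypothetical response times and an invented helper name) groups page loads into two-second buckets and reports the percentage in each:

```python
from collections import Counter

def bucket_percentages(times, width=2.0):
    """Group response times (seconds) into fixed-width buckets and
    report the percentage of requests falling in each bucket."""
    counts = Counter(int(t // width) for t in times)
    total = len(times)
    return {
        f"{b * width:.1f}-{(b + 1) * width - 0.01:.2f}s": 100 * n / total
        for b, n in sorted(counts.items())
    }

# Hypothetical mix of fast and slow page loads
times = [0.4, 0.9, 1.2, 1.8, 2.5, 3.1, 0.7, 5.6, 6.2, 1.1]
for label, pct in bucket_percentages(times).items():
    print(f"{label}: {pct:.0f}%")
```

Plotting those percentages per bucket is exactly the histogram the article suggests, and the slow outliers stay visible instead of vanishing into an average.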

It is tempting to look at a histogram, see that most of the users have good responses and throw out the outliers. Avoid this temptation. Those outliers are often the best source of new product ideas or potential extensions of the software.

We started off with an attempt to gather a single number for measuring user experience that stands up to scrutiny. It may be possible to look at the user base as a whole, to let the users tell you what matters to them. However, if those users care about different things or are trying to get different things done, then I suggest getting past the averages by looking at the data using a histogram.

When opinion surveys are not acceptable, then response times might be your best bet from a numbers perspective. Where numbers aren't attainable, sometimes Goldilocks's measurements ("too big," "too little" and "just right") are a better fit.

Next Steps

A great mobile app user experience starts with building trust

Why software developers need to keep customer experience in mind


Join the conversation


How do you measure user experience? Share your tips and tricks in the comments.
I think you need a mix of metrics to truly track user experience - things like visit duration and pages per visit are important, but also whether users are easily able to complete their tasks (and take the actions you want them to). Does a user click the right button, download the right thing, get to the right place? These can reflect that you've created a simple experience that allows people to do what they came to do. 

I think surveys can help, but only if you get a critical mass of respondents. 
No need for surveys or data modelling.

There is actually a new software solution to accurately measure the user experience of systems, applications, logon delays and websites. It's software from Logfiller.com
In the days of yore we'd simply ask, then tally the comments in a spreadsheet so we could parse the results. Now we're infinitely more technical. We ask, then tally the comments in a big online spreadsheet so a computer program can parse the results. It's not much better though our reach has grown much broader. In both cases, the results are very important to us though we're finding that too much data is obscuring the most vital bits.
We interview the users. My team is in the fortunate position of developing applications used by only a very small group of internal users. We don't develop software for public use, so that makes it so much easier.