When I think about the publicly available information on usability testing, two things come to mind: mystery and expense. A quick Google search bears this out; I tend to find brief overviews with vague mentions of a "lab" environment and a lot of consultants selling training courses.
Fortunately, it does not have to be that hard, nor that expensive. If you have a budget, you could set up a series of rooms with one-way mirrors, equipped with video recording tools that document both the user's expressions and the screen at the same time. Then you find a way to bring in, schedule and compensate users for a few weeks while they try to use the software.
With that said, not everyone has the time to set up a lab and conduct weeks of interviews and analysis, complemented by a budget measured in hundreds of thousands of dollars. Instead, we might want to start with something small.
Two steps to consider regarding usability testing methods
Before anything else, usability testing means having a participant use the software under something close to real-use conditions. The setup is designed to break the groupthink phenomenon: when we build and test a user interface, we come to understand it completely. In the book Made to Stick, authors Chip and Dan Heath call this the "curse of knowledge" -- we can't imagine a world in which the other person does not understand the process.
For the first step, you will need candidates. As you find participants, decide what their process will be. For example, an accounting package might require registration, entering company information, setting up bank accounts and receiving payments. In this scenario, a realistic user is ideal -- someone with enough bookkeeping or accounting knowledge to use the software without a user guide. I'd suggest typing up these use cases, in some detail, and having them reviewed by a friend. Then, when it comes time to test, you can hand the use cases to the participant and have confidence that any confusion is about using the software, not about the task.
For a website, the tasks might be phrased as questions such as, "Can you find a contact phone number and email address?" or "Can you find common questions about our services?" You might draw these questions from what existing customers actually ask.
The second step is to brief the participant. Inform them that your job is to remove obstacles to adoption, and that you want to find out where they get confused. Give the user the set of questions and watch what they do. A confused expression; a mouse that floats around without clicking for an extended period of time; or, especially, a spoken comment or request for help -- these are the things to notice.
In Oleksandr Berchenko's article on software usability, he categorizes usability testing into a pyramid with utility at the bottom. Utility asks if the software is capable of doing what we ask it to do; testers can test for this without outside help. Berchenko splits usability into adoptability (is it an easy switch from what people know before? is it familiar?), learnability (can we pick it up quickly?) and efficiency (can I do powerful things quickly?). Berchenko puts identity at the top of the pyramid, where a user associates their sense of self with the application -- creating users who are unlikely to switch and even evangelists for the product.
You likely won't get that from a list of tasks and a user you recruited in the hallway, but you could get started tomorrow, and that's probably enough for now.
Usability testing is about finding problems in an existing user interface, and finding problems means rework. Walking users through prototypes or designing for user experience can surface those issues earlier. Ultimately, you probably want both. Usability testing is relatively easy to add at the end this time, on this project that's 80% done already. It will be easier and cheaper if you design the experience first -- but that is a different article.