How do I design test cases from use cases?
Before you can design the tests, you'll need to know what the use case is telling you. A good place to start is the course on specification-based testing at TestingEducation.org. The course covers techniques that can be useful in reviewing use cases (active reading, context-free questions, etc.), and it covers the basics of traceability if the topic is new to you.
Assuming you've picked apart the use case in intricate detail, next you'll want to look at some different test techniques. For this part we'll steal from the Satisfice Heuristic Test Strategy Model (PDF), which lists nine general test techniques. Use cases make claims about the application (or system), and quite often they point out its major functions. The actors identified are often users (or other systems, which are also users). As you look at the primary and alternative flows, think about the scenarios they create.
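To make that flow analysis concrete, here's a minimal sketch of treating each flow -- primary and alternate -- as a candidate test scenario. The ATM "Withdraw Cash" use case, its steps, and the class names below are invented for illustration; they aren't part of any standard model:

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """Minimal representation of a use case: a primary flow plus alternates."""
    name: str
    primary_flow: list
    alternate_flows: dict = field(default_factory=dict)

    def test_scenarios(self):
        """Yield one candidate test scenario per flow: the happy path first,
        then each alternate flow."""
        scenarios = [(f"{self.name}: primary flow", self.primary_flow)]
        for label, steps in self.alternate_flows.items():
            scenarios.append((f"{self.name}: {label}", steps))
        return scenarios

# Hypothetical "Withdraw Cash" use case, for illustration only.
withdraw = UseCase(
    name="Withdraw Cash",
    primary_flow=["insert card", "enter PIN", "select amount", "dispense cash"],
    alternate_flows={
        "invalid PIN": ["insert card", "enter wrong PIN", "show error", "retry"],
        "insufficient funds": ["insert card", "enter PIN", "select amount", "decline"],
    },
)

for name, steps in withdraw.test_scenarios():
    print(name, "->", len(steps), "steps")
```

Enumerating flows this way gives you a first-cut scenario list to enrich with the techniques below; it's a starting checklist, not a complete test design.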
Of the nine general test techniques listed, the following are the four that I think are most commonly associated with testing use case flows and requirements:
- Claims Testing: Verify every claim
  - Identify reference materials that include claims about the product (implicit or explicit).
  - Analyze individual claims, and clarify vague claims.
  - Verify that each claim about the product is true.
  - If you're testing from an explicit specification, expect it and the product to be brought into alignment.
- Function Testing: Test what it can do
  - Identify things that the product can do (functions and subfunctions).
  - Determine how you'd know if a function was capable of working.
  - Test each function, one at a time.
  - See that each function does what it's supposed to do and not what it isn't supposed to do.
- User Testing: Involve the users
  - Identify categories and roles of users.
  - Determine what each category of user will do (use cases), how they will do it, and what they value.
  - Get real user data, or bring real users in to test.
  - Otherwise, systematically simulate a user (be careful -- it's easy to think you're like a user even when you're not).
  - Powerful user testing is that which involves a variety of users and user roles, not just one.
- Scenario Testing: Test to a compelling story
  - Begin by thinking about everything going on around the product.
  - Design tests that involve meaningful and complex interactions with the product.
  - A good scenario test is a compelling story of how someone who matters might do something that matters with the product.
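As a rough illustration of how two of these techniques differ in practice, here's a sketch against a hypothetical shopping-cart API (the Cart class and its methods are invented for this example). A function test exercises one function at a time, including what it must not do; a scenario test strings several functions together into a small story:

```python
# Hypothetical shopping-cart API, invented purely for illustration.
class Cart:
    def __init__(self):
        self.items = {}

    def add(self, sku, qty=1):
        if qty <= 0:
            raise ValueError("quantity must be positive")
        self.items[sku] = self.items.get(sku, 0) + qty

    def remove(self, sku):
        self.items.pop(sku, None)

    def total_items(self):
        return sum(self.items.values())

# Function test: one function at a time -- does add() do what it should,
# and refuse what it shouldn't?
cart = Cart()
cart.add("book")
assert cart.total_items() == 1
try:
    cart.add("book", qty=0)   # invalid input must be rejected
    raise AssertionError("zero quantity was accepted")
except ValueError:
    pass

# Scenario test: a compelling story -- a shopper changes their mind mid-order.
cart = Cart()
cart.add("book", 2)
cart.add("pen")
cart.remove("book")           # shopper drops the books before checkout
assert cart.total_items() == 1
```

The function test would catch a broken validation check; the scenario test is the one likely to catch an interaction bug (say, remove() corrupting the total) that per-function tests miss.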
If you're using all four of these techniques, you're going to generate a lot of test ideas and test cases. That's great. The next problem you'll have is to prioritize them. Most likely you'll know what test cases are most appropriate for your context, but if you need help, review a list of your test case ideas with other team members (testers, developers, analysts, and stakeholders) to prioritize where you spend your time.
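One common way to seed that prioritization discussion is a simple risk score, such as impact multiplied by likelihood. The test ideas, scales, and scores below are illustrative only -- your team would supply its own, and the ranking is an input to the conversation, not a substitute for it:

```python
# Hypothetical test ideas scored on 1-5 scales for impact and likelihood;
# both the ideas and the numbers are invented for illustration.
test_ideas = [
    {"idea": "invalid PIN lockout",  "impact": 5, "likelihood": 3},
    {"idea": "receipt printing",     "impact": 2, "likelihood": 3},
    {"idea": "insufficient funds",   "impact": 4, "likelihood": 5},
]

# Priority as a simple risk product: higher means test it sooner.
for t in test_ideas:
    t["priority"] = t["impact"] * t["likelihood"]

ranked = sorted(test_ideas, key=lambda t: t["priority"], reverse=True)
for t in ranked:
    print(t["priority"], t["idea"])
```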