Which system do you use for managing test cases and test results? Can this system also be used to manage requirements in agile development?
My team uses a wiki to document the specifications for each user story, including examples of desired and undesired system behavior. Our product owner gives us a “story checklist” spreadsheet that captures the business conditions of satisfaction and the impacts on other parts of the system and the business, such as reports, legal considerations or interactions with vendors and partners. The wiki page for each user story includes high-level test cases, manual test cases and links to detailed executable test cases, which are automated. We also put mockups of user interfaces and reports on these pages, along with photos of flow diagrams and examples drawn on whiteboards during brainstorming meetings and design discussions, and we may make notes about exploratory testing that is planned or completed.
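As a sketch of what such an executable test case might look like, a story's wiki page could link to a small automated check like the one below. The `discount_for` function and its 10%-off rule are invented for illustration; they are not from the team's actual system.

```python
# Hypothetical executable test case of the kind a story's wiki page might
# link to. The discount rule and function name are invented for illustration.

def discount_for(order_total):
    """Toy business rule: 10% discount on orders of $100 or more."""
    if order_total >= 100:
        return round(order_total * 0.10, 2)
    return 0.0

# Example of desired behavior: the discount kicks in at the threshold.
def test_discount_applies_at_threshold():
    assert discount_for(100) == 10.0

# Example of undesired behavior guarded against: no discount just below it.
def test_no_discount_below_threshold():
    assert discount_for(99.99) == 0.0
```

Because tests like these encode the story's examples of desired and undesired behavior, running them (for instance with pytest) turns the wiki's examples into the kind of executable documentation described above.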
We take high-resolution photos of our task board, plus any new whiteboard drawings, and post them on the wiki every day. These are helpful for our remote team member and later form a useful history of how development proceeded. We sometimes draw mind maps to work out complex processing or test cases and include photos of those in the documentation.
Many of our automated tests become part of the regression test suites. These run in our continuous integration builds many times per day, and our CI tool reports the results and emails alerts about failures. Regression test failures are addressed immediately. Since the tests must always pass, they are updated as needed when changes are made to the software, and they provide excellent documentation of how the system behaves. Most of our test results come from the automated builds, but anyone doing manual tests puts his or her initials on the wiki page next to each test case completed.
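The answer doesn't name the CI tool the team uses, but as one illustration, a minimal workflow in GitHub Actions syntax that runs a regression suite on every push might look like this; the workflow name, repository layout and pytest command are all assumptions:

```yaml
# Hypothetical CI workflow (GitHub Actions syntax), shown only as an
# example; the names and test command are not the team's actual setup.
name: regression
on: [push]
jobs:
  regression:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run regression suite
        # A failing suite marks the build red, which is what triggers
        # the CI tool's failure notifications.
        run: pytest tests/regression
```

Most CI tools notify the team on a failed build out of the box, which supports the "failures are addressed immediately" discipline described above.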
Our wiki and our automated regression tests are an incredibly valuable knowledge base, and we invest time to make sure the information we need gets into the wiki, and that we can always find it later when we need it. We organize the information by functional business area, and cross-reference it by iteration. For each sprint, we have a page summarizing all the stories worked on, with links to the pages that have the details. As we complete each story, we add screenshots and other documentation to the sprint summary page to help with the sprint review and so that months and years later, we can easily remember what was done in that sprint. We also document any high-severity bugs fixed in the iteration.
We like wikis because they encourage collaboration. Anyone can update the wiki page, adding comments or a screenshot or a photo. They are easy to keep up to date. However, they can also easily get out of control and disorganized. It can be difficult to find the information you’re looking for. We’ve spent a lot of time creating a good table of contents and keeping the information well-organized. I recommend having a technical writer on the team, at least part time, to help organize and manage this valuable corporate knowledge.
Related Q&A from Lisa Crispin
Agile leader Lisa Crispin explains a more organic, more Agile approach to test reporting.
When it comes to Agile planning, average time over many iterations is a more important metric than individual story estimates.
Most inexperienced Scrum teams overcommit on what they will deliver, and when. Agile leader Lisa Crispin says that does more harm than good.