
Agile test reporting doesn't have to be a headache

Agile leader Lisa Crispin explains a more organic, more Agile approach to test reporting.

What's your view on test reporting in Agile projects? I know Janet Gregory has a "lightweight" test plan, but I haven't found a lightweight test report. Do you think a traditional test report should be written at the end of the project? How else could you report the test results?

Lisa Crispin

In Agile projects, we continually collaborate with customers. We discuss user stories with them before and during development, and show them a demo once a story or iteration is finished. Our story board -- physical or online -- shows the progress of the stories along with the testing activities.

When I worked on Waterfall projects, I'd provide an "executive summary" of information about each feature to help the business managers decide whether to release the software. In Agile, we simply discuss the features with the business and decide together whether to release. Even in a big enterprise, there are product managers who work with the development teams. They should have face-to-face communication, whether in person or by teleconference.


I haven't seen any need to write a test report at the end of a project. If I have concerns about quality or process issues, I bring them up at the retrospectives. The whole team -- including the product owner, product manager or similar role representing the business -- discusses the problems. As a team, we come up with small experiments to try to address them.

The most important information comes from the story board, kanban board or whatever big visible chart (again, online or physical) the team is using to keep development transparent to the business. This includes testing, which is not a separate activity but an integral part of developing the software.

Teams that succeed over the long term deliver the business value their customers want frequently, but at a sustainable pace. They start each feature by getting examples of desired and undesired behavior from customers, and turn those examples into tests that guide development at the unit, functional and nonfunctional levels.

After the tests pass and the code for that feature is done, those tests become not only regression checks, but living documentation about how the system works. Rather than put effort into narrative reports on testing, work to make automated tests understandable to everyone who needs to know how the system works and what is covered in the tests.
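A minimal sketch of what "examples as living documentation" can look like. The domain (a hypothetical free-shipping rule) and every name here are illustrative, not from the column; the point is that each test states one concrete customer example, so a reader can learn the rule from the test names and assertions alone.

```python
# Hypothetical business rule, captured from a customer example:
# "Orders of $50 or more ship free; smaller orders pay a flat $5."

def shipping_fee(order_total: float) -> float:
    """Return the shipping fee for an order total, per the rule above."""
    return 0.0 if order_total >= 50 else 5.0

# Each test is one example the customer gave. After the feature is done,
# these double as regression checks and as readable documentation.
def test_order_at_threshold_ships_free():
    assert shipping_fee(50.00) == 0.0

def test_small_order_pays_flat_fee():
    assert shipping_fee(49.99) == 5.0

if __name__ == "__main__":
    test_order_at_threshold_ships_free()
    test_small_order_pays_flat_fee()
    print("all customer examples pass")
```

Tests written this way stay understandable to the whole team, which is what makes them a substitute for a narrative test report.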
