
How QA testers participate in the requirements definition process

Learn how QA testers should participate in the requirements definition process in this expert response.

How should QA testers participate in the requirements definition process?

Reviewing -- rather than defining -- is the appropriate role for quality assurance (QA) testers in the requirements definition process, but QA testers need to learn to perform reviews productively. The key to winning support for their involvement in requirements reviews is to find problems that matter to the other participants, which means identifying issues with the requirements' content rather than focusing too heavily on form and testability. In my book and related seminars, I describe many ways to identify problems with requirements -- including clarity and testability -- as well as powerful methods for detecting wrong and overlooked content.

Some organizations make business analysts responsible both for defining requirements and for developing tests to demonstrate that the requirements have been met. Such a dual role often results in neglected tests, because analysts typically give them inadequate attention and lack testing knowledge. Moreover, analysts' tests are unlikely to reveal problems with their own requirements. These weaknesses are exacerbated when developers are the ones defining both the requirements and the tests.

Therefore, there is a rationale for having independent, skilled testers define and execute the requirements' tests.

Having testers participate alongside skilled analysts in requirements discovery activities, such as interviewing stakeholders, is a different matter, though. Not only does it double discovery costs and divert scarce QA testers from already limited testing time, but it's also unlikely to add appreciable value. There's no reason to assume that those focused on QA testing have business analysis skills that dedicated business analysts lack.

When QA testers are involved early in the requirements and design process, they are able to prepare more tests and are ready to begin executing them when the code arrives. Early participation helps ensure requirements are testable and helps the QA testers better understand what to test, enabling them to write better tests.

I've cautioned that merely gaining access to requirements reviews often backfires. Too frequently, QA testers are not prepared to contribute productively on the first attempt. Other participants find them to be either no help or, worse, impediments, and exclude them from requirements reviews thereafter. Such bad experiences rarely allow a second chance. A main cause is what I call the "testability trap," which ironically results from QA testing pundits saying the main issue with requirements is lack of testability, which in turn is largely due to lack of clarity.

Thus, when they are involved in the requirements review process, QA testers tend to dwell on citing requirements they consider untestable and expect analysts to rewrite them so they are testable. Analysts seldom have the time or inclination to redo their work, and other review participants often consider such yammering about testability to be trivial nitpicking. It's ignored, and its perpetrators are perceived as not only failing to add value but also interfering with the usefulness of the review.

