
Creating clear software requirements specifications

Testers often complain that software requirements specifications are too vague, but overly verbose requirements have the opposite problem: they are so wordy that they can't be followed.

Software testers often complain that software requirements specifications are too vague to be tested. How do you determine whether a requirement is fully developed?

One of the biggest requirements issues is "lack of testability," which is largely due to unclear or ambiguous requirements. When a software requirement specification is unclear, software to meet it is likely to be designed and developed incorrectly; and tests to confirm that the code meets the unclear requirements also are likely to be wrong.

That said, I think testers protest too much about software requirements specifications being too ambiguous. Testers who get worked up about ambiguity often alienate those they are working with. They may demand that the analyst rewrite the requirements before they will start testing the code against them. However, the analyst doesn't have the time or desire to redo his or her work. The project already is late, so testing can't wait any longer. Others perceive the tester as a trivial nitpicker who is wasting precious time with a broken-record refrain. The tester aggravates the situation by complaining that, "Nobody else cares about quality."

Not only is the tactic ill-advised, but the lack of testability premise is wrong. Eliminating ambiguity (as testers often demand) is not practical. For example, millions of words have been written trying to remove ambiguity from the Internal Revenue Code, only to make the tax law virtually unintelligible.

More importantly, clarity is a form issue, not a content issue. Software requirements specifications can be perfectly clear and perfectly wrong; clarity and testability are irrelevant for an overlooked requirement. Focusing exclusively on testability actually interferes with finding the more important wrong and overlooked requirements content issues.

Stop wasting time and good will yammering about testability. Instead, try this approach: Write positive and negative tests that demonstrate whether the code works the way you think it should to satisfy the requirements as you interpret them. If your interpretation is correct, your tests are all set. If your interpretation differs from the developer's, the concrete nature of the failed test makes it easier for everyone to understand the requirements issue and determine what the code or test should be.
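To make the approach concrete, here is a minimal sketch in Python. The requirement, the function name, and the length limits are all invented for illustration; the point is that each positive and negative test pins down one interpretation of the spec, so a disagreement shows up as a concrete failing test rather than an abstract argument about wording.

```python
import re

# Hypothetical requirement: "User IDs must be 3-20 characters, letters and
# digits only." The function below stands in for the code under test.
def is_valid_user_id(user_id: str) -> bool:
    """Return True if user_id satisfies the requirement as I interpret it."""
    return re.fullmatch(r"[A-Za-z0-9]{3,20}", user_id) is not None

# Positive tests: inputs my reading of the requirement says MUST be accepted.
assert is_valid_user_id("abc")
assert is_valid_user_id("Agent007")

# Negative tests: inputs my reading says MUST be rejected. Each one encodes
# an interpretation (e.g., is an underscore allowed?); if the developer read
# the spec differently, the failing test makes the ambiguity visible.
assert not is_valid_user_id("ab")          # too short
assert not is_valid_user_id("a" * 21)      # too long
assert not is_valid_user_id("bob_smith")   # underscore: my reading says no
```

If the developer believed underscores were legal, the last assertion fails against their code, and the team now has a specific, discussable requirements question instead of a vague complaint about ambiguity.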

Instead of rewriting requirements, which often won't happen, use the tests as a supplemental form of requirements. This very simple, straightforward approach existed long before the Agile folks who think they invented it. Regardless, it works. You can take advantage of it with any methodology and without buzzwords.



Join the conversation



Do you find software requirements specifications to be too vague? If so, how do you resolve this issue?
I think requirement specs take a lot of work to get right. More specifically, they take a lot of team work to get right. Starting with testable requirements is a great idea if your team can manage it. I really like Alan Parkinson's BDD method with Cucumber for a shining example of testable requirements. But you don't have to go that far. You do have to work on requirements with both the development team that will implement them and the business folks that are asking for them, preferably at the same time. How else do you get them to agree?
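[Editor's note: For readers unfamiliar with BDD, the comment above refers to expressing requirements as executable Given/When/Then scenarios; Cucumber uses the Gherkin syntax for this. A rough, dependency-free sketch of the same idea in plain Python follows, with the class, the reserve operation, and the stock numbers all invented for illustration.]

```python
# A minimal Given/When/Then sketch in plain Python (no Cucumber or behave
# dependency). The scenario structure mirrors what a Gherkin feature file
# would express; all names and quantities here are hypothetical.

class Inventory:
    def __init__(self, stock: int):
        self.stock = stock

    def reserve(self, qty: int) -> bool:
        """Reserve qty units if available; refuse otherwise."""
        if qty <= self.stock:
            self.stock -= qty
            return True
        return False

# Scenario: a reservation larger than stock is refused
# Given an inventory with 50 units in stock
inventory = Inventory(stock=50)
# When a customer tries to reserve 80 units
accepted = inventory.reserve(80)
# Then the reservation is refused and the stock is unchanged
assert accepted is False
assert inventory.stock == 50
```

The business folks can read the scenario comments, the developers implement against them, and the testers run them, which is one way to get all three parties agreeing on the same requirement.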
Remember, ‘specification’ is a synonym for design, which is a human-defined way (how) to address the business deliverable "whats" that provide value when satisfied. Business deliverable "whats", which I call REAL business requirements, exist within the business environment and must be discovered. Software requirements specifications do need to be clear, but regardless of how clear they are, they also must be accurate and appropriate, which is unlikely when the "whats" are not identified accurately as well as clearly.
In most of my deliverables for specifications, I add a line: "NO functionality will be provided by the software except what is described in this specification".

It wakes up a lot of stakeholders and participants in the process to think very carefully about what they expect, and to make the effort to see it documented, or they can expect they WON'T see it in the final product. It works wonders...
I’ll caution that this and other variants of CYA (cover your assets, after all this is mixed company…) seems like good advice until it backfires on you. Somebody sooner or later is going to take issue with you about some functionality that seems obvious and necessary to them that you have failed to address consciously or more commonly unconsciously. When you point out your caveat, they’re going to say something to the effect, “I don’t care, it’s needed, you should have known it; and I don’t like your weasel nitpicking attitude either.” They’re usually also in a position that makes you wish you hadn’t been called on this.
It's a complex and difficult problem, as we have more people using technology directly, and too many product managers assume that engineers can do all the translation needed to implement a "solution for problem XYZ". Now that so much software interaction is done with 24x7x365 systems (e.g., SaaS), I wouldn't recommend people spend time trying to improve processes for packaged software. Ship early, ship often, integrate CI/CD models for testing, do frequent A/B testing and measurement, and integrate feedback mechanisms for end users.
Adding a little fuel to this fire... if you are doing your requirements gathering and you don't have a software tester in the room who is adept at picking apart and provoking discussions, seriously, start doing that. Institute a "Three Amigos" approach to stories or requirements gathering. Have a PM, a programmer and a tester discuss the story, and get in as early as possible. Will you flesh out every conceivable option? Probably not, but I'll bet you get a lot clearer requirements from the process. Caveat: I happen to be a software tester, in case it wasn't obvious ;).
Thank you for your comments. You raise some important points. Michael, as one of the few folks with genuine chops in both requirements and QA/testing, I applaud your emphasis upon involving testers, as well as others, in reviewing requirements. Too often, though, testers especially confuse clarity with correctness and have a habit of nitpicking trivialities that lose value and support. Also, I’ll caution against having testers participate in the requirements discovery activities, which easily becomes a distracting “ganging up” that probably wouldn’t produce desired results, even if managers were willing to absorb its double costs. I discuss this further in a forthcoming Ask the Experts response, so I won’t elaborate here. Brian, I agree that “product managers often assume that engineers can do all the translation need[ed] to implement ‘solution for problem XYZ’” but fear many others share that assumption, especially the engineers themselves. I also find that the presumed solution often turns out not to be right, in turn because the presumed XYZ turns out not to be the real problem. I’ll also add fuel to the fire by asking how effectiveness is likely affected by the Agile practice of having developers go directly from user stories to code via “conversations,” thereby essentially eliminating explicit software requirements specifications. In addition, Brian, I’m missing your point regarding not spending time trying to improve processes for packaged software. Could you please elaborate and relate it to too-vague software requirements specifications?
Robin, in the earliest requirements gathering, I agree that the tester should not be so concerned with the "correctness" of the requirements as they should be there as an advocate for what it should do and how it might accomplish those goals. As a tester, I am there to provoke a conversation, not get in the way of the requirements gathering. Also, one of the benefits of having a tester early in the process is to help make sure that all sides are speaking about the same things with the same vernacular. I won't say that it is a universally good approach, but it's been effective where I work ;).
Michael, I’m not sure what you’re referring to when you say you “agree that the tester should not be so concerned with the ‘correctness’ of the requirements.”  I believe I said just the opposite, that concerns about clarity are largely irrelevant and often counter-productive unless the requirements first are correct.  To provide value, the tester needs to focus primarily on content and only secondarily on format; and I fear too few testers come prepared to help adequately with content. 
I do recognize that sometimes lack of clarity can interfere with ascertaining correctness.  I also recognize that such lack of clarity can come from using the same terms with different meanings (such as, I fear, the different things you and I mean by “requirements”) and different terms with the same meanings. I find that such terminology issues can be very visible but too often are far less important than the emphasis put on them.  I also suspect that most of the vocabulary issues and most of the typical tester’s requirements review pertain to prematurely-defined presumed products, rather than to the far more important REAL business requirements content that the products must address.
How about making sure that you have a reasonably unambiguous set of requirements in the first place? And that doesn't necessarily entail verbosity; in fact, conciseness complements clarity.
Another issue is getting all parties involved. One department may request a feature that hinders another or increases another department's workload. They may not have the equipment or the manpower to deal with the new changes. I have seen this happen a lot, usually with automating an existing application or process. Somebody almost always forgets a step and it fails. Or, after a few weeks of use, the feature is removed because the "benefit" they thought they would get did not materialize.
Thank you for your comments. @Patrick, I agree completely on the desirability of doing things well in the first place, following a standard of reasonableness, and keeping things simple. Would that it were that simple. I write something that seems unambiguous to me but unfortunately not to you, and vice versa; and I fear the simpler it seems, the more subject it is to multiple and mis-interpretations.

@Todd, I agree too that overlooked stakeholders are perhaps the biggest source of missed requirements. However, I think you will find that confusing features with requirements creates equal or even larger problems of the type you describe. It is natural to think that particular features which seem like the answer to one’s problems are the requirements; but it is equally common that those features turn out not to provide desired benefits, because features are responses to presumed requirements rather than the requirements themselves. When what I call the REAL business requirements _whats_ that provide value when met are not defined adequately, it is unlikely the features _hows_ can satisfy them.
Software requirements are never fully developed, ever. This is part of why waterfall style development struggles so much. While a person is off crafting a detailed requirement based on what they understood from customer conversations, the customers needs and desires are evolving.

It's hard to hit a target date when the specs keep changing. We started having users sign off on their request. This way one project did not monopolize our time. In the past they kept adding revisions and modifications during the testing project, and it took forever to close the request.
In my analysis, the main reason software requirements are never fully developed and specs keep changing is that people leap to what actually is high-level design of the software product _how_ without adequately understanding what I call the REAL business requirements deliverable _whats_ that provide value by achieving objectives when satisfied by a product/system/software _how_.

Understandably, product designs change rapidly when what they need to accomplish has not been identified, mainly because conventional practices focus almost entirely on the product being built, which too often turns out not to be what is needed. In contrast, REAL business requirements tend not to change nearly so much. What changes is mainly awareness of the REAL business requirements, which unfortunately occurs in the least effective and least efficient manner possible, by wasting time building and rebuilding the wrong software product.

It’s also important to realize that both types of requirements are hierarchical. Prematurely diving into detail of either is always likely to lead to difficulties. Start by discovering the top-level REAL business requirements, selectively elaborate them, and only then bother to design/specify requirements features of a software product that will satisfy the selected REAL business requirements. That’s the quickest and cheapest way to deliver working software that actually provides REAL value.