As he sat staring at the email requesting his presence at the requirements review session, his trip to the dentist...
later that day suddenly didn't seem so bad.
All kidding aside, software requirements review sessions are anything but fun. The prospect of walking through page after page of documents, dissecting paragraphs and terms for hours on end to verify completeness, accuracy, clarity, and fulfillment of business needs, is daunting.
As difficult as these sessions can be, they're often made even worse by the fact that people regularly don't read the documents ahead of time, come to the meeting without all relevant materials, are distracted by "urgent" issues on their mobile device, or drift off pondering an upcoming dinner with the in-laws. Simply stated, these requirements walkthroughs are not very effective and end up getting repeated multiple times. Even then, people sign off not so much because they have confidence in the result, but because of schedule pressure and reduced risk of personal impact ("I'm not the only one who signed," "There's enough wiggle-room," "We can just blame it on the developers or testers" …). My personal favorite was, "Your absence from the meeting will be interpreted as acceptance of the requirements."
It doesn't have to be this way
What if software requirements review sessions could actually be fun? If people actually requested to be on the invite list? If attendees could feel confident about what they are signing their name to? As the world of business analysis matures rapidly, so does the requirements walkthrough.
When the communication of requirements via "legal-style" documents begins to break down, think about how people communicate. They become animated, talking with their hands, drawing on a whiteboard. The existing application may be started up, and they point to the areas to be changed. Block diagrams or workflow pictures start to emerge. They may even dedicate some time to build a small prototype. This is great stuff. It can be very expressive and a great way to get an idea across.
But these great techniques often fall down. Why?
- The techniques are inconsistent across people. Some are good at it, and some aren't.
- It's difficult to impossible to ensure quality (completeness, consistency, integrity, etc.).
- It's virtually impossible to maintain integrity of this information as things change.
- It's inflexible in that you can't easily "drill down" to the details, or abstract up to the big picture when needed.
- Typically there's an array of tools used to support the various techniques and they're not integrated.
So how do we get the benefit without all the bad stuff?
The goal of any requirements walkthrough is to let your team be expressive without sacrificing the manageability of the information. Look for solutions that allow you to ensure the ongoing integrity of the information you're dealing with, while at the same time letting you communicate it in a variety of ways to suit the message and audience.
Ideally you want a solution that enables you to both create a requirements model and simulate that model. The model provides the "nuts and bolts" of the solution. It lets you ensure everything fits together, that there are no overlaps or inconsistencies. It lets you drill down to the details or up to the big picture, and gives you a framework to maintain the information as things change.
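No specific tool is named here, so as a rough sketch of what such a model might look like under the hood, here is a toy Python version (all class names, IDs, and checks are hypothetical illustrations, not any vendor's actual API). It shows the two properties described above: an integrity check that catches dangling or untraceable requirements, and a "drill down" from the big picture to the details.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Requirement:
    rid: str
    text: str
    parent: str | None = None   # link upward toward the big picture

class RequirementsModel:
    """Toy requirements model: a hierarchy plus a basic integrity check."""

    def __init__(self, root_id: str):
        self.root_id = root_id
        self.items: dict[str, Requirement] = {}
        self.children: dict[str, list[str]] = {}

    def add(self, rid: str, text: str, parent: str | None = None) -> None:
        self.items[rid] = Requirement(rid, text, parent)
        if parent is not None:
            self.children.setdefault(parent, []).append(rid)

    def check_integrity(self) -> list[str]:
        """Flag dangling parent links and requirements with no trace to the root."""
        problems = []
        for req in self.items.values():
            if req.parent is not None and req.parent not in self.items:
                problems.append(f"{req.rid}: parent '{req.parent}' does not exist")
            if req.parent is None and req.rid != self.root_id:
                problems.append(f"{req.rid}: not traceable to the business need")
        return problems

    def drill_down(self, rid: str) -> list[str]:
        """Everything beneath a requirement -- the 'details' view."""
        out = []
        for child in self.children.get(rid, []):
            out.append(child)
            out.extend(self.drill_down(child))
        return out

# Build a tiny model: business need -> use case -> detail.
model = RequirementsModel(root_id="BIZ-1")
model.add("BIZ-1", "Support trade capture")
model.add("UC-1", "Enter a trade", parent="BIZ-1")
model.add("UC-1.1", "Validate trade fields", parent="UC-1")
```

The point of the check is the "everything fits together" guarantee: add a requirement whose parent doesn't exist, and `check_integrity` reports it immediately instead of the gap surfacing during a walkthrough.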
Simulation is the ability to bring the requirements to life similar to a prototype that you can execute and interact with. Simulation lets you be highly expressive when it comes to communicating with stakeholders -- it's sort of like watching the movie instead of reading the book. Ideally, you also want "multi-aspect" simulation. This is like letting each stakeholder independently control their own camera angle when watching the movie. Future end users can view the simulation from a user-experience perspective, interacting with functioning screen mockups of the user interface, and the designers and developers can view the simulation from a workflow perspective, considering business rules and monitoring how data will be used and acted upon.
Seeing the requirements from these different angles lets everyone truly understand the envisioned application. Feedback is informed and far more valuable. The number of review cycles, as you might imagine, drops.
So what does one of these requirements walkthroughs look like?
One of the first issues is that it's difficult to schedule a software requirements review with all the people you need. There are four approaches that solve, or at least mitigate, this problem:
- Attempt to coerce the person into joining (e.g., have their boss make them attend).
- Have them review the requirements on their own and submit feedback.
- Make them want to come to the session.
- If the reason they can't attend is primarily geography -- they're not at your location at that time -- review remotely via Web meeting technology.
Coercion is likely the least effective option since, even if you do manage to get them to attend, you haven't set the stage for a cooperative, collaborative environment. Having those involved review the requirements on their own generally isn't as good as reviewing them together in person, since things can be misinterpreted which can take more cycles to resolve. Having people actually want to attend the session is the ultimate goal. Accomplishing this basically means two things -- make it valuable to them, and remove the pain. Using new requirements communication technologies like the simulation described above can achieve this. Let's walk through a scenario of preparing and conducting such a session.
Know your audience: Provide the right advance material to the right people
Jay finishes his model and has plenty of time to prepare for the scheduled requirements walkthrough three days from now. Remembering a lesson learned from the last session, he auto-generates the requirements document and a traceability matrix from the model, sends it to the color printer, and leaves a hardcopy on Mark's desk. Mark is perhaps the only team member who always reads requirements documents before review sessions. He's been doing it consistently for years and from the last session it's obvious he isn't going to be changing this habit easily, even with the new simulation approach. The generated document is a complete account of the requirements, from the high-level business need through to use cases, screen mockups, and data dictionary with complete traceability throughout. Once this is done, Jay generates all the functional tests. These have complete coverage of the requirements in the model. These are primarily for John, the QA lead, who wasn't able to come to the first session. John's boss Terri was at the first session, so using a filter Jay generates only the changed tests for Terri. Having QA walk into a requirements review session with a complete set of corresponding tests in hand wasn't even dreamt of before.
In advance of a session, you need to consider whether it would be more or less effective to provide documentation. On one hand, it gives people an opportunity to pre-read so their input is potentially more valuable, and they can take notes during the session. On the other hand, you could be giving them a distraction. The goal is to have everyone on the same page during the session, and having documents can be an invitation to wander. In this example, Jay weighed the alternatives and supplied specific documentation to certain individuals. The point is that you need to think this through and make the call.
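The generation step Jay performs -- a complete test suite for John and a changed-only subset for Terri, plus a traceability matrix -- could be sketched as follows. This is a hypothetical illustration (the dict-based "model," IDs, and field names are invented), not the output of any particular requirements tool:

```python
# A toy requirements "model": a plain list of dicts with a changed flag.
# A real tool would have its own schema; everything here is illustrative.
requirements = [
    {"id": "UC-1", "text": "Enter a trade",  "changed": False},
    {"id": "UC-2", "text": "Amend a trade",  "changed": True},
    {"id": "UC-3", "text": "Cancel a trade", "changed": True},
]

def traceability_matrix(reqs):
    """Map each requirement ID to the test IDs that cover it."""
    return {r["id"]: [f"TEST-{r['id']}"] for r in reqs}

def generate_tests(reqs, changed_only=False):
    """One functional test stub per requirement; optionally only changed ones."""
    selected = [r for r in reqs if r["changed"]] if changed_only else reqs
    return [f"TEST-{r['id']}: verify '{r['text']}'" for r in selected]

full_suite = generate_tests(requirements)          # complete coverage, like John's
delta_suite = generate_tests(requirements, True)   # changed-only filter, like Terri's
```

Because the tests and the matrix are derived from the same model rather than written by hand, they can't drift out of sync with the requirements -- which is what makes walking into a review "with a complete set of corresponding tests in hand" feasible.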
Lead through simulation
Jay begins the session with a quick five-minute overview of the simulation approach and the technology they'll be using. He then sets up a simple window layout for the screen: an area that displays workflow, a large area for screen mockups, and an area to record comments. Jay starts by running through the textual requirements they reviewed last time and showing the changes made. When everyone agrees with the changes, he launches the simulation and walks the group through the main scenarios he selected beforehand.
Performing a review using simulation is a new approach for most stakeholders. First impressions are key. Make sure you plan out how you're going to introduce people to this new approach and technology and get them comfortable with it so their focus of attention is on the content during the review.
The layout of your simulator is also very important. Different configurations are often available depending on what aspect you're trying to see. You need to try the various configurations beforehand and decide on one that is optimal for communicating with your constituents. For example, if you only want to communicate the effects on data at certain times, this information should be hidden at all other times so as not to clutter the experience.
Start with the "prime capabilities"
Even moderate projects have enough requirements to be overwhelming. The manner and sequence in which they're presented is important if you're going to communicate effectively. I recommend starting the walkthrough with main scenarios, or "prime capabilities" as some call them. Only after these are reviewed and agreed upon should you begin to tackle the various alternate and exception cases.
The simulation consists of a mixture of low-fidelity wireframe screens interspersed with very detailed, realistic screens in areas where the behavior is complex or the changes are very detailed. During the very first scenario, the end users question the sequencing of screens used to accomplish the task being shown. After a bit of discussion, it quickly sinks in. The two subject-matter experts sheepishly look at each other, realizing they've missed something substantial. Assumptions were made, and they hadn't realized that a few things had changed on the trading floor since they used to be there. It's clear from the looks around the room that everyone is thinking the same thing: "If we hadn't caught that…" People around the room unconsciously lean a little forward. The manager in the corner looks up from his BlackBerry and stops typing. Everyone is engaged. Two smaller issues are quickly discovered -- one by John, who continuously switches focus between the simulation and his auto-generated tests. Then, a few moments later, a second significant issue is exposed.
Apply basic facilitation rules
I've seen this happen several times, and just this quickly and dramatically. Some basic facilitation rules apply here -- ensure you keep the session on track. Record comments, make everyone feel included by prompting the quiet ones when appropriate, use the "parking lot" when conversations stray in order to record items to be dealt with later, and keep to your scenario plan in spite of people wanting you to go down alternate and exception paths. Before each scenario, state the path you're going to go down and do not stray.
The entire room is now intently focused on the simulation, everyone trying to be the next to spot a future problem from their unique perspective. Jay is capturing comments almost continuously in the simulator alongside the screens and workflows. Determined to keep up, he now wishes he'd taken the advice to have a colleague focus on facilitation while he runs the simulation. With a little more practice, he'll even be able to modify the model live in the session, facilitating collaboration by running "what-if" scenarios.
Team up: Scribe + Facilitator = Success
These sessions often generate a significant amount of feedback, so it's helpful to have a colleague run the first session with you -- one plays the role of scribe while the other facilitates. You can exchange roles if the session is long, or when you switch to a different set of requirements.
Do a dry run together beforehand
It is highly recommended that you dry-run a portion of the session beforehand. There's no better way to expose shortcomings and identify where you need to focus attention.
After an hour and a half, Jay zooms out on the workflow diagram to see all of its parts shaded, indicating that everything has been covered. He commits the model and its new comments, creating a new version. Many begin to realize they've just finished in less time than was scheduled for their typical requirements walkthrough. People linger in the room, talking about all the problems they uncovered and guessing what the costs would have been had those problems gone undetected. Jay thanks everyone, closes the lid of his laptop, and heads down the hallway with a couple of people in tow, peppering him with questions about the product and the new process.
This isn't your father's software requirements review session. The unthinkable has happened. People from the business, end users, architects, designers, developers and testers are all engaged -- collaborating, exposing and solving issues, and doing so early in the development cycle. Not only that, they're actually enjoying themselves! The effect of discovering issues quickly becomes noticeable at the project level, as project durations shorten and fewer crises occur. This isn't a dream. It's actually happening at a growing number of companies today. Requirements walkthroughs have become fun and the trip to the dentist has returned to its rightful status as the most loathed appointment on the calendar.