Transparency is highly touted as an attribute of agile development. Project team members, management and stakeholders all need to be on the same page about the status of a project. Team members may say everything is on track, but without any evidence or artifacts, how does anyone know? In this tip, I describe an experience I once had in which, because of a very user-unfriendly requirements management tool, no one understood the status of the project. When I exported the data onto a whiteboard, the status became visible, producing some interesting findings and reactions.
Requirements tool allows input but doesn't provide output
I once worked on a project that used a software tool to manage its requirements. The tool made it very easy to enter requirements, but very difficult to read them once they had been entered. Among other things, the requirements were managed in hierarchies, but the tool made it impossible to see what level of the hierarchy an individual requirement belonged to. That is, you could see the lowest-level requirements, but there was no way to know from which higher levels each had been derived. It was extremely confusing.
So no one read the requirements. Ever. The development staff just coded away based on conversations and guesses.
Eventually I started spelunking in the requirements management tool, and I found any number of requirements that were a complete surprise to me. There were well over 500 individual requirements for the whole system, and there was no way to know how or when each one had been entered. It was extremely frustrating, and to me as a tester, this looked like big trouble.
Since the requirements management tool was essentially write-only, I took it upon myself to find a way to extract all of the requirements from the tool and to represent them to the rest of the team in some reasonable fashion. James Bach's notion of the "low tech test dashboard" was several years old by then, and the agile community was excited about the related notion of "big visible charts." I thought I might be able to do something useful along those lines. If I could make some sort of sense of this set of requirements, then display the team's progress in meeting those requirements in a meaningful way, it would help our planning and analysis.
Exporting the data from the tool to a whiteboard
The requirements tool allowed only one kind of export: to a Microsoft Access database. But Access can export to an Excel spreadsheet, and from the spreadsheet I could bring some scripting tools to bear to get a sense of the areas of the system the requirements described. Because of the way the exported data were laid out in the spreadsheet, it took some Perl hacking to organize them into a form that made sense to a casual observer.
After a day or two of fussing with the exported requirements data and my Perl scripts, I was able to sort all of the 500+ system requirements into about twenty groups that I named "feature areas." In the room where we held daily team meetings, I took over a whiteboard and wrote down my twenty feature areas. Next to each feature area I put a green smiley face if the feature was tested and known to be working, a red frown face if it was tested and known not to be working, and a yellow ambivalent face if the testing/QA department had no information about it.
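The grouping step can be sketched as follows. This is a minimal illustration, not the original scripts: the column names, sample requirements and keyword mapping are all hypothetical, and it uses Python where I actually used Perl against the Access/Excel export.

```python
import csv
import io

# Hypothetical sample of the exported requirements data; the real export
# came out of the tool via Access and Excel with different columns.
EXPORTED_CSV = """req_id,description
R-001,Login screen shall accept a user name and password
R-002,Report module shall export totals to PDF
R-003,Login session shall expire after 30 minutes
R-004,Report module shall print monthly summaries
"""

# Keyword-to-feature-area mapping (assumed for illustration; the real
# grouping was worked out by reading the requirement text by hand).
FEATURE_KEYWORDS = {
    "login": "Authentication",
    "report": "Reporting",
}

def feature_area(description):
    """Assign a requirement to a feature area by keyword match."""
    text = description.lower()
    for keyword, area in FEATURE_KEYWORDS.items():
        if keyword in text:
            return area
    return "Uncategorized"

def group_requirements(csv_text):
    """Bucket requirement IDs by feature area."""
    groups = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        groups.setdefault(feature_area(row["description"]), []).append(row["req_id"])
    return groups

if __name__ == "__main__":
    for area, req_ids in sorted(group_requirements(EXPORTED_CSV).items()):
        print(f"{area}: {len(req_ids)} requirements")
```

With 500+ rows instead of four, the same kind of keyword bucketing collapses an unreadable flat list into a handful of named areas that fit on a whiteboard.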
My whiteboard was definitely a "low tech testing dashboard," because it showed the state of the testing effort, but it was also a "big visible chart" in the agile sense, because it showed status and progress for the whole project.
Whiteboard dashboard exposes project issues
My low tech testing dashboard pointed out an interesting problem. According to the Gantt chart that we were using for official status reporting and release estimation, the project was about 80% complete. According to my low tech testing dashboard, the project was about 30% complete. My chart displayed just a few islands of green in a sea of yellow and red.
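The arithmetic behind that 30% figure is simple: only green (tested and working) feature areas count as done. A small sketch, with an assumed split of six green, four red and ten yellow areas chosen purely for illustration:

```python
# Hypothetical dashboard state: one status per feature area, where
# "green" = tested and working, "red" = tested and failing,
# "yellow" = no test information yet. The split below is illustrative.
statuses = ["green"] * 6 + ["red"] * 4 + ["yellow"] * 10  # 20 feature areas

def percent_complete(statuses):
    """Treat only green (tested, working) feature areas as complete."""
    return 100 * statuses.count("green") // len(statuses)

print(percent_complete(statuses))  # prints 30
```

The Gantt chart, by contrast, counted tasks that had been worked on, tested or not, which is how the same project could report 80% complete.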
Not much changed on my dashboard over the few weeks after I put it in the team room. The yellow parts mostly stayed yellow. One or two yellow parts turned red. The red parts mostly stayed red.
The team met in the room with the whiteboard daily, and it was a very strange experience to hear people reporting that we were close to being ready to release when the big visible chart clearly showed we were nowhere near ready to release.
Eventually, the cognitive dissonance apparently became overwhelming and a manager demanded that I erase the chart.
What I learned
That system shipped. Two years late. And it was very buggy. But that was long after I quit that job. The experience taught me the importance of visibility into a project. Some teams prefer not to expose their lack of progress, but hiding it can only lead to poor quality, dissatisfaction and schedule delays.
Requirements and the team’s progress towards meeting those requirements must be visible if the team has any hope of success. Find a tool that will track and display requirements appropriately, even if it means creating a dashboard on your whiteboard.