When it comes to Agile testing, there are plenty of answers for testing core functionality. Rebecca Wirfs-Brock is interested in the many aspects of non-functional testing, with security, usability, and performance chief among them.
She is joined by Ian Savage, a senior software quality engineer at McAfee. We discuss the challenge of these kinds of measurements: they tend to be ignored until there is a crisis. Even teams that do test for these qualities usually do so only once, toward the end of the project.
Ian explains that in his coaching work, he starts small: begin the project with three key nonfunctional metrics, build tools to measure and report on those metrics continually, and then manage to the metrics. An example is expected performance with 1,000 simultaneous users: while average page response time sounds tempting, it might be better to use average page response time plus one standard deviation, which captures typical worst-case results.
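As a rough illustration of that metric, here is a small sketch in Python. The function name and the sample response times are hypothetical, not from the discussion; the point is simply that mean plus one standard deviation penalizes inconsistent latency in a way a plain average does not:

```python
import statistics

def typical_worst_case(response_times_ms):
    """Mean plus one standard deviation: a 'typical worst case' latency metric."""
    mean = statistics.mean(response_times_ms)
    stdev = statistics.stdev(response_times_ms)
    return mean + stdev

# Hypothetical page response times (ms) from a 1,000-user load run
samples = [220, 250, 310, 205, 480, 260, 390, 240]
print(f"mean:         {statistics.mean(samples):.0f} ms")
print(f"mean + stdev: {typical_worst_case(samples):.0f} ms")
```

Two runs with the same average can score very differently here: the run with a few slow outliers gets a worse number, which is usually what users actually feel.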
While most Agile teams don't measure these things automatically, a handful of tools that measure software attributes can tie into continuous integration automatically. Two we discuss are SOASTA and Telerik's performance-testing tool.
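Neither tool's API is shown here, but the general shape of a continuous-integration tie-in is straightforward: a gate step that fails the build when the metric exceeds a budget. This hypothetical script (the budget value and samples are made up) exits nonzero on a breach, which is how most CI systems detect failure:

```python
import statistics
import sys

BUDGET_MS = 400  # hypothetical performance budget for mean + stdev

def gate(samples_ms, budget_ms=BUDGET_MS):
    """Return True if mean + stdev of response times stays within the budget."""
    score = statistics.mean(samples_ms) + statistics.stdev(samples_ms)
    print(f"mean + stdev = {score:.0f} ms (budget {budget_ms} ms)")
    return score <= budget_ms

if __name__ == "__main__":
    # In CI this would parse real load-test output; these values are illustrative.
    samples = [220, 250, 310, 205, 480, 260, 390, 240]
    sys.exit(0 if gate(samples) else 1)
```

Running a gate like this on every build is what turns a one-time, end-of-project measurement into the continual reporting Ian describes.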
Finally, we discuss how hard it can be to gather these metrics; teams may be better off characterizing the performance of the system and using the reaction to those numbers to generate additional story work if needed.
My lesson learned is this: Develop at least some strategy to address nonfunctional testing early in the process. Even if the strategy is to defer nonfunctional testing, it helps to state that publicly. That way, when a crisis occurs, you will be able to get past blame and move on to fixing the problem.