In the morning, Scott Barber presented a half-day workshop on quick and easy performance wins. In the presentation, he separated the hard engineering work of performance testing (say, simulating and cranking up load) from the "easy stuff." To do the easy stuff, Scott had four concrete suggestions:
1) Know the mission
Instead of starting with a document ("100% of web pages shall load and display within five seconds"), Scott proposes a model of conversation to collaboratively explore and figure out what the goal is, before testing begins. He calls this "accept / converse / understand."
2) Am I annoyed?
Scott suggests (and I agree) that it's actually pretty hard to make decisions on 4.8 seconds versus 6.5 seconds. So instead of chasing hard numbers, you can simply have testers write down their annoyance on some scale, perhaps 1-5, at various key points as the test suite runs. (Ideally, do it under simulated load; but if performance stinks for one user, that will tell you a lot in and of itself.)
3) Watch the trends
Scott suggested that developers can write timers, wrap unit tests in those timers, and simply observe trends to see if the software is getting faster or slower. Likewise, the team can do the same for acceptance and human-run testing. Several people in the room mentioned timing performance with a stopwatch.
That’s just the tip of the iceberg. Most of Scott’s talk wasn’t on specific technique, but instead on dynamics of projects and general systems thinking skills.
Then, in the afternoon, we had a panel on innovations in performance testing, followed by a quick mini peer conference where we shared our experiences with perf testing.
In my book, the peer conference idea beats PowerPoint-driven lectures by a mile. Still, I have to say, Scott’s talk was pretty good.