Should a performance engineer be part of an Agile team? If so, should he or she participate equally in every iteration? Or should a performance engineer spend a short time on many projects?
Performance testing presents a quandary for many Agile teams. For many applications, performance is the most important quality attribute. For example, if a search takes too long to return results, users won't wait around to find out whether those results are good. We often need to do performance testing before functional testing. Yet even cross-functional teams may lack expertise in performance testing.
I’ve seen larger Agile organizations try different experiments with regard to performance, stability, reliability, stress and similar types of testing. Some development departments have a few performance testing specialists, but if there are 25 Scrum teams, these specialists quickly get spread too thin. One approach is to try to plan ahead for when a project will need performance testing, and schedule a specialist to join the team just for that period of testing. When performance testing is key and needed often, this approach won’t work.
Another approach is to have performance test specialists rotate through the different development teams and teach them the skills they need to do the performance testing themselves. It’s the old “teach a man to fish” idea.
People may be unnecessarily intimidated by performance testing. Often, it’s good enough for the programmers to create a harness to run unit tests in many threads, simulating a big load on the system.
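As a hedged illustration of that idea, the sketch below shows one way programmers might build such a harness: it runs an arbitrary operation (here a placeholder `action` standing in for a real unit-level call, such as a search) across many threads and reports latency percentiles. The function names and thresholds are assumptions for illustration, not a specific tool's API.

```python
import time
import statistics
from concurrent.futures import ThreadPoolExecutor


def run_load_test(action, workers=20, iterations=200):
    """Call `action` concurrently from a thread pool and
    collect one wall-clock latency per call."""

    def timed_call(_):
        start = time.perf_counter()
        action()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=workers) as pool:
        latencies = sorted(pool.map(timed_call, range(iterations)))

    # Summarize the latency distribution, not just the average:
    # the slow tail is usually what users complain about.
    return {
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(len(latencies) * 0.95) - 1],
        "max_s": latencies[-1],
    }


if __name__ == "__main__":
    # Stand-in for a real operation under test; a team would
    # substitute an actual service or repository call here.
    stats = run_load_test(lambda: time.sleep(0.01))
    print(stats)
```

A team could wire a harness like this into its regular test suite and fail the build if, say, the 95th-percentile latency regresses past an agreed threshold.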
My own small team had no chance of getting a performance testing specialist. We decided to start by evaluating different performance testing tools to find and implement one that worked for us. We had a user story to research available tools and narrow down to a couple of possibilities, then we had a user story to evaluate, choose and implement a tool. We had another story to get a baseline of performance using our staging environment. Now we had a tool, test scripts, and a baseline, but our staging environment wasn’t enough like production to make it useful for performance testing. We had to ask the business for new hardware and software to create a performance test environment that was similar enough to production. The executives dragged their feet on this project until one day when enough users ran a slow-performing report that our system crashed!
When I worked for an internet retail company, we could not possibly afford the hardware and software to simulate a production-size load of users. We contracted with a performance testing provider to do the testing on our actual production environment. We only needed to do this once per year, unless we made a major architectural change that might affect performance. Use monitoring and profiling tools in production to keep an eye on performance, and take action to do some testing and tuning if you see red flags.
Experiment with different ideas and work in small increments to find the performance testing solution that works for you. If the examples I’ve given don’t work in your situation, you may need to hire a performance test specialist full-time. Like other Agile team members, that person should be willing to do other software development activities as needed.