My son Nicholas learned a hard lesson today. Nicholas is 8 years old, and this is his first season playing tackle football in what's known as the rookie league. As it turns out, he is pretty good for an 8-year-old and has earned himself a starting spot on both the offensive and defensive teams. During today's game, Nicholas accounted for 115 offensive yards, recovered an onside kick and saved at least one sure touchdown by the other team with an open field tackle. These statistics are made even more impressive when you consider the fact that 8-year-olds play only 10-minute quarters and play on an 80-yard football field. So basically, my son had the best game of his, albeit short, football career.
So, what's the lesson? As it turns out, my son's team lost in overtime, and Nicholas was heartbroken. It's understandable for him to be sad that they lost after he had such a great game. But the lesson I'm talking about isn't really about winning and losing. The lesson came when the coach started debriefing the team. The coach said, "I know a few of you played very hard and had very good games today, but that's not what's important right now. What matters right now is that as a team, you played horribly."
Having attended the preseason games, more than a few practices and three other games this season, I have to admit that I agree with the coach. You see, because of mental errors and weak discipline, the team had also lost almost as many yards in penalties as my son had gained. Some players paid so little attention that they executed the wrong play on the field, and two of the players on our team even got into a fight with each other on the field.
Now, I know these are 8-year-old boys who are bound to make mistakes, and they are playing football to learn and have a good time. Still, the coach was right. Even though there were some exceptions, the team as a whole simply played sloppily and self-centeredly compared to how I have seen them play previously.
What my son learned was that sometimes it doesn't matter how hard one boy tries or how well he does his job on the field; when the team doesn't come together and play as a team, the team rarely wins. It's a lesson I hope he remembers and applies throughout the rest of his life. Maybe this lesson will stick with him in a way that encourages him to remember to sit back and take stock of the team as a whole during the heat of the game in addition to noticing his individual performance. Maybe the next time the team isn't fully engaged, he will step up and encourage his friends to play harder. Maybe this is how good team leaders are born.
Testing as a team
What does my son's experience have to do with software performance testing? Quite a bit, really. Over the years I've watched most software development teams treat the performance tester as an individual unit, one who barely needs to interact with anyone on the team other than to get a few questions answered from time to time or to answer some questions about the content of his results. In fact, the performance tester frequently ends up spending his time far away from the rest of the team in an office where the load-generation equipment is housed. And he works mostly off-hours while the rest of the team is gone because the company doesn't allow significant loads to be generated on the corporate network during business hours.
This kind of situation frequently leads the performance tester to feel the same kind of frustration that my son felt. Over and over again, performance testers express to me how frustrating it is that, no matter how hard they work, no matter how much data they collect, or how accurate that data is, they feel as if they have virtually no impact on the actual performance of the application when it goes live. That is probably why my most frequently requested conference talk, since I first gave it in 2003, has been the one about how to build and manage an effective, collaborative performance testing and tuning team.
I'll admit, there are a lot of things that are just hard about performance testing, but one thing that shouldn't be hard is understanding that conducting performance testing in isolation, as opposed to collaboratively with the team, is simply a mistake.
It makes more sense to think about performance testing and tuning as a team sport in which the following takes place:
- The team works together collaboratively
- The team is informed and involved in the decision-making process about what to test next
- The team generates ideas about how the performance tester can help isolate performance issues using the tools that are exclusively available to him
- The team finds out about performance issues in real time
- The team gets to work together to help find functional errors under load
- The members of the team who best understand the various components of the system under test get to participate in analyzing the results data in real time
- The performance tester gets to sit next to the developer during the tune-retest-retune-retest cycle (frequently shortening that cycle from days or weeks to hours)
Maybe I should rethink my conference talk about building and managing collaborative performance testing and tuning teams. Maybe the point would be better made using a football analogy, focusing on the reality that one person, or even a few people, doing a good job -- even doing an exceptional job -- is not enough. When everyone on the team does his job and works together to support one another, the net result is almost always dramatically more desirable.
About the author: Scott Barber is the chief technologist of PerfTestPlus, vice president of operations and executive director of the Association for Software Testing and co-founder of the Workshop on Performance and Reliability.