At the Software Test and Performance Conference (STPCon) this week, test pro Scott Barber kicked off a performance testing panel discussion with a car metaphor that addressed the session's first audience question: "Are there expectations I should have for performance testing tool vendors?"
Well, replied Barber, suppose you buy a car advertised as accelerating from zero to 60 miles per hour in 15 seconds. "That measurement of performance is based upon optimal testing done on an ideal surface, under perfect atmospheric, barometric circumstances and piloted by a professional race car driver. That advertised performance sells the vehicle, but should not be assumed an easily repeatable performance determiner. That number is achievable, but maybe not by you."
In other words, a software test manager can only determine the true quality and innovation of performance test tools by running those products in his or her own environment. That's the primary message that Barber and his panel mates -- James Bach, Dan Bartow, Ross Collard and Dan Downing -- put to the packed house during this session. They shared their experiences with performance testing to illustrate this message and offered advice on related topics. Barber is the CTO of PerfTestPlus and an author. Bartow is senior manager of Performance Engineering at Intuit, as well as a co-founder of ATG. Collard is a self-proclaimed software quality guru and founder of the consultancy Collard & Company. Downing has 28 years of technical and leadership experience as a programmer. He is also VP and GM of Testing Services at Mentora.
None of the speakers advised against buying products that don't live up to all of a vendor's claims; that approach would probably reduce your purchases to naught. Sometimes, Barber explained, you need to accept what is reasonable. It should be fairly easy to judge whether a purchased tool is running far below its potential; if it is, that's not reasonable, and it is cause for not using or buying the product.
For the most part, software vendors are known to push test managers to sign contracts quickly, Barber said. They don't really know whether the product fits your environment, and they don't study the customer's test environment to make sure the product is a good fit. Essentially, vendors just take "stabs in the dark" regarding your code and application functions. Even so, they should relay performance numbers closer to what a customer can realistically expect.
Pre-defining test criteria with the entire project team before choosing tools is absolutely a must, Bach said. Don't sign a vendor's contract without both parties knowing full well that changes and amendments to the final code may be necessary. Collard added that this practice can stop conflicts before they happen. If conflicts arise, he said, be aware that "very few software client-vendor disputes make it to full trial. Often, they're settled out of court."
When the session shifted to innovation issues, Downing stressed that test managers can judge whether a tool is innovative by whether it provides new and better ways to test. Downing pointed to one type of product that's doing just that today: "Visualization tools allow for the measurement and monitoring of performance traits and have taught us how to run lean and agile through graphs," he said. "I find that incredibly useful."
Downing also warned that the term "innovation" is too widely used by vendors. The terms "cooperation" and "collaboration" are a case in point. The rise of agile development has made cooperation and collaboration "new" vendor buzzwords today. Downing suggested that collaboration as outlined by agile is a renovation of an age-old, proven approach, and asked: "Can we really still call the concept of working alongside our peers as innovative?" Bach agreed: "Didn't we all learn and benefit from the idea since kindergarten?"
While the panelists agreed that test managers need to do due diligence before choosing performance testing products, they also praised software vendors and the open source development community for creating top-notch tools.