Expert Matthew Heusser attended the Conference for the Association of Software Testing in August and is sharing five key takeaways for those of us unable to attend.
At the CAST 2017 conference, the opening keynote was delivered by Dave Snowden, creator of Cynefin, the complexity model that took the IT world by storm.
The most obvious insight for testing is that, by attempting to treat complicated or complex work as simple, we run a huge risk. Breaking down requirements, turning simple "the user shall" specifications into test cases ("make sure the user can"), increases the risk that a black swan -- a disruption that seems obvious in hindsight but is hard to predict upfront -- will sneak past the delivery team.
In this CAST 2017 conference presentation, Snowden made another point: Failure can force a growth experience. He mentioned that, in his simulations, when the participants go through failure, they start to scan more items for information. After three rounds of failure, participants are scanning 25 times more pieces of data than when they began.
People with success under their belts, on the other hand, come to expect more success. They might be more willing to take risks; after all, they took risks in the past, and things turned out fine. Like a gambler on a winning streak, they place larger and larger bets right up until the winning stops.
Lessons: People learn from failure, so engineer it. If possible, do it in a simulation or a low-risk environment. Do not get overly complacent.
Take a look at performance from the customer's perspective
Eric Proegler, context-driven test manager at Medidata Solutions in San Francisco, led a two-part CAST 2017 conference session on front-end performance. He taught attendees how to use Chrome's built-in developer tools, then went on to show webpagetest.org, a tool that reports page load and render times and even produces a movie of what the page looks like as it loads. Teams testing servers behind a firewall may need to install a private instance of the open source project that powers webpagetest.org.
Lessons: If scaling is an issue, it's better to be down than slow. Also, check out the speed of the website with a tool like WebPagetest.
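To get a feel for the numbers these tools surface, here is a minimal sketch of the kind of timing summary a front-end performance tool automates. The field names follow the browser's Navigation Timing API, but the sample values are invented for illustration:

```python
# Summarize front-end timing the way a tester reading WebPagetest output
# would: time to first byte, DOM ready and full page load.
# Timestamps are in milliseconds relative to navigationStart; the sample
# numbers below are made up.

def summarize_timing(t):
    """Return the headline metrics a tester would look at first."""
    return {
        "time_to_first_byte": t["responseStart"] - t["requestStart"],
        "dom_ready": t["domContentLoadedEventEnd"],
        "full_load": t["loadEventEnd"],
    }

sample = {
    "requestStart": 120,
    "responseStart": 480,             # server took 360 ms to start responding
    "domContentLoadedEventEnd": 1900,
    "loadEventEnd": 3400,
}

metrics = summarize_timing(sample)
print(metrics)  # {'time_to_first_byte': 360, 'dom_ready': 1900, 'full_load': 3400}
```

In practice you would pull real timestamps from the browser or from a WebPagetest run rather than hand-coding them; the point is that a few subtractions turn raw events into metrics worth tracking.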
Get the right testers on the bus
The second day of the CAST 2017 conference opened with Mary Thorn's presentation on Agile testing at scale. Thorn, director of Agile practices at Ipreo, talked about shifting from an environment with hundreds of thousands of test cases to hitting tight sprint timelines. To get there, she took a few concrete steps. First, she recognized that not everyone would make the leap -- every organization she worked with experienced about a 25% turnover rate during the transition to Agile testing. Referencing Jim Collins' book Good to Great, Thorn said companies need to get the right people on the bus -- hiring for attitude and aptitude first, because skill can be trained. Next, she recommended moving toward risk-based testing, replacing an attitude of "test all the things" with a time-constrained investigation.
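The shift Thorn described, from "test all the things" to a time-constrained investigation, can be sketched as a simple prioritization: score each test area by risk and fill a fixed time budget with the riskiest work first. The areas, scores and durations below are invented for illustration:

```python
# A hedged sketch of risk-based test selection: rank areas by
# impact * likelihood, then greedily fill the available time budget.
# All names and numbers here are hypothetical.

def select_tests(areas, budget_minutes):
    """Pick the highest-risk areas that fit in the time budget."""
    ranked = sorted(areas, key=lambda a: a["impact"] * a["likelihood"], reverse=True)
    plan, remaining = [], budget_minutes
    for area in ranked:
        if area["minutes"] <= remaining:
            plan.append(area["name"])
            remaining -= area["minutes"]
    return plan

areas = [
    {"name": "checkout", "impact": 5, "likelihood": 4, "minutes": 60},
    {"name": "search",   "impact": 3, "likelihood": 2, "minutes": 30},
    {"name": "login",    "impact": 5, "likelihood": 3, "minutes": 45},
    {"name": "profile",  "impact": 2, "likelihood": 2, "minutes": 30},
]

print(select_tests(areas, budget_minutes=120))  # ['checkout', 'login']
```

With a two-hour budget, checkout (risk 20) and login (risk 15) make the cut; search and profile wait for another sprint. Real risk scoring is a judgment call, but even a rough model makes the trade-off explicit.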
Lessons: To ship more often, free the tester to decide what the best use of their time is. While you're at it, lower the documentation burden.
Teach yourself security testing
Chris Garcia and Marion Nepomuceno, security engineers at Hyland Software, led a hands-on session on penetration testing during the CAST 2017 conference. The session started with us installing a Linux virtual machine that runs on VirtualBox, then running the Juice Shop application and testing it with Burp Suite. All the tools are free and available online. Not only is there a free online tutorial to test Juice Shop with Burp Suite, there are even video tutorials on YouTube.
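To illustrate the flavor of these exercises, here is a minimal sketch of one essential security check: whether a page reflects user input back unescaped, a common precursor to cross-site scripting. Tools like Burp Suite automate this kind of probe against a running target such as Juice Shop; the probe string and sample responses below are invented:

```python
import html

# Check whether a raw attack probe is echoed verbatim in a response body.
# Hypothetical example; a real test would send the probe to a running app.
PROBE = "<script>alert(1)</script>"

def reflects_unescaped(probe, response_html):
    """True if the raw probe appears verbatim in the response body."""
    return probe in response_html

# A vulnerable page echoes the search term straight into the HTML...
vulnerable = f"<p>You searched for: {PROBE}</p>"
# ...while a safe page escapes it first.
safe = f"<p>You searched for: {html.escape(PROBE)}</p>"

print(reflects_unescaped(PROBE, vulnerable))  # True
print(reflects_unescaped(PROBE, safe))        # False
```

A positive result doesn't prove exploitability, but it flags exactly the kind of page a tester should dig into with the full toolset.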
Lessons: In a half day or a full day, a good tester can learn essential security tests that find problems faster. As a bonus, it's possible to learn from the convenience of your desk. Still, if you get a chance to attend an in-person session, take it.
Reduce risk on legacy code without testing
The great surprise of the CAST 2017 conference was a presentation by Tina Fletcher, president of the Kitchener Waterloo Software Quality Association and a senior test strategist at Desire 2 Learn (D2L). On the last day, Fletcher talked about the problem of large, complex environments, where a change in one place might cause an unexpected regression somewhere else.
Fletcher pointed out that it is impossible to "test quality into" a product, and that more testing leads to diminishing returns. Instead of more testing, she proposed a series of small improvements, which you might call bronze bullets (instead of silver), that could add up to a big impact.
Those bronze bullets include code stewardship. That's a little different from ownership, where only one person is allowed to make changes to a module or code area. With stewardship, someone with expertise is assigned to mind the store. Stewards can answer questions, are involved in code review and otherwise make sure that proposed changes are consistent with the design and goals of the module. At D2L, stewards don't "approve" changes so much as give input and insight into the process.

Her next bullet is coverage analysis -- using tools to report the percentage of the codebase's statements exercised by regression testing. More important to Fletcher than the percentage is the gap -- which modules are undertested. If an undertested module is the one with the change, Fletcher advised caution, increased coverage and more testing.
In addition, Fletcher recommended working with groups outside the team -- other teams, support or customers -- to see how the software is used in practice, something she calls team shadowing. Finally, she mentioned culture change, specifically using the definition of done to prevent sloppy work. The right definition of done, according to Fletcher, is a get-out-of-jail-free card: if the code doesn't meet it, it can't go into production. After all, it is not done.
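The coverage-gap analysis Fletcher described can be sketched in a few lines: rather than chasing an overall percentage, flag the modules that are both undertested and touched by the current change. The module names, percentages and threshold below are invented for illustration:

```python
# A hedged sketch of coverage-gap analysis: find changed modules whose
# statement coverage falls below a threshold. Real percentages would come
# from a coverage tool; these numbers are hypothetical.

def risky_modules(coverage_by_module, changed_modules, threshold=60):
    """Changed modules with statement coverage below the threshold."""
    return sorted(
        m for m in changed_modules
        if coverage_by_module.get(m, 0) < threshold
    )

coverage = {"billing": 85, "auth": 40, "reports": 55, "search": 90}
changed = {"auth", "search", "reports"}

print(risky_modules(coverage, changed))  # ['auth', 'reports']
```

Here auth and reports are changed and undertested, so they warrant the extra caution Fletcher described; the well-covered search module can ride the normal regression suite.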
Lessons: Code stewards, coverage analysis, team shadowing and culture change might just add up to less risk. That's something I didn't expect to hear at a test conference.