Sometimes you have to see software fail to understand the importance of continuous testing. Dan Bartow, VP of product management at Mountain View, Calif.-based SOASTA, had this realization back in 2007 while working on TurboTax Online. The system crashed on April 15, which, unfortunately, is the tax filing deadline. "I had just joined the company and there I was watching our CEO on CNN explaining to millions of Americans why, for the first time in U.S. history, the IRS extended the tax filing deadline."
Soon thereafter, Bartow understood the root cause of the company's performance problems. It happened to be the same problem that threatened all Web companies. "Their websites were running on massive architectures in production. We're talking hundreds or thousands of servers, multiple data centers, and so on, yet every company I had worked with tested their apps in a scaled-down 'performance lab.'" It occurred to Bartow that if you want to test realistically, you have to test software in production.
Today is where we are seeing the most rapid change in thinking around testing.
Dan Bartow, SOASTA
These days, with the prevalence of mobile apps, not to mention heightened user expectations and accelerated release cycles, continuous testing has become more widely accepted. According to Bartow, however, it is still not widely practiced. He gave a presentation at STPCon 2013 exploring how continuous testing can optimize mobile performance and why more people should be doing it.
Continuous testing gains ground
In 2006, Amazon announced that its revenue increased by 1% for every 100 milliseconds shaved off response time. This alone should be incentive enough for organizations to prioritize mobile performance. The more pressing incentive, however, is the flip side of that finding: for every 100 milliseconds added to response time, revenue decreases by 1%. "Rapid release cycles have been demanded by consumers and the industry as a whole," Bartow said. "Look at how often apps you use on your mobile device are updated with a new version." In fact, the newest version of iOS has added an operating system feature that updates apps in the background, uninitiated by the user. "No one waits till the end of a 30-day cycle of developing and testing anymore to release updated versions of their apps."
This means testing needs to be continuous in order to be effective. Put another way, performance testing needs to be part of the continuous integration (CI) process. "CI servers can drastically speed up time-to-market by automating things in performance testing that have been time-consuming and challenging to do in the past." Bartow offered Jenkins as an example: an open source platform that schedules any task you want automated and then provides results for analysis, such as response times and failures. "We need that kind of proactive approach in mobile performance in the coming years."
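As a rough sketch of what such a CI-driven performance check could look like (the budget value and the stubbed request below are illustrative assumptions, not details from the article), a script like this could run as a Jenkins build step and fail the build, via a non-zero exit code, when a response-time budget is blown:

```python
import time

# Response-time budget in seconds; the value is illustrative.
RESPONSE_TIME_BUDGET = 0.5

def fetch_homepage():
    """Stand-in for an HTTP request to the app under test.

    In a real Jenkins job this would hit the app's actual URL,
    e.g. with urllib or a load-testing tool."""
    time.sleep(0.05)  # simulate a 50 ms response
    return "ok"

def timed_check():
    """Time one request; return (body, elapsed seconds)."""
    start = time.perf_counter()
    body = fetch_homepage()
    elapsed = time.perf_counter() - start
    return body, elapsed

def run_check():
    """Return a process exit code: 0 = within budget, 1 = too slow.

    A CI server such as Jenkins marks the build failed on non-zero exit."""
    body, elapsed = timed_check()
    print(f"response time: {elapsed * 1000:.0f} ms")
    return 0 if elapsed <= RESPONSE_TIME_BUDGET else 1
```

Wired into a Jenkins job with `sys.exit(run_check())` as the entry point, every commit would get a pass/fail performance signal alongside the functional tests.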
But not enough
Bartow believes there is still a long way to go to get customer experience to where it needs to be. "Fundamentally, companies still have the same core testing problems they've had for ten years -- they wait until the last minute to performance test (if they do at all) and they still don't do it comprehensively enough." He put it this way: Ask a room of executives to raise their hands if they think mobile app performance is important, and all hands go up. Ask them to raise their hands if they're happy with their performance engineering operation, and no hands go up. According to Bartow, development teams are no doubt already using a CI server to automate build, test and deploy; testing teams just need to fully commit to the process. "They can start with simple test scripts that smoke test the performance of key app functionality and start assessing that performance improvement or degradation over time."
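A minimal sketch of the kind of simple smoke script Bartow describes (all names and thresholds here are hypothetical): time a key app function several times, take the median, and compare it against a stored baseline to flag degradation over time.

```python
import statistics
import time

def smoke_test(fn, runs=5):
    """Time several invocations of a key app function; return the median in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        timings.append(time.perf_counter() - start)
    return statistics.median(timings)

def regressed(current, baseline, tolerance=0.20):
    """Flag a regression when the current median is more than
    `tolerance` (here 20%) slower than the stored baseline."""
    return current > baseline * (1 + tolerance)

# Hypothetical key function under test; a real script would exercise
# login, search, checkout, etc. against the running app.
def login():
    time.sleep(0.01)
```

Persisting each run's median, in the CI server's build records for instance, gives the over-time trend Bartow mentions, while the pass/fail check itself stays a one-liner.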
That said, his outlook for the future is optimistic. "Today is where we are seeing the most rapid change in thinking around testing." He predicted that within four years about 90% of those surveyed will be kicking off their performance tests through a CI server on a regular basis because they won't have a choice. Until then, there is still progress to be made. "I found that only about 5% to 10% of people I talk to have a performance test kicked off as part of their CI process for Web or mobile apps." The intention is there; it's just the get-up-and-go that's missing, and understandably so. It's not easy to get people on board, change operations and shift priorities. Bartow's advice? "They just need to start doing it -- right now."