Companies that move to cloud testing often arrive at a more streamlined and communicative application lifecycle, said Theresa Lanowitz, analyst with Voke Inc. When developers and testers communicate, she said, much of the exchange is back-and-forth over defects that appear only in test environments.
"Testers are now able to say, we've tested this software in a virtual environment in the cloud, here's the defect and here's a link," said Lanowitz. "And the developer can click on that URL, see where the defect is and fix it."
Eliminating the "back-and-forth tension" between developers and testers speeds up the test cycle and lets testers focus on more strategic work, Lanowitz said.
Retailer uses cloud to speed up testing
As startup M-Dot Inc. prepares to roll out its digital coupon and point-of-sale applications at 200 grocery stores in August, CTO Mike Kavis already knows what 1 million concurrent users would do to its in-house systems. "In the on-premises world you could never afford to do that," he said. "Even if you could, to set up and care for those servers could take months and months."
For Kavis, the ability to spin up a test lab at will and scale it to nearly any size has enabled a much more flexible style of application testing. He said his team can start testing as early as the prototyping phase and continue to test more frequently as it creates more services and optimizes the applications. The underlying goal is to root out bugs before they get into production. In the past, he said, load testing to scale was rarely even feasible.
"Where I came from, there wasn't a whole lot of money for hardware and testing, so basically you tested within the limitations of your own infrastructure," Kavis said. "We would always find errors in production because we just didn't have the horsepower to really pound at it."
Virtual labs can liberate testers from IT
Cloud testing gives QA managers control over their test labs, said Ron Yun, QA director at mortgage software vendor Ellie Mae Inc.
Before setting up an on-demand, virtual test lab, QA teams at Ellie Mae had to rely on the company's IT department to schedule and configure internal machines. But IT's top priorities were usually production issues, and test projects often had to wait.
"Our request, which is an internal request to them, comes at the bottom of their priority list," said Yun. "It could take days to weeks to satisfy a request."
Last summer the company signed up with virtual lab provider Skytap Inc. and, suddenly, the test lab was available on demand from any workstation and could scale up or down as needed. Now IT is virtually out of the picture and QA teams can carry out their functional, load and performance tests without having to fuss over physical machines, said Yun.
Yun's testers can now collaborate in real time, keystroke by keystroke, with a second team in Beijing. Before, as much as a day could pass between the offshore team sending data and screenshots and a problem actually being addressed, Yun said. He estimated Ellie Mae's testing operations are now five times faster.
Getting IT's attention and finding enough hardware are problems facing testers at many companies, said Tom Lounibos, president and CEO of SOASTA, and that bottleneck holds back production.
"The testing of applications over the last 10 years has been a very slow process," said Lounibos. "There were a lot of people involved and it could take four to eight weeks just getting the hardware and setting up the tests." In a cloud testing environment there are often intelligent agents that log user actions to generate test scripts, he continued, and testers can spin up their configurations of choice in minutes.
Most of SOASTA's customers have been companies with consumer-facing Web applications. With many applications now absorbing sudden traffic from online marketing campaigns and connections to social media sites like Facebook, unprecedented traffic spikes can cripple an application without warning.
Large-scale testing not always necessary
Not all applications are equal, however, and neither are their needs in the test lab. While M-Dot's application is designed for unpredictable load spikes, JetBlue's website has fairly predictable traffic, said Sagi Varghese, QA manager at JetBlue. He said people often misunderstand the need to load test at tremendous scale.
"If a system is designed to handle X number of concurrent connections then, theoretically, you can only run that many concurrent virtual users," said Varghese. "So if the system is built to handle only 500 concurrent connections, if you give it a million hits it will process the first 500 and put everything else into a queue."
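The queuing behavior Varghese describes can be sketched in miniature. This is a hypothetical illustration, not JetBlue's actual stack, with scaled-down numbers standing in for the 500-connection cap and the flood of hits: a semaphore admits only a fixed number of simultaneous requests, and everything beyond that simply waits its turn.

```python
import threading
import time

MAX_CONCURRENT = 5    # stand-in for the "500 concurrent connections" cap
TOTAL_REQUESTS = 50   # stand-in for the much larger burst of incoming hits

slots = threading.Semaphore(MAX_CONCURRENT)
lock = threading.Lock()
active = 0
peak = 0              # highest number of simultaneous requests observed

def handle_request():
    """Simulate one request; blocks (i.e., queues) once all slots are busy."""
    global active, peak
    with slots:
        with lock:
            active += 1
            peak = max(peak, active)
        time.sleep(0.01)  # simulated processing time
        with lock:
            active -= 1

threads = [threading.Thread(target=handle_request) for _ in range(TOTAL_REQUESTS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Peak concurrency never exceeds the cap, no matter how many hits arrive.
print(f"peak concurrency: {peak} (cap: {MAX_CONCURRENT})")
```

This is Varghese's point in code: hammering the system with more virtual users than its connection limit doesn't raise concurrency past the cap; the excess just lengthens the queue.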
JetBlue QA teams use HP LoadRunner on internal machines because LoadRunner supports a wide variety of protocols, Varghese said. He plans to evaluate LoadRunner in the Cloud when it becomes available.
The biggest frustration facing QA teams at JetBlue is trying to simulate real-life scenarios on new systems, Varghese said. With no historical load data to draw on in a new system, he said, the company's licenses for 1,200 simulated users can only go so far.
"Where the cloud comes in is the need to scale the size of the simulations up and down," Varghese said. "That is very hard to do with fixed assets."