As cloud computing continues to mature, one is hard pressed to identify a class of enterprise software that is not delivered and consumed as a service. Performance and load-based application testing, important parts of application lifecycle management (ALM), can be counted among these cloud offerings. Moving these functions to the cloud offers typical cloud benefits, most notably lowered capital and operational costs, and support for distributed development teams. But cloud-based testing also changes the way the tests themselves are performed. These changes come at a time when more and more organizations are looking to software as their competitive differentiator.
"Every enterprise is a software company, regardless of what their vertical is. Many of them are building more lines of code than major software companies per year. Software is the competitive difference in what everyone is doing now," says Theresa Lanowitz, founder and analyst, voke.
One of the biggest challenges in application lifecycle management, according to Lanowitz, is performance. "Performance will make or break whether or not someone is going to use your app. If you think about the type of apps you use -- enterprise or personal apps -- performance is the determining factor, so make sure that performance is there and that you're able to test appropriately for performance."
This is especially true of Web and mobile applications. Fortunately, cloud-based performance and load testing tools make it easier than ever before to ensure that internal enterprise apps as well as external customer-facing applications can handle user demand. There are three characteristics of cloud-based testing services that change the way HTTP and HTTPS applications are tested:
Testing at scale
Cloud-based testing providers offer a cost-effective means of testing applications at scale -- as opposed to a lab environment that simulates a small subset of the production environment. Instead of testing an application against a fraction of its users and extrapolating the results to production scale, a cloud-testing provider can test the application against the actual number of expected users. SOASTA, for example, offers CloudTest, a functional and performance testing service for Web and mobile applications. For performance testing, SOASTA uses cloud servers to simulate the traffic that would come from users visiting a website.
Similarly, cloud-based testing tools enable testing on a global scale, reflecting the regions from which users actually access the application. This is often done through partnerships with other cloud providers, such as Amazon and Rackspace. For example, Blitz by Mu Dynamics lets customers run load tests that simulate millions of concurrent users across multiple continents.
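The load-generation model behind these services can be illustrated with a minimal sketch. The script below uses only the Python standard library; the local server, port number, and user counts are arbitrary choices for illustration, not any vendor's API. It stands up a throwaway HTTP server and drives it with a pool of concurrent virtual users, recording per-request latency:

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class QuietHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, *args):
        pass  # suppress per-request logging so the results stay readable

def start_test_server(port=8765):
    """Run a throwaway local HTTP server in a background thread."""
    server = http.server.HTTPServer(("127.0.0.1", port), QuietHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

def timed_get(url):
    """Issue one GET request and return its latency in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def load_test(url, concurrent_users=10, requests_per_user=5):
    """Simulate a fixed number of concurrent virtual users."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(timed_get, url)
                   for _ in range(concurrent_users * requests_per_user)]
        return [f.result() for f in futures]

server = start_test_server()
latencies = load_test("http://127.0.0.1:8765/")
print(f"{len(latencies)} requests, "
      f"avg latency {sum(latencies) / len(latencies) * 1000:.2f} ms")
server.shutdown()
```

A lab setup like this tops out at whatever load one machine can generate; the cloud services described above distribute the same pattern across many servers and regions.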
Testing production apps
In addition to testing test and stage applications, cloud-based testing tools can be used to test production applications. This is, according to Sven Hammar, founder and CEO of Apica, "where you have all the complexity, all the right servers, the right number of users, and you get more feedback on the problem." When testing in production, you're testing at maximum capacity, and different problems arise than at medium capacity. As a result, you get a more realistic picture of what can go wrong and can make adjustments before users run into problems.
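Finding the load level at which those different problems appear can be sketched as a step ramp: raise the number of concurrent users until the error rate crosses a threshold. This is a generic illustration, with function names and thresholds invented for the sketch, not Apica's methodology:

```python
import urllib.error
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def probe(url):
    """Return True for a successful response, False for any failure."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def ramp_test(url, start_users=5, step=5, max_users=50, error_threshold=0.05):
    """Raise concurrency step by step; stop once errors exceed the threshold.

    Returns a list of (concurrent_users, error_rate) pairs, one per step.
    """
    results = []
    users = start_users
    while users <= max_users:
        with ThreadPoolExecutor(max_workers=users) as pool:
            outcomes = list(pool.map(lambda _: probe(url), range(users * 2)))
        error_rate = 1 - sum(outcomes) / len(outcomes)
        results.append((users, error_rate))
        if error_rate > error_threshold:
            break  # this load level is where the app starts to break down
        users += step
    return results
```

Run in production, a ramp like this would target the live endpoint during a low-traffic window, so the point where errors climb reflects the real environment rather than a lab approximation.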
Advice for using Software Testing as a Service
When it comes to using tools like SOASTA, Blitz and Apica, Lanowitz offers several recommendations. First off, she says, "When using a test tool in the cloud, make sure you understand how licensing is working. How are you going to pay that vendor for using that tool in the cloud? Understand what you're paying that tool vendor for and how your costs are going to be affected as you attempt to test for more users. Be aware of the hidden costs and be able to identify what your total cost is going to be."
Secondly, Lanowitz advises organizations to understand the software vendor's roadmap, including how the vendor plans to communicate changes across the development lifecycle and how test results are reported. "Understand how to interpret, read and act on the advice from the tool," she says.
Finally, "Do a proof of concept when adopting a new tool," says Lanowitz. Determine the two or three tools that you think you might want to adopt and do a proof of concept on each one, looking at integration with other tools in use, how the tool works with your different platforms and, again, understanding the costs and how you'll be paying for them, she says.