Before performance test tools were available, testers measured system performance under load with a stopwatch and...
lots of hardware and people. Then came performance test tools that simulated actual users and hardware. Traditionally, these tools have been priced on the high side because of the value they add in measuring exactly how much load a system or application can handle in a given configuration. Plus, there is great value in not needing to have actual hardware and people in place.
Since system performance is a function of many things, such as code, networks, hardware (client configuration, memory, CPU speed, storage capacity, etc.) and databases, it is almost impossible to predict exactly where a performance failure might occur. Performance tools allow you to take applications to their point of failure by using virtual users.
Performance testing must be precise enough to apply concurrent load at exactly the same instant. With manual performance testing, human reaction times simply can't achieve and maintain the level of concurrency that automated tools deliver.
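To illustrate the point, here is a minimal sketch of what a load generator does at its core: release a group of virtual users at the same instant and time each response. This is plain Python with a stand-in function where a real HTTP request would go; the user count and simulated response time are illustrative assumptions, not behavior of any particular commercial tool.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from threading import Barrier

VIRTUAL_USERS = 25  # illustrative; real tools license these in blocks

def simulated_request():
    # Stand-in for a real HTTP call, e.g. urllib.request.urlopen(url)
    time.sleep(0.05)  # pretend the server takes ~50 ms to respond

def virtual_user(start_barrier):
    start_barrier.wait()               # every user fires at the same instant
    start = time.perf_counter()
    simulated_request()
    return time.perf_counter() - start  # response time in seconds

barrier = Barrier(VIRTUAL_USERS)
with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
    timings = list(pool.map(virtual_user, [barrier] * VIRTUAL_USERS))

print(f"{len(timings)} virtual users, "
      f"avg response {sum(timings) / len(timings):.3f}s, "
      f"max {max(timings):.3f}s")
```

A real tool layers ramp-up schedules, protocol-level simulation, and graphical reporting on top of this core loop, which is exactly where the licensing cost goes.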
The business case for performance test tools is pretty straightforward, especially for major e-commerce Web sites. The cost of not being available to customers for even one hour can be over a million dollars. This level of risk can easily justify the investment in a robust commercial load and performance test tool.
However, there are some situations where the more robust and costly performance test tools are not feasible. For example, if you want to gain experience in load testing but don't work somewhere that owns such a tool, simply getting access to one can be a challenge. Or, you may be a start-up company with a close eye on expenses.
High-end commercial tools
The undisputed leader in performance test tools is LoadRunner by HP. At over 75% market share, this is the commercial tool I hear mentioned most often by my clients and students. It is a great tool with robust functionality. Also in this category are QALoad by Compuware, SilkPerformer by Borland/Micro Focus and Rational Performance Tester by IBM/Rational. All of these tools can simulate large numbers of virtual users on various simulated hardware configurations, as well as monitor and report the results graphically.
These tools have a licensing model based on blocks of virtual users, such as 25 virtual users.

Lesser-expensive commercial tools
The tool I use in my classes on performance testing is WAPT 6. WAPT is very easy to use and allows unlimited virtual users. Pricing starts at $350 for a single-user license and drops to $280 each for 5 users and $210 each for 20 or more users.
One other notable mention is JBlitz Professional 5.1, which costs $99 for a site license and allows up to 250,000 virtual users.
Free performance test tools
The best source I have found for open source testing tools is opensourcetesting.org. In the performance tools section, there are presently 41 tools listed.
The free tool I hear mentioned most often is Apache JMeter. Originally designed for Web load testing, Apache JMeter allows you to test other things as well, such as services, database queries and objects.
Another tool that is commonly mentioned is TestMaker by PushToTest (www.pushtotest.com). This tool is big in testing service-oriented architectures (SOA), but can also be used for load testing Web applications.
Speaking of SOA testing, SoapUI has a very nice performance test feature for SOA in both the free and pro versions. The free version allows you to perform basic load testing of web services. The pro version is only $349 for a one-year license. I use this tool to teach performance testing in my SOA Testing course because of its ease of use. Plus, I like people to be aware of the tool.
Not to be left off my list, WebLoad Open Source is also worth evaluating. Developed by RadView Software, this tool has over 113,000 downloads.
I encourage you to visit and browse through the entire list of open source performance testing tools.
However, corporations are a different story. Many companies have prohibitions against using open source or free tools. In addition, it is common to find stringent criteria for obtaining and using tools from smaller companies because of security and/or business viability concerns. Personally, I believe both of these concerns are blown out of proportion. The bottom line, though, is that you will likely need to build a business case for using free and cheap tools based on the value they add as compared to using no tool at all.
Keep in mind that performance testing is a snapshot. A performance test may reveal new performance issues due to any number of changes that may have occurred since the last test. For this reason, I typically advise against short-term leases of a tool. I believe load testing should be performed as part of regression testing.
If performance testing were just a matter of using a tool, everyone would be doing a great job of it. Almost all specialists in performance testing will tell you, however, that before you can apply the tool you first must understand the load and performance profiles. Specifically, you need to know:
- How much traffic (load) do you typically see on your site or application? (In fact, you need to know this before you know what size license you need.)
- What is that traffic doing? (Some may be just browsing; others may be engaged in major activity. So, just because you have 5,000 users on average does not necessarily mean you need to license that many virtual users.)
- What is the process being performed? (Some processes are designed so poorly they actually increase performance problems.)
- Where does that load travel on your site or application? (Specific functions, Web pages, etc.)
- How much growth in load do you expect to see in the foreseeable future?
- What are the performance requirements? (This one is especially tricky to define and get agreement on.)
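As a rough illustration of the first two questions, Little's Law from queueing theory relates concurrency to arrival rate: concurrent users ≈ arrival rate × the time each user spends per page cycle (response time plus think time). The sketch below, using purely assumed numbers, shows why 5,000 sessions per hour can translate into far fewer simultaneous virtual users:

```python
# Rough virtual-user sizing via Little's Law: concurrency = arrival_rate * time_in_system.
# All inputs are illustrative assumptions -- substitute figures from your own traffic logs.

peak_sessions_per_hour = 5000                    # observed peak traffic (assumed)
arrival_rate = peak_sessions_per_hour / 3600.0   # sessions per second

avg_response_time = 2.0    # seconds the system takes per page (assumed)
avg_think_time = 13.0      # seconds a user pauses between pages (assumed)

concurrent_users = arrival_rate * (avg_response_time + avg_think_time)
growth_factor = 1.5        # headroom for expected load growth (assumed)

print(f"License for at least {round(concurrent_users * growth_factor)} virtual users")
```

With these assumed numbers, 5,000 sessions per hour works out to only about 21 truly concurrent users, so even with 50% growth headroom a small virtual-user block would suffice. Commercial tools do a version of this arithmetic for you, but running it against your own traffic data is a useful sanity check before buying a license block.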
While opinions will differ on which tool is best, the one thing I know is that I would rather use a free or lesser-expensive tool than no tool at all. Another way to justify the lesser-expensive tools is that they provide a good learning lab before spending huge amounts of money on a tool that may not work out due to lack of adoption, lack of understanding of performance test design, or other reasons.
As with any tool acquisition effort, you need to perform a good evaluation and proof of concept based on your own stated requirements. It's worthwhile to include the full range of tools in your evaluation, and don't forget to factor in intangible aspects such as support.
Armed with this information, you will be well on the way to performance testing your applications in reliable and repeatable ways.
About the author: Randy Rice is a leading author, speaker and consultant in the field of software testing and software quality. He has over 30 years experience building and testing mission-critical projects in a variety of environments and has authored over 60 training courses in software testing and software engineering. He is a popular speaker at international conferences on software testing and is also publisher of The Software Quality Advisor newsletter. He is co-author of the book, Surviving the Top Ten Challenges of Software Testing, published by Dorset House Publishing Co.