Application performance testing at the best of times is not for the fainthearted. Adding a low budget and a small staff makes it very hard to achieve the goals of performance testing. Unfortunately, that is exactly what many testers face in start-ups, small businesses and even in downsized test teams at larger development companies. Recently, I interviewed veteran software testers Michael Feerick, Trish Khoo and Evan Phelan about how they cope with this problem, and came to these conclusions about best practices for performance testing on a tight budget.
Do you choose to accept the mission?
Before accepting a performance testing task, take a hard look at the consequences of getting it all wrong. If performance testing is strategically linked to the success of the project, speak to stakeholders now. As Trish Khoo advises: "Be honest and upfront. Explain to them that performance testing is a specialized skill. There is a high risk of providing misleading and inaccurate results."
Know your mission
Let's say that you tried the first tip, and you've still been told to "give it your best shot." It's time to work out what you and your business and technical stakeholders want to get out of the performance testing. After all, there are many types of performance testing, such as load testing, stress testing, endurance (soak) testing, spike testing, scalability testing, and baseline or benchmark testing. Which tests best cover the requirements of the stakeholders? Which ones can you skip without compromising stakeholders' goals?
Identify the types of performance testing that will and will not be performed. Personally, I would recommend starting with a simple and easy goal, such as benchmarking current performance, but your stakeholders may have different ideas. Point out the benefits of getting data that you know you can rely on, as opposed to results that may be misleading.
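To make the benchmarking idea concrete, a baseline can be as simple as timing one operation repeatedly and recording percentiles rather than a single average. The sketch below is a minimal, hypothetical example in Python; the `sample_operation` function is a stand-in for whatever request or transaction your application actually performs:

```python
import time
import statistics

def measure(operation, samples=50):
    """Time an operation repeatedly and summarise the timings in milliseconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    timings.sort()
    return {
        "median_ms": statistics.median(timings) * 1000,
        "p95_ms": timings[int(len(timings) * 0.95) - 1] * 1000,
    }

# Stand-in for a real request to the application under test.
def sample_operation():
    sum(i * i for i in range(10_000))

baseline = measure(sample_operation)
print(f"median: {baseline['median_ms']:.2f} ms, p95: {baseline['p95_ms']:.2f} ms")
```

Recording the median and 95th percentile, rather than a single run, gives you a baseline you can defend when later results are compared against it.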
Ask yourself and your stakeholders whether there are other ways of monitoring your application's performance. Karen N. Johnson's great tip on Web application testing and monitoring provides some good how-to advice on application performance testing, too.
Pilot your performance test
All the testers interviewed agree that careful planning and strategy are a must. To come up with a good plan, it may be helpful to run a small pilot performance test on one simple scenario from your application. The pilot test benefits you in a few ways:
- It will help you familiarise yourself with the performance test tool;
- It will give you an understanding of some of the key issues involved in performance testing; and,
- It will help you estimate how much time and effort is required to complete a task.
With a low budget, opt for testing with open source tools like OpenSTA or JMeter. WebLOAD is another option, but the open source version has very restricted functionality. These tools are not always user friendly, but they do provide a good, low-cost alternative. Another low budget alternative is WAPT, which is user friendly, has good documentation and produces some nice charts.
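If you want a feel for what a pilot involves before committing to a tool, the shape of a load test can be sketched in a few lines of standard-library Python. This is a hypothetical illustration, not a substitute for JMeter or OpenSTA: it spins up a local stub server standing in for your application, then runs a handful of concurrent "virtual users" against it and records response times:

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Stand-in for the application under test: a trivial local HTTP server.
class StubHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the console output quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), StubHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def virtual_user(requests_per_user=5):
    """One simulated user issuing sequential requests; returns timings in seconds."""
    timings = []
    for _ in range(requests_per_user):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        timings.append(time.perf_counter() - start)
    return timings

with ThreadPoolExecutor(max_workers=5) as pool:  # 5 concurrent virtual users
    futures = [pool.submit(virtual_user) for _ in range(5)]
    results = [t for f in futures for t in f.result()]
server.shutdown()
print(f"{len(results)} requests, worst: {max(results) * 1000:.1f} ms")
</n```

A dedicated tool adds the things this sketch lacks -- ramp-up profiles, think times, assertions and reporting -- which is exactly why the pilot helps you judge how much of that you actually need.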
Once the pilot is complete, go back and plan and strategise your testing.
For more information on planning, check out Mike Kelly's tip on problem areas to investigate before performance testing.
Be a sleuth
It's time to do some serious investigative work into exactly what you are going to test and what data you are going to need. You and your business and technical stakeholders will need to come to an agreement on the following points:
- The scenarios and the associated test data you are going to use to test the application. These will be the basis of your performance test scripts.
- How many concurrent users or calls do you want per scenario? If you have any current usage data, use it to create your projections; if not, use stakeholder feedback to build some data models.
- How much pre-existing test data do you want?
Getting this information will help you know what test data needs to sit in the system before you start running your scripts. Unfortunately, getting this test data into your system can be time-consuming, and sometimes takes more effort than the performance test scripting or execution itself.
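Scripting the data load usually pays for itself, because you will need to reset the system between runs. As a minimal, hypothetical sketch (the column names and record shape are invented for illustration), pre-existing test data can be generated into a CSV that your load tool or a bulk import can then consume:

```python
import csv
import random
import string

def generate_accounts(count, path="test_accounts.csv"):
    """Write a CSV of synthetic account records for pre-loading the system."""
    random.seed(42)  # a fixed seed makes runs reproducible and comparable
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["account_id", "username", "balance"])
        for i in range(count):
            name = "".join(random.choices(string.ascii_lowercase, k=8))
            writer.writerow([i + 1, name, round(random.uniform(0, 10_000), 2)])
    return path

generate_accounts(1000)
```

Because the generator is seeded, every run starts from the same data set, so differences between test runs reflect the application, not the data.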
Timing is everything
Starting performance testing as early as possible is an absolute must! Even if the application is not fully complete, why not try your pilot on some functionality that is? This will give you a real head start in understanding the problems you'll face in performance testing on a particular project.
It's critical to think about when the tests will take place, who is going to support them and how. I will never forget my first performance testing exercise, where I had booked the whole lab for a weekend run. I started running the scripts at four o'clock on a Friday afternoon and headed off for the weekend. I walked in on Monday morning and was dismayed to find that the scripts had failed 30 minutes after I kicked them off, and absolutely no testing had occurred at all over the weekend.
Know the territory
Create a performance test environment that mirrors the production environment in terms of hardware, data volumes and the number of virtual users. If the test environment required is large and costly, stakeholders often balk at the upfront cost. In this case, a scaled-down version of the test environment can be used, from which you can extrapolate data. Another option would be to rent some space for the duration of the testing.
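When extrapolating from a scaled-down environment, keep in mind that performance rarely scales linearly with hardware. The figures below are invented for illustration; the point of the sketch is to treat a linear projection as an optimistic bound and apply a margin for the contention, locking and I/O limits that do not scale:

```python
# Naive linear extrapolation from a scaled-down test environment.
# Assumption: throughput scales with hardware capacity -- in practice it
# rarely does, so treat the projection as an optimistic upper bound.
scale_factor = 4       # production has roughly 4x the test hardware (example)
measured_tps = 120     # transactions/sec sustained in the test environment (example)
projected_tps = measured_tps * scale_factor

# Apply a safety margin to acknowledge the non-linear costs: lock
# contention, database I/O, network saturation, cache effects.
safety_margin = 0.7
conservative_tps = projected_tps * safety_margin
print(f"optimistic: {projected_tps} tps, conservative: {conservative_tps:.0f} tps")
```

Presenting both numbers to stakeholders, with the assumption stated, is more honest than presenting the linear projection alone.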
If your company uses virtualization -- running applications on virtual machines in the data center or development lab, for instance -- or outsources any testing or development, then it may be easier to provision the type of production system you need. A good resource on outsourcing testing is Charlie Weblien's blog post on testing from the cloud.
Test debriefing and a reality check
By the completion of performance testing, you will probably have large amounts of test data requiring interpretation and presentation. The key to success here is to pitch the final report to its intended audience.
Michael Feerick gives this good advice on how to talk to non-IT executives: "A technical report will not benefit a non-technical audience, so it's important to communicate your findings at the appropriate technical level to match your audience and indicate that the full test data results are available if required/requested."
Finally, be realistic about what you can accomplish with a limited budget and staff. It takes plenty of practise and failure to become proficient in performance testing, and it's a real art to be able to correctly interpret the resulting data. If you're doing performance testing with limited resources, you and your stakeholders have to scale back expectations.
About the author: Anne-Marie Charrett is a professional software tester and runs her own company, Testing Times. An electronic engineer by trade, software testing chose her when in 1990 she started conformance testing against European standards. She was hooked and has been testing since then. She enjoys working with innovative and creative people, which has led her to specialise working for start-ups and incubators. She's a keen blogger and hosts her own blog, called Maverick Tester.