We purchased LoadRunner last year, and I hired two contractors to get it set up. I did not hire the right people for the job, and while we got some results, they were not communicated well. I then took one of the QC group's contractors, who had previous LoadRunner experience, and put him in this role.
Our client has his own performance-testing group and was willing to share, so I got them in house. He insisted he needed to create his own scripts because it would be too difficult to figure out their scripts. I suspected this was because he had not done performance testing in a couple of years and was uncomfortable with, or did not know how to use, anyone else's scripts.
I have now hired a new contractor who has been using LoadRunner for 10 years and has extensive experience in this area. I expected that we could continue with what my current tester had created and build from there, but one of the first things he said to me was that he would need to create his own scripts! Is this normal for performance testers?
Since I don't know a lot about your application, I'll speak in general terms about performance testing. In my experience, performance testers often prefer to rewrite scripts. If I'm joining a project as a performance tester, it is frequently easier for me to recreate the scripts myself than to try to figure out what's already there. That's not always true, but some fairly strong influences make it the case.
With many tools, including LoadRunner, a lot of configuration settings come into play when you record and play back a script. If someone used incorrect settings when they recorded, the script can be difficult to maintain. Sometimes those mistakes completely invalidate the test: clearing the cache when they shouldn't, managing sessions incorrectly or correlating data incorrectly. These things could be fixed manually, but because performance test scripts are so large (thousands of lines of code in some cases), it is often faster and cheaper to simply re-record the script.
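To make the correlation point concrete: correlating means capturing a dynamic value (such as a session ID) from one server response and substituting it into later requests, because a raw recording replays the stale value that was captured at record time. Here is a minimal, tool-agnostic sketch in Python; the function name, field name and URLs are illustrative assumptions, not LoadRunner API.

```python
import re

def correlate_session_id(response_body: str) -> str:
    """Extract a dynamic session ID from a server response, the way a
    performance tester 'correlates' a value in a recorded script."""
    match = re.search(r'name="sessionId" value="([^"]+)"', response_body)
    if match is None:
        raise ValueError("sessionId not found; the correlation rule needs updating")
    return match.group(1)

# Simulated first response from the server:
login_page = '<input type="hidden" name="sessionId" value="a1b2c3d4">'
session_id = correlate_session_id(login_page)

# Later requests must send the freshly captured value instead of the
# hard-coded one a raw recording would replay:
next_request = "/checkout?sessionId=" + session_id
print(next_request)  # → /checkout?sessionId=a1b2c3d4
```

A script whose correlation was done incorrectly at record time fails exactly here on replay, which is one reason re-recording is often cheaper than repair.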
In addition, most performance testing tools generate very sloppy code. It's difficult to read and follow, often because you're looking at a "flat" representation of actions that happen in parallel. That only makes it harder for someone to go in and update a script that was recorded by someone else. Many performance testers will even prefer to re-record their own work if the changes needed are large enough.
There are a couple of things you can do to mitigate this, but none of them solve the problem entirely. Recreating performance test scripts is, unfortunately, part of performance testing for most web applications of any complexity. The biggest thing that can help is to find performance testers who write their own tests instead of recording them. It's not always practical, but if you have a test or set of tests that should have a long shelf life, custom coding instead of recording is often by far the better way to create an artifact that can be maintained. However, I will admit most performance testers aren't that great at this; it will likely require someone with significant experience or a strong dislike for automatically generated code.
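What makes a hand-written test more maintainable than a recording is that it is built from small, named building blocks instead of thousands of flat, generated lines. A minimal sketch of that idea in Python follows; the function names are hypothetical, and the timed step is a stand-in rather than a real HTTP call.

```python
import time

def measure_transaction(name, action):
    """Time one named transaction: the basic building block a
    hand-written performance test is composed from."""
    start = time.perf_counter()
    action()  # the business step under test
    elapsed = time.perf_counter() - start
    return {"transaction": name, "seconds": elapsed}

# Stand-in for a real request; in a real test this would hit the application.
def fake_login():
    time.sleep(0.01)

result = measure_transaction("login", fake_login)
print(f"{result['transaction']} took {result['seconds']:.3f}s")
```

When the application changes, a tester updates one small named function instead of hunting through generated code, which is the maintainability advantage described above.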
Another thing that can help is to have your team develop some standards around recording and playback options and for maintaining performance test scripts. In addition, documenting the recorded test scenarios should go along with that. While this also won't completely solve the problem, it can help make scripts more accessible to people who might be new to the project. If they know what decisions were made and why, it makes it easier to read the code and understand the intent of the test. This can sometimes allow them to make small changes without needing to re-record the test script.
Related Q&A from Mike Kelly