
Q&A: Software tester describes daily application performance testing work

A software performance test engineer describes application performance testing processes and best practices. The tester answers performance testing FAQs and discusses API-level and load testing.

What do performance testers actually do? And how do they do it? Instead of going broad, I decided to dive deeper, telling the story of one specific performance test engineer, Cristina Lalley, and how she does her work. In this Q&A, I interview Lalley about the what, how and why of performance testing as she sees it.

Cristina Lalley is a lead performance test engineer with Thomson Reuters in their legal business. She works on Westlaw, a worldwide online legal research service for lawyers and legal professionals. The infrastructure on which Lalley focuses underpins not only Westlaw, but also other Thomson Reuters products, including related research products that her team supports.

Heusser: So you're air-dropped into a new project. The first features are ready for exploratory and load testing. In Agile terms, the first meaningful iteration is done. What now?

Cristina Lalley: Besides the obvious praying, the first thing I would do is schedule some time to interview the stakeholders: developers, designers, functional testers, content managers and so on. The goal is to find out as much as possible about the features, the project and each group's expectations for it. All of these groups can be very good resources for designing tests and putting objectives and thresholds around those tests. The interviews also tend to spawn discussion among the groups when they don't agree on the expectations.

I have developed a list of frequently asked questions and use it to frame the discussion in those meetings. I also find it helpful to do some exploratory testing just to become familiar with the features and, drawing on my experience with similar products or platforms, come up with some ideas about where performance may be an issue.

Heusser: I agree that understanding the features of the product by interview is a great way to start. I'm curious if you have any specific techniques you generalize from project to project; any tips, techniques or strategies we can reuse? Tell me about those frequently asked questions.

Lalley: Well, I get a login to the application and poke around to see what I can find. We also have some internal monitoring tools that are mainly used to diagnose production issues but come in handy for checking whether there are existing issues that just haven't posed a big enough problem for users to complain. As for the FAQs, one of the best is: "What do you intend to accomplish with performance testing?" Goals might include response time, throughput, resource utilization (processor, memory, I/O) and user load.

Another good one is: "How do users access your application?" In other words: What browsers do they use? What speed of Internet connection is used? What percentage of users is using each type? These are usually questions the business hasn't asked, but the questions help start the discussions.

Heusser: Do you have any modeling tricks to help figure out how users will 'bounce' through the app?

Lalley: This is something we are really struggling with. We are using flowcharts to map the possible paths and then working with the business to determine which features are used "more." We then place percentages on each feature. Definitely not an exact science, but it is a start. We use something like Scott Barber's User Community Modeling Language.
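To make the idea concrete, here is a minimal Java sketch of a weighted user mix of the kind Lalley describes; the feature names and percentages below are hypothetical, not Westlaw's real numbers. Each virtual user rolls against the agreed-upon weights to choose its next action.

    import java.util.LinkedHashMap;
    import java.util.Map;
    import java.util.Random;

    // Minimal sketch of a weighted user-path model. Feature names and
    // percentages are hypothetical, not the real Westlaw usage mix.
    public class UserMix {
        private static final Map<String, Integer> WEIGHTS = new LinkedHashMap<>();
        static {
            WEIGHTS.put("search", 50);          // 50% of user actions
            WEIGHTS.put("viewDocument", 30);    // 30%
            WEIGHTS.put("printOrDownload", 15); // 15%
            WEIGHTS.put("manageFolders", 5);    // 5%
        }

        private final Random random = new Random();

        // Pick the next action for a virtual user according to the weights.
        public String nextAction() {
            int roll = random.nextInt(100);     // weights above sum to 100
            int cumulative = 0;
            for (Map.Entry<String, Integer> entry : WEIGHTS.entrySet()) {
                cumulative += entry.getValue();
                if (roll < cumulative) {
                    return entry.getKey();
                }
            }
            return "search";                    // unreachable while weights sum to 100
        }
    }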

Heusser: Do you mean you look at the current 'hot spots' and test them out on the next release candidate build?

Lalley: For something existing, yes; these same homegrown tools also exist at the lower levels for development troubleshooting. The testers can access them if they know where to look. Also, each user interface is customized to the business unit, which means performance issues from how documents are rendered and how JavaScript and CSS are used are bound to exist. I use YSlow and Fiddler to find them once I've determined there is a problem.

Heusser: So you're concerned about browser-rendering as well as server-side performance. If you had to split those up, what would the ratio be? 50/50? 30/70?

Lalley: 50/50 is about right for new development; for existing apps it's more like 30% browser rendering and 70% server.

Heusser: Okay, let's try to fast-forward that a bit. Can you quickly tell me about the app you are testing now -- what it does, how users interact with it, what happens if it breaks?

Lalley: I am currently working on performance testing of an infrastructure that services more than 20 applications worldwide, some of which have tens of thousands of concurrent users. This infrastructure provides the content storage and drives the search engine and various other content-related features for the applications. When it breaks, users are unable to perform basic functions, many of which are legal research tasks where time is critical. During an outage, users contact their respective call centers in many countries to report issues. These trouble tickets are then routed to our location, where the data centers are hosted, and SWAT teams are deployed. Yes, really: SWAT teams with pagers and teams of developers on standby.

Heusser: Okay, now how do you performance test it?

Lalley: We have written a test harness to exercise the code at the Java API level. We then use a commercial performance test tool to generate load and to execute scenarios.

Heusser: So you have a developer write a tool to simulate traffic, and you 'code' to that tool in a programming language?

Lalley: We use the commercial tool to simulate traffic, manage the load on the system and collect data around the test. We write our scripts in that tool's proprietary language, and those scripts call methods in our test harness. The harness takes care of middle-tier actions and any other tasks that our test tool doesn't support.
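For a sense of what an API-level harness method might look like, here is a rough Java sketch; the ContentService interface and the method names are stand-ins for illustration, not the actual Westlaw API. The load tool's script would call a method like timedSearch inside a transaction, so the tool handles pacing, user counts and reporting while the harness handles the Java plumbing.

    import java.util.List;

    // Hypothetical sketch of an API-level test harness method. ContentService is a
    // stand-in for whatever middle-tier Java API the real system exposes.
    public class SearchHarness {

        // Stand-in for the middle-tier API under test.
        public interface ContentService {
            List<String> search(String query, int maxResults) throws Exception;
        }

        private final ContentService contentService;

        public SearchHarness(ContentService contentService) {
            this.contentService = contentService;
        }

        // Called from the load tool's script; returns elapsed milliseconds so the
        // tool can record the call as a custom timer and fail the iteration on error.
        public long timedSearch(String query, int maxResults) throws Exception {
            long start = System.nanoTime();
            List<String> documents = contentService.search(query, maxResults);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            if (documents == null || documents.isEmpty()) {
                throw new IllegalStateException("Empty result for query: " + query);
            }
            return elapsedMs;
        }
    }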

Heusser: So the concern I would have is that the box making the calls gets overloaded. Do you scale the performance testing out over multiple machines?

Lalley: We do distribute the load across multiple agent machines when a test calls for it. Each tester carefully monitors CPU, memory and responsiveness to make sure each agent stays within thresholds. We have team guidelines for what's acceptable and the option to add a few more machines if necessary.
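As an illustration of that kind of agent-side check, here is a small Java sketch using the standard java.lang.management API; the threshold values are examples, not the team's actual guidelines.

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;

    // Hypothetical self-check for a load-generating agent. Threshold values
    // are illustrative, not the team's real guidelines.
    public class AgentWatchdog {
        private static final double MAX_LOAD_PER_CORE = 0.8;    // example CPU threshold
        private static final double MAX_HEAP_USED_RATIO = 0.85; // example memory threshold

        public static boolean agentIsHealthy() {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            double loadPerCore = os.getSystemLoadAverage() / os.getAvailableProcessors();

            Runtime rt = Runtime.getRuntime();
            double heapUsedRatio = (double) (rt.totalMemory() - rt.freeMemory()) / rt.maxMemory();

            boolean cpuOk = loadPerCore < 0 || loadPerCore <= MAX_LOAD_PER_CORE; // -1 means unavailable
            boolean memOk = heapUsedRatio <= MAX_HEAP_USED_RATIO;
            return cpuOk && memOk; // if false, results from this agent are suspect
        }
    }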

Heusser: How long would you say it took to develop the foundational infrastructure, to make it easy to write scripts against?

Lalley: We have been working for about a year on this new API-level testing foundation. We were a new team, so it took about six months to gather the general "how we do performance testing" material on our SharePoint and wiki sites. It was another three to four months to train the team in Java, come up with a strategy and get our first working tests. We have been incrementally adding tests to the suite for the last two months. The work we are doing now is more around building up the environment to mimic the infrastructure as closely as possible. We expect that to take some time; our estimate is a year. For a team with established practices and dedicated resources, I would expect this could be reduced greatly.

Heusser: A year? Wow. I suspect that puts a little reality check on what professional performance testing actually costs. Now let's put that new performance tester hat on again. Say you're talking to someone who has been asked to do performance testing for the first time, without a strong cloud or virtualization programming background. What advice would you give them?

Lalley: Good thing that hat is never far away! I personally learn well from reading so I purchased numerous books and read many blog posts. This gave me the basic knowledge to be able to ask good questions of the more technical people in the business. I found great people that were willing to explain the inner workings of our environments to me and answer my questions. I joined many forums and discussion groups and asked questions. There also may be established performance testers within your company that you can ask to mentor you.

Heusser: What's the strangest thing that's happened to you on a performance test project?

Lalley: I sat in a vice president's office watching him hand-time transactions with a stop watch. As he did this, I answered his questions to help him understand my test results, how they were generated and why the stakeholders were so unhappy with the performance of the product. While this was uncomfortable at the time, I never had to do this again with that particular VP. He saw that our results were accurate and factual. He also saw how easy it is for them to be misrepresented or misinterpreted when used out of context. I was impressed by the time he took to learn so that he could back up our work and its value.

Heusser: What's the most critical thing you've learned about performance testing that the textbooks seem to leave out?

Lalley: I would have to say the difficulty level as compared to functional testing. A performance tester needs to be aware not only of the application under test but also of the underlying systems and infrastructure. I have spent a lot of time building relationships at all levels: functional testing, application development, infrastructure development, database engineering and even content design, in order to break down an issue as quickly as possible and assist in finding the root cause.

This was last published in February 2010
