App performance monitoring closes gap between testing and production

QA consultant Gerie Owen discusses the role of application performance monitoring in meeting user expectations once an app is in production.

Proving that an application meets the bare minimum of its requirements is no longer good enough. Today's software testers also have to make sure that the application in production meets users' increasing performance needs and expectations.


The question is how. How can testers, who have been charged with pre-production quality and verification responsibilities, predict how an application will perform in production? The answer is application performance monitoring. This requires analytics: the ability to collect data on an application under test and use that data to understand the circumstances under which the application may fail.

This approach to testing is complex. It means that testers not only have to verify compliance with requirements, but also have to predict how the application will work in the production environment, and where it may fail in real use and under load.

Ultimately, the goal is to be able to understand an application's weaknesses and to be able to monitor that application in production to determine when those weaknesses might cause problems.

To deliver on this goal, testers today have to collect performance and reliability data below the GUI level: component-level metrics as well as system-level response curves under user load.

System-level load characteristics tend to be easier to measure during testing. Testers can determine how many users it takes to break the application and how the response time varies based on the number of simultaneous users.
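For instance, a tester might script a simple ramp-up that replays the same request at increasing concurrency levels and records the response-time curve. The following is a minimal sketch in Python; the target URL and user counts are hypothetical placeholders, not a recommendation of any particular tool.

```python
# Minimal load-test sketch: measure how response time varies with the
# number of simultaneous users. TARGET and the user counts are placeholders.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET = "https://app.example.com/health"  # hypothetical endpoint

def timed_request(_):
    """Issue one request and return its latency in seconds, or None on failure."""
    start = time.perf_counter()
    try:
        urlopen(TARGET, timeout=10).read()
        return time.perf_counter() - start
    except Exception:
        return None

for users in (1, 5, 10, 25, 50):
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(timed_request, range(users)))
    ok = [r for r in results if r is not None]
    failed = len(results) - len(ok)
    if ok:
        print(f"{users:3d} users: median {statistics.median(ok):.3f}s, {failed} failed")
    else:
        print(f"{users:3d} users: all {failed} requests failed")
```

A curve like this makes the breaking point visible: response times typically degrade gradually and then sharply once some component saturates.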

Testers can also dig deeper, using performance monitoring counters or other tools to look at how individual application components behave on a variety of measures, including memory use, CPU use and database access. Used together, these can provide information on which parts of the application are getting stressed as the user load increases.
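One lightweight way to gather that component-level view is to sample per-process counters while the load test runs. The sketch below uses the third-party psutil package; the watched process names are hypothetical stand-ins for the application's components.

```python
# Sketch: sample per-process CPU and memory during a load test to see
# which components strain as user load climbs. Requires the third-party
# psutil package; the process names below are illustrative.
import time
import psutil

WATCH = {"app-server", "worker", "postgres"}  # hypothetical component processes

def sample(duration_s=60, interval_s=5):
    """Print CPU and resident memory for watched processes every interval."""
    elapsed = 0
    while elapsed < duration_s:
        for proc in psutil.process_iter(["name", "cpu_percent", "memory_info"]):
            if proc.info["name"] in WATCH:
                rss_mb = proc.info["memory_info"].rss / 1e6
                print(f"{proc.info['name']:12s} cpu={proc.info['cpu_percent']:5.1f}% "
                      f"rss={rss_mb:7.1f} MB")
        time.sleep(interval_s)
        elapsed += interval_s

sample()
```

Correlating these samples with the user-load timeline shows which component's memory or CPU consumption climbs first as the application approaches its breaking point.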

As for correlating testing results with performance and behavior in production, an increasing number of DevOps teams are employing monitoring from the cloud, using services such as Compuware Gomez, SOASTA mPulse or SmartBear AlertSite.

These services typically employ Real User Monitoring (RUM) to get accurate analytics on response times, HTTP and database errors, and other characteristics that can then be compared to synthetic testing results. RUM provides a valuable reality check on test results as well as predictive power on when an application in production may be in trouble. The same characteristics that indicate an application is straining during load testing are important to watch out for during actual use.
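In practice, that comparison can be as simple as lining up percentile response times from a RUM export against the synthetic load-test numbers. The sketch below is illustrative only: the sample values and the alert threshold are made up, and the services named above each expose their own export formats.

```python
# Sketch: compare synthetic load-test response times against RUM data
# exported from a monitoring service. Sample values and the alert
# threshold are illustrative, not any particular vendor's API.
import statistics

synthetic_ms = [210, 230, 250, 240, 400, 260]   # from the test environment
rum_ms = [300, 320, 290, 900, 310, 1200, 330]   # real-user measurements

def p95(samples):
    """Return an approximate 95th-percentile value by rank."""
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

drift = p95(rum_ms) / p95(synthetic_ms)
print(f"synthetic p95: {p95(synthetic_ms)} ms, RUM p95: {p95(rum_ms)} ms")
if drift > 1.5:  # illustrative alert threshold
    print(f"Production is {drift:.1f}x slower than test; the app may be straining.")
```

When the real-user percentiles drift well above the synthetic baseline, the same stress signatures observed during load testing become an early warning that production is in trouble.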

With or without collecting analytics, software testers must do some level of load testing, if only to determine whether the application meets its requirements. Collecting analytics allows testers to go beyond that and learn important things about the application that will facilitate decision-making in production.

Testers may say that collecting data to make production decisions is outside of their responsibility and expertise. However, when they analyze the load and performance results in test, testers will quickly pick up on production issues that arise. Plus, finding and addressing performance problems is key to retaining customers, which is a software tester's core business value.

Moreover, as more and more organizations adopt Agile methodologies and challenge testers to transform their roles and responsibilities, app performance monitoring in production provides new opportunities. For all testers, it is critical to understand how and where they can add value to the application development and deployment process. By stepping up and showing that they can add this kind of value and develop new expertise if it doesn't already exist, testers can go where they have not gone before and solidify their leadership role in the Agile world.

This was last published in February 2014


Join the conversation

8 comments


Are you already monitoring app performance in production?
I would like to know about monitoring perf of apps in production.
We attempt to do this - we don't have a specific tool, but we rely on the domain knowledge of our team to predict what kind of issues an application might run into in production. We collect performance data, and make our best guess as to what kind of factors will affect the application in production.
Our apps are mostly tested in the field on the job. We run through lots of useless programs very quickly to find the ones that can do the jobs we need. Perform or die. Fortunately, we have a core of apps that have been tested and proven.
Our load/performance team works closely with the development teams to hammer out any issues before we reach production, which provides Ops a good baseline performance measure to compare against the production performance data that they gather.
This is a tough one for me and my team. We try to predict what kind of performance issues an application might encounter in production, but it's at best a good guess. Sometimes, there turn out to be issues in production that didn't exist in the QA environment, and we just didn't think of them.

I haven't heard of Gomez or the other tools mentioned, but I'll have to check them out and see if they could be helpful for our testing purposes.
We’ve been using Gomez for a while now with great success. I highly recommend that you look into it. We were interested in SOASTA mPulse, but were unable to swing the financials to make it happen.
Great article. Your readers might find it helpful to look at vendor-neutral reviews of all the top APM tools on IT Central Station. Many users mirror what you have said: that the priority is being able to pinpoint the weaknesses, and when they might cause problems, before they do.

Currently the top-rated solution in this category is CA's APM: https://goo.gl/W8ASfe. This Sr Software Systems Engineer writes, "The most valuable feature of APM is gaining insight into the application performance so we can proactively take actions before the customer calls to report an issue." To see the rest of his review, plus others, click here: https://goo.gl/ue106b.

Hope this is helpful.