
Identify performance problems in Java applications

Software testers trying to identify performance problems in Java apps might be using the wrong requirements and tools. Two experts suggest ways to accurately pinpoint problems.

Software testers trying to identify hidden performance problems in Java applications might be using the wrong requirements and the wrong tools, argued Gil Tene, CEO of Azul Systems, a real-time Java Virtual Machine vendor. Requirements specified in terms of statistical averages, he said, tend to focus engineers on the wrong problem. That problem is compounded by weaknesses in most performance measurement tools when it comes to assessing the worst-case behavior of applications.

From a requirements perspective, most organizations specify a statistical level of performance, such as meeting a response-time target 99.99% of the time. Engineers then optimize the Java code to meet that specification without taking into account what occurs in the worst cases. As a result, Tene often sees sharp drops in application performance at the tail percentiles, such as the 99th, 99.99th and 99.9999th percentiles.
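
To see how an average can mask a tail, consider a minimal, hypothetical sketch (the class name and sample values below are invented for illustration): the mean of a latency log can look healthy while the worst percentiles are thousands of times slower.

import java.util.Arrays;

public class TailVsAverage {
    public static void main(String[] args) {
        // Hypothetical latencies in ms: almost all fast, a few long stalls.
        long[] latencies = new long[10_000];
        Arrays.fill(latencies, 2);          // 2 ms typical response
        latencies[9_998] = 8_000;           // one 8 s stall
        latencies[9_999] = 26_000;          // one 26 s stall

        Arrays.sort(latencies);
        double avg = Arrays.stream(latencies).average().orElse(0);
        long p99   = latencies[(int) Math.ceil(0.99   * latencies.length) - 1];
        long p9999 = latencies[(int) Math.ceil(0.9999 * latencies.length) - 1];

        // The average stays in single-digit milliseconds even though the
        // tail is thousands of times slower; an SLA stated as an average
        // would never surface the 26 s stall.
        System.out.printf("avg=%.1f ms, p99=%d ms, p99.99=%d ms%n", avg, p99, p9999);
    }
}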

Tene noted that years of assessing worst-case behavior have shown that tail performance can be hundreds or thousands of times worse than what is acceptable. These problems are compounded when engineers, in the rush to improve average performance, fail to account for processes like garbage collection, database reindexing and virtual machine overhead, each of which can stall an application. Often, in the worst cases, the result is sharply degraded performance.


Another problem is that common load testing tools, such as Apache JMeter, do a poor job of characterizing worst-case performance. Tene has found that when an application pauses, these tools stop issuing requests and therefore fail to record the magnitude of the degradation during the stall. This leads to a mismatch between test log results and real-world performance: in one of his examples, a real-world stall of 26 seconds was reported in a way that understated the performance impact by a factor of 1,000.
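
The mechanism behind the mismatch is easy to reproduce. Below is a hypothetical sketch (not JMeter's actual code) of the naive load loop such tools implement: one request per interval, each response timed individually. When a request stalls, the requests that should have gone out during the stall are never issued, so their long latencies never enter the log.

import java.util.ArrayList;
import java.util.List;

public class NaiveLoadLoop {
    public static void main(String[] args) throws InterruptedException {
        final long intervalMs = 100;        // intended pace: one request every 100 ms
        List<Long> latenciesMs = new ArrayList<>();

        for (int i = 0; i < 50; i++) {
            long start = System.nanoTime();
            issueRequest(i);                // blocks for the full stall, if any
            latenciesMs.add((System.nanoTime() - start) / 1_000_000);
            // The loop only resumes sending once the stall ends, so the
            // requests that would have been sent during a 26 s pause are
            // never issued: the log holds one bad sample instead of ~260.
            Thread.sleep(intervalMs);
        }

        double avg = latenciesMs.stream().mapToLong(Long::longValue).average().orElse(0);
        System.out.printf("recorded %d samples, average %.1f ms%n", latenciesMs.size(), avg);
    }

    // Stand-in for a real request; one call simulates a long stall
    // (2 s here so the example runs quickly).
    private static void issueRequest(int i) throws InterruptedException {
        Thread.sleep(i == 25 ? 2_000 : 5);
    }
}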

To address such problems, Tene advocates that organizations specify performance service-level agreements in terms of worst-case scenarios. He also advises testers and developers to pay closer attention to what happens in the worst cases by stretching out the display of performance testing data with tools like the open source HdrHistogram, which he helped develop.
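
HdrHistogram ships as a plain Java library (org.hdrhistogram:HdrHistogram on Maven Central). A minimal usage sketch follows; the request-timing stub and the one-request-per-millisecond pacing are assumptions for illustration.

import java.util.concurrent.TimeUnit;
import org.HdrHistogram.Histogram;

public class LatencyReport {
    public static void main(String[] args) {
        // Track values from 1 ns up to 1 minute, with 3 significant digits.
        Histogram histogram = new Histogram(TimeUnit.MINUTES.toNanos(1), 3);

        for (int i = 0; i < 100_000; i++) {
            long latencyNanos = timeOneRequest();
            // recordValueWithExpectedInterval back-fills the samples a
            // stalled load generator would have missed (addressing the
            // JMeter blind spot described above), here assuming one
            // request per millisecond was intended.
            histogram.recordValueWithExpectedInterval(
                    latencyNanos, TimeUnit.MILLISECONDS.toNanos(1));
        }

        System.out.printf("p99.99 = %d us%n",
                histogram.getValueAtPercentile(99.99) / 1_000);
        // Prints the whole distribution, scaled to microseconds --
        // the "stretched out" view of the tail Tene recommends.
        histogram.outputPercentileDistribution(System.out, 1_000.0);
    }

    // Hypothetical stand-in for timing a real request: usually 0.2 ms,
    // occasionally 50 ms.
    private static long timeOneRequest() {
        return Math.random() < 0.999 ? 200_000L : 50_000_000L;
    }
}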

Recalibrating test and real-world environments

At least some of the performance testing challenges identified could be addressed by using virtual and cloud-based testing labs, argued Theresa Lanowitz, senior analyst at voke media, an analyst firm. Lanowitz recently released a report on the productivity and return on investment (ROI) benefits of virtual and cloud-based software testing labs. These labs make it easier for enterprises to set up testing that simulates real-world application performance.

For example, a load testing tool implemented on a single server has difficulty characterizing real-world performance from loads generated by masses of Internet users. Virtual and cloud-based labs make it easier for testers to simulate loads from a distributed cluster of servers.
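
The single-host limit shows up even in a hand-rolled generator. The sketch below (hypothetical code) drives a fixed pool of virtual users from one JVM; past a certain point, the generator machine's own CPU, memory and network become the bottleneck, which is the gap distributed virtual and cloud-based labs fill by running many such generators on separate machines.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SingleHostLoadGenerator {
    public static void main(String[] args) throws InterruptedException {
        // One generator host can sustain only so many concurrent virtual
        // users before the measurements reflect the tester's hardware
        // rather than the application under test.
        int virtualUsers = 500;
        ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);

        for (int u = 0; u < virtualUsers; u++) {
            pool.submit(() -> {
                for (int i = 0; i < 1_000; i++) {
                    sendRequest(); // hypothetical stand-in for an HTTP call
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);
    }

    private static void sendRequest() { /* issue one request and record timing */ }
}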

Organizations that use virtual and cloud-based labs see significant gains in application delivery speed and software quality, Lanowitz said. She recommends that organizations consider adopting cloud-based testing tools.

Given the availability of virtual and cloud-based labs, Lanowitz said it is surprising that people are still talking about hardware issues. A developer might, for example, write code on a dual-core machine even though the application is destined for very different production hardware, a mismatch that encourages single-threaded rather than multi-threaded designs. A virtual lab can simulate the target hardware, in essence virtualizing the environment the application will be compiled and run on, as required. She noted, "This eliminates the friction of back and forth communication between developers and testers."

Next Steps

Use app development time efficiently with Java inheritance

Check out these performance monitoring tools for Java apps

Java application development in the cloud

This was last published in November 2014



Join the conversation


Do you agree that tools hinder performance of Java applications?
If anything, using third-party apps and tools with Java complicates the maintenance of a language that should be easy and user-friendly.
This is an interesting discussion, and he may have a point about trying to meet the 5 9s metric or something like that. What I'm not clear on is what he expects us to do about it. He gives an example of something JMeter doesn't do, and apparently expects the tool to be intelligent enough to tell him something that might be inferred by a human using the application. (It might not be.) I find I am more confused about this topic now than before I read it :/
