On October 4th, the cloud-based performance testing company SOASTA announced its integration with Selenium, the popular open source testing tool. The integration allows functional and performance test results to be combined for enhanced analytics, and lets those results be fed back into continuous integration servers such as Hudson and Jenkins.
But perhaps even more interesting for users are the extensions SOASTA has added to Selenium’s functionality, including a visual test creation environment that allows testers to create tests without traditional coding or scripting.
I spoke with Tal Broder, VP of Engineering at SOASTA, who said:
We have significantly enhanced the capability of Selenium in terms of recording the detection of what element was actually interacted with on the page. We added a visual test environment, which we already had for load testing, and we also enriched analytics. We believe with our offering, even though we are using all the power of Selenium for driving browsers, we have a much faster and easier test creation without having to write a single line of code.
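For contrast, here is the kind of hand-coded functional check that Selenium users typically script and that CI servers such as Jenkins or Hudson then run and report on. This is a minimal illustrative sketch using Python's standard `unittest` module; the `fetch_page_title` helper is a hypothetical stand-in for a live browser session, which in a real Selenium test would be a WebDriver instance driving an actual browser.

```python
import unittest

# Hypothetical stand-in for a live browser session. In a real Selenium
# test this would be a WebDriver instance (e.g. Firefox or Chrome), and
# the check would inspect the driver's reported page title.
def fetch_page_title(url):
    # Stubbed so the sketch runs anywhere; a real test drives a browser.
    return "Example Domain" if url == "http://example.com" else ""

class HomePageTest(unittest.TestCase):
    """The kind of hand-coded functional check that SOASTA's visual
    environment aims to let testers build without scripting."""

    def test_title_is_correct(self):
        self.assertEqual(fetch_page_title("http://example.com"),
                         "Example Domain")

# Run the suite programmatically, the way a CI job would invoke it and
# then collect the pass/fail results for its build report.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(HomePageTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Writing and maintaining scripts like this for every page interaction is the effort SOASTA's visual, codeless test creation is meant to remove.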
This announcement comes on the heels of two other recent announcements in the ALM test tool market, from Replay Solutions and Coverity. Both of those announcements likewise centered on improved automated testing, enhanced analytics, ALM tool integration and feeding results back into a continuous integration tool. I asked Broder what was driving this trend. He answered:
I think this is all driven by Agile and this whole DevOps movement where people want to build very, very often and release small chunks of code into the user community in a very fast and efficient way. They need the tools that will allow them to find bugs in an automated way and test the performance before you push it, or even after you push it, into production so that you can protect your users from functional problems, from performance problems, and I think that’s why we’re seeing a lot of automation in the industry. I think that trend will continue.
I asked Gartner analyst Tom Murphy to compare and contrast the three announcements. He explained that each tool catches bugs in a different way, but that they are highly complementary:
There are a number of different places to find defects or ways to find them. Coverity is focused on the analysis of source code to find defects, Replay is focused on identifying defects that occur while the application is running by capturing what is happening in the environment, and SOASTA is adding functional testing to their story line, which is a way to automate tests from a user perspective. So rather than looking at the source like Coverity, I use SOASTA (or say HP QTP, or others) to drive the application and monitor for deviations from the expected behavior. I use Replay Solutions while I run tests to provide a detailed recording to the developer so that when a defect is found they can identify what is going wrong faster (i.e., I don't have to reproduce the defect), and I can also use this in production to capture crash info, etc. Now with their most recent product, ReplayLightning, there is more of a connection between what is happening at execution time, which is kind of a dynamic source analysis rather than the static analysis that Coverity and others provide. This is important because in dynamic languages there are things you don't know until run time.
In short, the tools are all complementary to each other.