
FAQ: Automated software testing basics

Most QA pros believe automated software testing has value. But questions about which tests to automate, which ones should remain manual and what pitfalls to watch out for often stand in the way of getting started.

Software testing -- assuring the quality of applications before they are released into production -- revolves around three factors: cost, quality and time, according to John Scarpino, a career QA director and adjunct professor at Robert Morris University in Pittsburgh. "Cost, time and quality are not independent. You cannot be successful with one unless you are successful with the other two."

The goal, of course, is always to decrease the cost and time software testing requires, while maintaining application quality. Automated software testing can help QA pros move closer to that goal -- provided they understand how to proceed and what to expect from these challenging projects.

This FAQ tells QA professionals what they need to know to get started with automated software testing.

What is automated software testing?

Automated software testing is an alternative to manual testing, where software tools, not human testers, execute pre-scripted tests on a software application before it is released into production.

Automation tools enable testing organizations to run tests quickly and repeatedly. The tools manage test execution, report outcomes and compare results with earlier test runs.
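
To make that concrete, here is a minimal sketch of what a pre-scripted automated test can look like, written as a plain Python script. The shipping_cost_cents() function and its expected values are hypothetical, invented for illustration; the point is that each case pairs an input with an expected result, and the script executes the checks and reports the outcomes so the same tests can be rerun on every build.

```python
# Minimal sketch of pre-scripted automated tests: inputs paired with
# expected results, executed and reported by a script rather than a person.
# shipping_cost_cents() is a hypothetical example function.

def shipping_cost_cents(order_total_cents):
    """Free shipping for orders of $50.00 or more, otherwise a flat 499 cents."""
    return 0 if order_total_cents >= 5000 else 499


# Pre-scripted cases: (input, expected output)
CASES = [
    (4999, 499),   # just under the threshold pays the flat rate
    (5000, 0),     # exactly at the threshold ships free
    (7500, 0),     # well over the threshold ships free
]

if __name__ == "__main__":
    failures = 0
    for order_total, expected in CASES:
        actual = shipping_cost_cents(order_total)
        status = "PASS" if actual == expected else "FAIL"
        if status == "FAIL":
            failures += 1
        print(f"{status}  shipping_cost_cents({order_total}) -> {actual} (expected {expected})")
    print(f"{len(CASES) - failures} passed, {failures} failed")
```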

What drives test automation projects?

Most organizations that undertake automated testing projects do so because their current approach has fallen short in some way. According to test architect David W. Johnson, decisions to automate software testing are typically a response to failure to complete testing in the assigned timeframe, or cases where quality goals have not been met and too many defects have made their way into production. In "Test automation: When, how and how much," Johnson notes that when these goals are not met, the test organization is not necessarily the true cause of the bottleneck. When developers fall behind on software projects, they use up valuable time initially allocated for testing.

Whatever the cause, automated software testing projects get underway when the test organization has essentially reached its capacity. At this point, management has two alternatives: grow the manual testing team or begin evaluating software test automation tools.

Which tests should be automated first?

Agile test expert Lisa Crispin recommends starting with unit tests, because they produce the highest return on investment. Unit tests -- where small units of application code are tested to make sure they work properly -- are the least expensive to write and maintain, and they provide value to the team multiple times per day. Using them to drive coding in test-driven development (TDD) helps programmers think through and verify their code design, Crispin said. In "Agile methodology techniques: Unit test automation and test-driven development," test expert Yvette Francino provides a good example of how a unit test is automated.
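
As a rough illustration of the idea, the sketch below shows a unit test written with Python's built-in unittest framework. The slugify() helper is a hypothetical example, not taken from Francino's article; in a TDD workflow the tests would be written first and the implementation added to make them pass.

```python
# A TDD-flavored sketch: small, fast unit tests that exercise one function.
# slugify() is a hypothetical helper used only for illustration.

import unittest


def slugify(title):
    """Turn an article title into a lowercase, hyphen-separated URL slug."""
    return "-".join(title.lower().split())


class SlugifyTests(unittest.TestCase):
    def test_spaces_become_hyphens(self):
        self.assertEqual(slugify("Automated Software Testing"),
                         "automated-software-testing")

    def test_result_is_lowercase(self):
        self.assertEqual(slugify("FAQ Basics"), "faq-basics")


if __name__ == "__main__":
    # The runner executes every test and reports results, so the suite can
    # be run many times a day at essentially no cost.
    unittest.main()
```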

Mike Cohn's test automation pyramid.

Francino and Crispin, among many other test experts, believe Mike Cohn's test automation pyramid provides a good guideline for determining which tests to automate first. Unit tests form the solid base and largest layer of the pyramid, and it's widely agreed that they are the place to start.

Also well suited to automation is performance testing, where testers must simulate hundreds or even thousands of concurrent users, said SearchSoftwareQuality.com expert John Overbaugh. In this situation, executing the test manually is very difficult.
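
The sketch below hints at why: simulating even a modest number of concurrent users is trivial for a script and effectively impossible to do by hand. It uses a plain Python thread pool rather than a dedicated load-testing tool, and the target URL and user count are placeholders, not values from the article.

```python
# A rough sketch of simulating concurrent users against an application
# under test. TARGET_URL and SIMULATED_USERS are hypothetical placeholders.

import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "http://localhost:8000/"   # hypothetical application under test
SIMULATED_USERS = 200


def one_user_visit(_):
    """One simulated user: request the page and time the response."""
    start = time.time()
    with urlopen(TARGET_URL, timeout=10) as response:
        response.read()
    return time.time() - start


if __name__ == "__main__":
    # Fire all simulated users at once and collect their response times.
    with ThreadPoolExecutor(max_workers=SIMULATED_USERS) as pool:
        timings = list(pool.map(one_user_visit, range(SIMULATED_USERS)))
    print(f"{len(timings)} requests, slowest {max(timings):.2f}s, "
          f"average {sum(timings) / len(timings):.2f}s")
```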

What kinds of tests are better done manually?

While some aspects of user interface testing can be automated, no automated test in the world can validate whether the application actually looks good, said Chris McMahon. That calls for manual exploratory testing, where professional testers, not software tools, comb through an application using a context-driven approach.

McMahon recommends automated UI testing for simple things such as checking that users have access to page elements they need. But he cautions that well-designed automated UI tests are shallow and should be conducted hand in hand with manual exploratory testing. UI tests are merely indicators that something in the software may not be correct. It takes human beings to evaluate the extent of any potential problem, he said.
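
A shallow UI check of the kind McMahon describes might look like the following Selenium WebDriver sketch. The URL and element ID are hypothetical; the test only confirms that an element the user needs is present and visible, which is exactly the sort of indicator, rather than verdict, he has in mind.

```python
# A shallow automated UI check: confirm a needed page element exists and is
# visible. Assumes Selenium WebDriver and a Chrome driver are installed; the
# URL and element ID are hypothetical examples.

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("http://localhost:8000/login")       # placeholder app URL
    # The check stops at "the button is there and visible"; whether the page
    # actually looks good is left to a human exploratory tester.
    login_button = driver.find_element(By.ID, "login-button")
    assert login_button.is_displayed(), "Login button is not visible"
    print("PASS: login button is present and visible")
finally:
    driver.quit()
```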

Test expert Vasudeva Naidu believes about 70% of software testing can be successfully automated, while 30% should remain manual. He said it's too early in the technological revolution to replace manual testing completely with automation. In fact, most new features, complex validations and business-intensive functions will continue to be tested manually. The goal of 100% automation is not just ambitious, it is also impractical, he said.

What are some pitfalls to watch out for when you move to automated testing?

The first thing to understand, said Crispin, is that automating tests involves writing code just as software development does. Testers must write scripts that tell the testing tools what to do. "It must be done with the same care and thought that goes into writing production code," she said. Failure to understand this leads to failed automation projects, where a company buys test automation software and simply waits for the magic to happen, she added. Test automation succeeds only when the whole development team treats it as an integral part of software development, Crispin said. "It's not any harder or easier than writing production code. Learning to do it well takes time, effort and lots of small experiments."

Also crucial to understand is that using automated testing tools is a bad idea if you're not yet expert at testing. "I see a lot of teams focus on automation, as a cost-lowering technique, when they really need to focus first on testing the right things," said test expert Mike Kelly. Once a team has gained expertise in managing testing risk and test coverage, and applying the right testing techniques, then talking about automation makes sense, he said.

How do teams ensure continued test automation success?

Automated software testing projects don't end when teams have mastered the process, said Agile test expert Bob Galen. One thing he sees too often is test organizations that continue to run the same automated testing scripts over and over again. The ability to easily repeat tests is of course a key benefit of test automation. But running more tests, faster, does not produce better software. Better software is the result of running the right tests and continually re-evaluating which tests are the right ones, he said.


