
10 important automated testing best practices to implement

QA and test pros give advice on how to craft software test automation strategies that can speed app deployment. They also share their criteria for choosing automated test tools.

Many enterprises invest in automated app testing tools with high expectations, only to end up with half-realized goals and wasted effort. Instead, approach automation as a software development project, with traditional requirements, and prioritize which tests to automate by ROI.
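To make the ROI part concrete, here is a minimal sketch in Python of how a team might rank automation candidates by estimated payback. The Candidate structure and every name and number in it are illustrative assumptions, not figures from any of the practitioners quoted below.

# Hypothetical ROI ranking for automation candidates. All names and numbers
# are illustrative assumptions, not data from the article.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    manual_minutes_per_run: float   # tester time to run the test by hand
    runs_per_year: int              # how often the test is expected to execute
    automation_cost_minutes: float  # estimated effort to script and maintain it

    @property
    def roi_minutes(self) -> float:
        # Minutes saved per year minus the cost of building the automation.
        return self.manual_minutes_per_run * self.runs_per_year - self.automation_cost_minutes

candidates = [
    Candidate("login smoke test", 5, 500, 600),
    Candidate("regression: invoice totals", 30, 200, 2400),
    Candidate("one-off data migration check", 60, 2, 480),
]

# Automate the highest-ROI tests first; a negative score suggests staying manual.
for c in sorted(candidates, key=lambda c: c.roi_minutes, reverse=True):
    print(f"{c.name}: {c.roi_minutes:+.0f} minutes per year")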

Software testers' requests for automation tools don't fall on deaf ears. After all, an automated app testing tool that saves an hour per run offers tangible cost savings.

"Enterprise execs realize that testers can't handle today's continuous delivery workloads with manual testing," said Paul Grizzaffi, principal automation architect for IT consulting firm Magenic Inc. "Automation is the judicious use of technology to help testers do their jobs."

While getting funds for automated software testing is not a huge hurdle today, effective implementation is. Grizzaffi, John Spitta, senior quality assurance analyst at FISMobile, and Tiffany Scott, senior test automation engineer for Cerner, offer these key automated testing best practices and tool selection tips.

10 best practices for automated testing

1. Align your test automation strategy with IT, business and development plans. DevOps provides an environment to collaborate with those parties. "Without DevOps guidelines, we would be building new test environments, guessing at business targets, ops compatibility and so on," Spitta said. "DevOps provides context for testing and shows us which tests to automate for the most value add."

2. Start small with test automation pilot projects for simple scenarios. In a basic pilot, testers can establish best practices for a larger team. "Focus on the workflow, and don't worry about getting too fancy," Scott said.

At Cerner, pilot programs provide a reality check on a tester's learning curve with new tools. The pilot program sets expectations for managers and other nontesters on the team. "Getting used to versioning code and using new tools for committing our automated test repository can be a challenge," Scott said. Nontesters often focus on speed, but testers must ensure code quality as well.
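As a hedged illustration of "focus on the workflow, and don't worry about getting too fancy," a pilot might start with a single plain pytest file like the sketch below. The checkout_total function is a hypothetical stand-in for whatever core behavior the pilot targets, not code from Cerner's program.

import pytest

def checkout_total(prices, tax_rate=0.08):
    # Toy stand-in for the application workflow the pilot exercises.
    return round(sum(prices) * (1 + tax_rate), 2)

def test_checkout_total_happy_path():
    assert checkout_total([10.00, 5.50]) == pytest.approx(16.74)

def test_checkout_total_empty_cart():
    assert checkout_total([]) == 0.0

The point of a pilot this small is the surrounding workflow: writing the check, committing it to version control and running it on every build.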


3. Don't plan to automate everything. "It's a fallacy that all software can or needs to be tested automatically without human intervention today," Grizzaffi said. "Machines can't do all software testing yet. That's a generation away, at least."

4. Examine what automation does well -- and not so well. Computers excel at repetitive tasks that can be described in detail; humans don't perform repetitive tasks flawlessly. "They fat-finger things, they make mistakes," said Grizzaffi.
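Here is a short, hypothetical sketch of the kind of detailed, repetitive check a machine runs without fat-fingering anything: the same assertion applied to many inputs through pytest parametrization. The slugify function is an invented example, not from the article.

import re
import pytest

def slugify(title: str) -> str:
    # Invented function under test: turn a page title into a URL slug.
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

@pytest.mark.parametrize("title, expected", [
    ("Hello World", "hello-world"),
    ("  Spaces  Everywhere ", "spaces-everywhere"),
    ("Already-a-slug", "already-a-slug"),
    ("Symbols & Punctuation!", "symbols-punctuation"),
])
def test_slugify(title, expected):
    # The machine repeats this exact check for every row, with no fat-fingering.
    assert slugify(title) == expected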

5. Keep people involved in the process. User experience tests require both quantitative and qualitative metrics, and the latter are better determined by humans than machines. A person tests basic tasks that users do in an app, such as clicking an icon to go to another webpage. The tester can do this click test in a couple of minutes, so it isn't useful to automate.

An online game producer once told Grizzaffi, "You can't automate fun." The computer will not report if the user enjoyed the app or found it tedious. "Automated tests won't deliver reliable reports on humans' visceral nature of liking to do something." User testing is often where a brain is most valuable.

6. Look for a full-featured tool. An application testing tool should be configurable and componentized to enable some traditional scripted, test-case automation, too, Grizzaffi said. Automated app testing tools can't do everything, but a broad range of capabilities is important.
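As a rough illustration of "configurable and componentized," the hypothetical sketch below wires a small configuration object and a reusable step component into a traditional scripted test case. Config, FakeApp and LoginSteps are invented names, not any vendor's API.

from dataclasses import dataclass

@dataclass
class Config:
    # Invented configuration values; a configurable tool lets these vary per environment.
    base_url: str = "https://staging.example.test"
    default_user: str = "qa_user"

class FakeApp:
    # In-memory stand-in for the system under test.
    def __init__(self):
        self.sessions = set()

    def login(self, user, password):
        if password == "correct-horse":
            self.sessions.add(user)
            return True
        return False

class LoginSteps:
    # Reusable component: any scripted test case can compose these steps.
    def __init__(self, app, config):
        self.app, self.config = app, config

    def login_as_default_user(self):
        return self.app.login(self.config.default_user, "correct-horse")

def test_default_user_can_log_in():
    # Traditional scripted test case built from the components above.
    app = FakeApp()
    assert LoginSteps(app, Config()).login_as_default_user()
    assert "qa_user" in app.sessions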

7. Avoid tools that run on proprietary languages. Support for commodity languages, ideally more than one, brings flexibility to automated app testing tools. Tools that use commodity languages such as Python or JavaScript can drive multiple types of test scripts and can pull in a third-party library, for example, to connect to a proprietary piece of hardware.
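One hedged example of the flexibility a commodity language buys: a single Python test file can mix an ordinary scripted check with a property-based check from a third-party library (hypothesis, which must be installed separately). The encode/decode pair is an illustrative stand-in for application code.

import base64
from hypothesis import given, strategies as st   # third-party library

def encode(payload: bytes) -> str:
    # Illustrative stand-in for application code under test.
    return base64.b64encode(payload).decode("ascii")

def decode(token: str) -> bytes:
    return base64.b64decode(token)

def test_known_value():
    # Traditional scripted check in the same commodity language.
    assert encode(b"hello") == "aGVsbG8="

@given(st.binary())
def test_round_trip(payload):
    # The third-party library generates many inputs for the same check.
    assert decode(encode(payload)) == payload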

8. Focus on the logs. Automated test tools must have an appropriate logging mechanism. Unfortunately, some tools have limited filtering and generation capabilities, so logs come out in jargon that testers and other consumers of test automation results can't understand. The logs are the first place testers go when a test script fails, Grizzaffi said. A limited logging system in an automated app testing tool should be a deal-breaker.
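As a hypothetical sketch of what appropriate logging can look like, the snippet below uses Python's standard logging module to produce timestamped, plain-language messages with a level filter testers can adjust. It is not a depiction of any particular tool's logging system, and the messages are invented.

import logging

def configure_test_logging(verbose: bool = False) -> logging.Logger:
    logger = logging.getLogger("automation")
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter(
        "%(asctime)s %(levelname)-7s %(name)s: %(message)s"))
    logger.addHandler(handler)
    # Filtering: keep routine noise out until a failure needs investigating.
    logger.setLevel(logging.DEBUG if verbose else logging.INFO)
    return logger

log = configure_test_logging()
log.info("Starting checkout regression run against build 42")
log.debug("Request payload omitted at INFO level")   # hidden unless verbose=True
log.error("Checkout total mismatch: expected 16.74, got 16.47")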


9. Choose a tool that complements your style of development. Automated app testing tools should support the programming languages, cloud and development platforms and frameworks familiar to the team, Spitta said. "Don't add 'training the entire DevOps team on new technologies' to your project," he said. "Training testers on new tools is hard enough."

10. Figure out which features are must-haves and the vendor's roadmap for the test tool. Scott's project team requires an image-based tool with image-scaling capabilities and support for multiple-monitor testing, for example. The lack of any of these features is a deal-breaker. Her team also studies the vendor's plans to advance its automated app testing tool. "We want a solution that molds well with our continuous delivery directives and can support a large, fast-growing and fast-moving company," she said.

Use these automated testing best practices to guide your tool selection and to build an overall test automation strategy that fits your environment, business goals and audience.
