Test organizations continue to undergo rapid transformation as demands grow for testing efficiencies. Functional test automation is often seen as a way to increase the overall efficiency of functional and system testing. How can a test organization stage itself for functional test automation before an investment in test automation has even been made? Further, how can you continue to harvest the returns from your test design paradigm once the test automation investment has been made?

In this article we will discuss the factors in selecting a test design paradigm that expedites functional test automation. We will recommend a test design paradigm and illustrate how it could be applied to both commercial and open-source automation solutions. Finally, we will discuss how to leverage the appropriate test design paradigm once automation has been implemented in both an agile (adaptive) and a waterfall (predictive) system development lifecycle (SDLC).
Test design – selection criteria
The test design selection criteria should be grounded in the fundamental goals of any functional automation initiative. Let us assume the selected test automation tool shall enable end users to author, maintain and execute automated test cases in a web-enabled, shareable environment. Furthermore, the test automation tool shall support test case design, automation and execution "best practices" as defined by the test organization. To harvest the maximum return from both test design and test automation, the test design paradigm must support:
- Manual test case design, execution and reporting
- Automated test case design, execution and reporting
- Data-driven manual and automated test cases
- Reuse of test case "steps" or "components"
- Efficient maintenance of manual and automated test cases
Test design – recommended paradigm
One paradigm that has been gaining momentum under several guises in the last few years is keyword-based test design. I have stated in previous articles that "The keyword concept is founded on the premise that the discrete functional business events that make up any application can be described using a short text description (keyword) and its supporting data." Typical keywords for a customer-management application might include:

- Logon User
- Enter Customer Name
- Enter Customer Address
- Validate Customer Name
- Select Customer Record
Test design – keyword application
Keyword test case design begins with an itemized list of the test cases to be constructed, usually as a set of named test cases. The internal structure of each test case is then built from existing (or new) keywords. Once the design is complete, the appropriate test data (inputs and expected results) can be added. Testing the keyword design itself involves executing the test case against the application or applications under test.
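The workflow above can be sketched as a small keyword-driven engine: a vocabulary that maps each keyword to an executable step, and test cases that are nothing more than lists of keyword-plus-data rows. This is a minimal illustrative sketch, not drawn from any particular tool; the keyword names come from this article, while every function, variable and data value is invented for illustration.

```python
# Minimal keyword-driven engine sketch (illustrative only).
# Keyword names come from the article; all step implementations,
# data values and test-case names below are hypothetical.

def logon_user(data):
    # A real harness would drive the application under test here.
    return data["user"] == "admin"

def enter_customer_name(data):
    return bool(data["name"])

def validate_customer_name(data):
    return data["name"] == data["expected"]

# Keyword vocabulary: short text description -> executable step.
KEYWORDS = {
    "Logon User": logon_user,
    "Enter Customer Name": enter_customer_name,
    "Validate Customer Name": validate_customer_name,
}

def run_test_case(name, steps):
    """Execute a list of (keyword, data) rows; report pass/fail per step."""
    results = []
    for keyword, data in steps:
        passed = KEYWORDS[keyword](data)
        results.append((keyword, passed))
        print(f"{name}: {keyword} -> {'PASS' if passed else 'FAIL'}")
    return results

# The test case itself is pure design data: keywords plus their inputs
# and expected results, exactly as described in the text above.
create_customer = [
    ("Logon User", {"user": "admin"}),
    ("Enter Customer Name", {"name": "Acme Corp"}),
    ("Validate Customer Name", {"name": "Acme Corp", "expected": "Acme Corp"}),
]

results = run_test_case("Create Customer", create_customer)
```

Because the test case is only data, the same design can be executed manually (a tester reads the rows) or automatically (the engine reads them), which is what lets keyword designs move to automation with little or no rewrite.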
Well-designed keyword test cases are:

- Consistent – the same keyword is used to describe the business event every time
- Data-driven – the keyword contains the data required to perform the test step
- Self-documenting – the keyword description captures the designer's intent
- Maintainable – with consistency comes maintainability
- Automatable – supports automation with little or no design transformation (rewrite)
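The data-driven property deserves a concrete illustration: one keyword sequence can serve as a template that is executed once per data row, so adding test coverage means adding data, not redesigning steps. The sketch below assumes a hypothetical two-keyword template and invented customer data; only the keyword names come from the article.

```python
# Data-driven sketch: one keyword test-case template, many data rows.
# Keyword names from the article; template layout and data are invented.

TEMPLATE = ["Enter Customer Name", "Validate Customer Name"]

DATA_ROWS = [
    {"name": "Acme Corp", "expected": "Acme Corp"},
    {"name": "Globex", "expected": "Globex"},
]

def execute(keyword, row):
    # Stand-in step implementations, keyed by keyword.
    if keyword == "Enter Customer Name":
        return bool(row["name"])
    if keyword == "Validate Customer Name":
        return row["name"] == row["expected"]
    raise KeyError(keyword)

# Each data row yields a full execution of the same keyword sequence,
# producing one pass/fail verdict per row.
outcomes = [all(execute(kw, row) for kw in TEMPLATE) for row in DATA_ROWS]
print(outcomes)
```

Maintenance benefits follow the same shape: when a business event changes, only the one step behind its keyword is updated, and every test case that reuses the keyword picks up the fix.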
Test design – adaptation based on development/testing paradigm
There are two primary development and testing approaches in use by development organizations today: adaptive (agile) and predictive (waterfall/cascade). Both approaches certainly have their proponents, though adaptive (agile) system development lifecycles are increasingly gaining precedence. How does this affect the test design paradigm? The answer appears to be that it does not affect the paradigm itself, but it does affect the timing: in a predictive lifecycle most keyword design can be completed up front from the specifications, while in an adaptive lifecycle keywords are designed and refined iteration by iteration alongside the features they exercise.
About the author
David W. Johnson ("DJ") is a Senior Test Architect with over 25 years of experience in information technology across several business verticals. He has played key roles in business analysis, software design, software development, testing, disaster recovery and post-implementation support. Over the past 20 years he has developed specific expertise in testing and in leading QA/test team transformations, delivering test architectures, strategies, plans, management, functional automation, performance automation, mentoring programs and organizational assessments.