The previous article in this series, "Improved software testing via a Testing Center of Excellence," covered the basic structure of a Testing Center of Excellence (TCE) and gave a brief overview of the roles and responsibilities that exist within the structure of a TCE. Let's take a closer look now at the specific responsibilities each participant has.
The TCE creates a group of testing specialists and technical testing components that can be used to leverage testing knowledge, technology, methodology and resources across suitable engagements. The testing specialists will educate and supplement the testing resources that exist within each project team. This will allow for maximum project penetration with a minimum number of human resources. This group will maintain the deliverables as they are created and provide them as a framework and samples for future efforts. This process will increase the efficiency of the testing effort as reusable components grow.
Figure 1: TCE Structure
The TCE core team includes the following:
- Test managers
- Test architects
- Test methodologists
- Test automation engineers
- Subject matter experts (industry experts)
- TCE coordinators
This team is strongly partnered with the following:
- Environment specialist
- Quality assurance
These are roles and responsibilities, not resources. Therefore, one resource could fulfill multiple roles within the TCE structure. The environment specialist role is included, but with a proviso -- any environment activities/roles that are for the benefit of the entire project/organization are not part of TCE, such as configuration management.
The roles of quality assurance and testers are not included within the TCE, but they are closely partnered to the TCE core team. Quality assurance is a discipline unto itself that should be consulted by the TCE to improve the TCE model, processes and deliverables. Testers are members of the testing team that will benefit from TCE participation.
Test manager responsibilities
- Champion the test competency center
- Escalate test issues for resolution
- Manage TCE human resources and budget
- Police testing methodology usage
- Assist in establishing and maintaining service-level agreements (SLAs)
- Provide SLA knowledge to project teams
The role of test lead/manager is to effectively lead the testing teams. To fulfill this role, the test lead must understand the discipline of testing and know how to implement an effective testing process while fulfilling the traditional leadership roles of a manager. That means the manager must implement and maintain an effective testing process. This involves creating a test infrastructure that supports robust communication and a cost-effective testing framework.
Test lead/manager responsibilities
- Define and implement the role testing plays within the organizational structure.
- Define the scope of testing within the context of each release/delivery.
- Deploy and manage the appropriate testing framework to meet the testing mandate.
- Implement and evolve appropriate measurements and metrics:
  - To be applied against the product under test
  - To be applied against the testing team
- Plan, deploy and manage the testing effort for any given engagement/release.
- Manage and grow testing assets required for meeting the testing mandate:
  - Team members
  - Testing tools
  - Testing process
- Retain skilled testing personnel.
Managing or leading a testing team is probably one of the most challenging positions in the IT industry. The team is usually understaffed and lacks appropriate tooling and financing. Deadlines don't move, but the testing phase is continually squeezed by product delays. Motivating and retaining key testing personnel under these conditions is critical. How do you accomplish this seemingly impossible task? I can only go by my personal experience as a lead, manager and team member:
- If the timelines are impacted, modify the test plan appropriately in terms of scope.
- Clearly communicate the situation to the testing team and project management.
- Keep clear lines of communication with development and project management.
- Whenever possible, sell, sell, sell the importance and contributions of the testing team.
- Ensure the testing organization has clearly defined roles for each member of the team and a well-defined career path.
- Measure and communicate the testing team's return on investment. For example, if a detected defect had reached the field, what would the cost have been?
- Explain testing expenditures in terms of investment (ROI), not cost.
- Finally, never lose your cool.
Test architect responsibilities
- Be the keeper of the test architectural vision
- Formulate the test architectural goals (short and long term)
- Ensure appropriate software testing tools are selected to meet these goals
- Integrate tools, processes and methodologies into a cohesive whole
- Provide testing frameworks and templates to projects
- Champion the evolution of the TCE architecture
The role of the test architect is to set and keep the test architectural vision for the TCE. This is probably the most critical role within any TCE and should be undertaken only by senior testing personnel who have experience as a test manager, test methodologist, test designer or test automation engineer across several industries. The test architect is responsible for selecting and integrating the appropriate set of tools, processes and procedures to ensure overall testing efficiency. The test architect works closely with the TCE manager, test methodologist, senior test automation engineers, software vendors and quality assurance when formulating and implementing the testing framework.
Test methodologist responsibilities
- Provide education to resources on the testing methodology
- Work with QA for continuous improvement of the testing methodology
- Provide guidance to project resources on applicable use of the methodology and project organization for testing
- Evaluate and recommend approval of test strategies
- Ensure proper methodology coverage of the technical testing processes
The role of the test methodologist is to provide the process, procedures and templates that support effective test design/testing. These include processes and procedures to support:
Test case design
A test case design is not the same thing as a test case. The design captures what the test designer/tester is attempting to accomplish with one or more test cases. This can be as informal as a set of notes or a formal deliverable that describes the content of the test cases before the actual tests are implemented.
Test case construction
A test case is a sequence of steps designed to test one or more aspects of the application. At a minimum, each test case step should include a description of the action, supporting data and expected results. The test case deliverable can be captured using a "test case template" or by using one of the several commercial, freeware or shareware tools available.
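The structure described above (action, supporting data, expected result per step) can be captured in a simple data structure. This is a minimal sketch, not a reference to any particular commercial or freeware tool; the test case content is hypothetical.

```python
# Minimal sketch of a test case as a structured deliverable: each step
# carries an action, supporting data and an expected result.
from dataclasses import dataclass, field

@dataclass
class TestStep:
    action: str     # what the tester does
    data: str       # supporting data for the step
    expected: str   # expected result

@dataclass
class TestCase:
    name: str
    steps: list = field(default_factory=list)

# Hypothetical example content.
login_tc = TestCase(
    name="Login with valid credentials",
    steps=[
        TestStep("Open the login page", "", "Login form is displayed"),
        TestStep("Submit username and password", "user1 / secret", "Home page is displayed"),
    ],
)
print(len(login_tc.steps))  # prints 2
```

A "test case template" in a spreadsheet or test management tool captures exactly these same fields; the structure is what matters, not the medium.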
Test case execution
Test case execution is the actual running or execution of a test case. This can be done manually or by automated scripts that perform the actions of the test case.
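An automated execution of a test case can be sketched as follows: each step pairs an action with its expected result, and the script performs the action and compares. The functions here are hypothetical stand-ins for real UI or API automation.

```python
# Minimal sketch of automated test case execution. The two functions
# are hypothetical stand-ins for code that drives the application.

def open_login_page():
    return "login form displayed"

def submit_credentials():
    return "home page displayed"

# Each step: (action to perform, expected result).
test_case = [
    (open_login_page, "login form displayed"),
    (submit_credentials, "home page displayed"),
]

# Execute each step and compare actual against expected.
outcomes = [action() == expected for action, expected in test_case]
print(all(outcomes))  # prints True
```

A manual execution follows the same loop, with a tester performing the action and judging the result by eye instead of by string comparison.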
Capturing test results
Capturing test results is a simple itemization of the success or failure of any given step in a test case. Failure of a test case step does not necessarily mean that a defect has been found; it simply means the application did not behave as expected within the context of the test case. There are several common reasons for a test case step to fail: invalid test design/expectations, invalid test data or invalid application state. The tester should ensure that the failure was caused by the application not performing to specification and that the failure can be replicated before raising a defect.
The tester documents any defects found during the execution of the test case, capturing the tester's name, defect name, defect description, severity, impacted functional area and any other information that would help in the remediation of the defect. The defect is the primary deliverable of any tester; it is the vehicle for communicating the state of the application to the project team.
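The capture-then-confirm flow described above can be sketched in a few lines: itemize pass/fail per step, and raise a defect record only when a failure reproduces. The field names follow the article; the tester name and functional area are hypothetical placeholders.

```python
# Sketch of result capture: itemize pass/fail per step, and raise a
# defect record only for failures that have been replicated.
# Tester name and functional area are hypothetical placeholders.

def record_results(step_results):
    """step_results: list of (step_name, passed, reproduces) tuples."""
    log, defects = [], []
    for name, passed, reproduces in step_results:
        log.append((name, "PASS" if passed else "FAIL"))
        if not passed and reproduces:
            defects.append({
                "tester": "jdoe",
                "defect_name": f"Failure at step: {name}",
                "severity": "medium",
                "functional_area": "login",
            })
    return log, defects

log, defects = record_results([
    ("open login page", True, False),
    ("submit credentials", False, True),  # fails and reproduces -> defect
])
print(len(defects))  # prints 1
```

Note that a step that fails but does not reproduce is still logged as FAIL; it simply does not become a defect until the tester rules out invalid test data or application state.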
Test coverage analysis
The tester must determine if the testing mandate and defined testing scope have been satisfied. Then he must document the current state of the application. How coverage analysis is performed is dependent on the sources available to the tester. If the tester is able to map test cases to well-formulated requirements, then coverage analysis is a straightforward exercise. If that is not the case, the tester must map test cases to functional areas of the application and determine if the coverage is "sufficient." This is obviously more of a "gut-check" than a true analysis.
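When test cases can be mapped to well-formulated requirements, the straightforward exercise mentioned above looks like this. The requirement and test case IDs are illustrative.

```python
# Sketch of requirements-based coverage analysis: map each test case to
# the requirement(s) it exercises, then report any requirement with no
# covering test case. All IDs are illustrative.

requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
test_case_map = {
    "TC-01": {"REQ-1"},
    "TC-02": {"REQ-1", "REQ-2"},
    "TC-03": {"REQ-3"},
}

covered = set().union(*test_case_map.values())
uncovered = requirements - covered
coverage = len(covered & requirements) / len(requirements)
print(f"Coverage: {coverage:.0%}, uncovered: {sorted(uncovered)}")
# prints "Coverage: 75%, uncovered: ['REQ-4']"
```

When requirements are absent, the same mapping is done against functional areas of the application, and "sufficient" becomes the judgment call the article describes.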
Test automation engineer responsibilities
- Be skilled in the practical use of testing tools
- Train end users on usage of testing tools
- Administer the testing tools
- Provide information on upgraded and new testing tools
- Perform technical testing (performance, stress, capacity, etc.)
The role of the test automation engineer (test engineer) is to design, build, test and deploy effective test automation solutions. To fulfill this role, the test engineer applies appropriate automation technologies to meet the short- and long-term goals of the testing organization. The objective is to automate as much of the testing effort as possible with a minimum set of code/scripts. The focus should be on test effort, not testing coverage. If one manual test case or manual test preparation process consumes a large percentage of test resources, then this manual process should be the first to be automated.
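The effort-first rule above can be sketched as a simple ranking: measure the total tester hours each manual process consumes and automate the most expensive ones first. The process names and figures below are illustrative assumptions.

```python
# Sketch of effort-first automation prioritization: rank manual
# processes by total tester hours consumed per release, highest first.
# Process names and figures are illustrative assumptions.

manual_processes = [
    {"name": "regression suite run",   "hours_per_cycle": 40, "cycles_per_release": 6},
    {"name": "test data preparation",  "hours_per_cycle": 16, "cycles_per_release": 6},
    {"name": "smoke test",             "hours_per_cycle": 2,  "cycles_per_release": 30},
]

def total_effort(p):
    return p["hours_per_cycle"] * p["cycles_per_release"]

priorities = sorted(manual_processes, key=total_effort, reverse=True)
print([p["name"] for p in priorities])
# prints ['regression suite run', 'test data preparation', 'smoke test']
```

Note that the frequently run smoke test ranks last here despite its 30 cycles: what matters is total effort, not how often a process runs or how much coverage it provides.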
Subject matter experts -- industry trend specialist responsibilities
- In-depth knowledge of a particular industry trend
- Adapt methodology to include trend-specific processes (e.g., data aging)
- Advise on test depth and applicability of test stages for trend
Subject matter experts (SMEs) or business analysts (BAs) may or may not exist within the context of the TCE. If the TCE is an internal organization, then the SMEs and/or BAs will probably exist as a distinct entity within the overall organization. In that case, the TCE must partner closely with the SMEs and/or BAs to ensure the business being tested is clearly understood. If the TCE is an external organization (outsourced), then the TCE should contain SMEs and/or BAs that act as liaisons to the client and as TCE experts to ensure the business being tested is clearly understood.
TCE coordinator responsibilities
- Schedule shared project resources (human and environmental)
- Manage reusable business components of test (plans, cases, data, schedules, scripts)
- Publish testing schedules
The role of the TCE coordinator is to schedule the consumption of TCE resources, including software, hardware and human resources. The TCE coordinator works closely with the TCE managers, architect and environment specialist to ensure the schedule can deal with any planned (or unplanned) overlaps of resources. This is a classic resource matrix, but in the case of a TCE it must be communicated to all TCE partners and clearly understood. This is especially true when multiple clients/partners share a common resource pool.
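The resource-overlap problem described above reduces to detecting conflicting bookings in the matrix: two bookings of the same resource conflict when their date ranges overlap. The resource and project names below are illustrative.

```python
# Sketch of conflict detection in a shared-resource schedule: bookings
# of the same resource conflict when their day ranges overlap.
# Resource and project names are illustrative.

bookings = [
    ("perf-lab",  "projA", (1, 5)),   # (start_day, end_day), inclusive
    ("perf-lab",  "projB", (4, 8)),
    ("load-tool", "projA", (2, 3)),
]

def overlaps(a, b):
    """True when inclusive ranges a and b share at least one day."""
    return a[0] <= b[1] and b[0] <= a[1]

conflicts = [
    (x[1], y[1], x[0])
    for i, x in enumerate(bookings)
    for y in bookings[i + 1:]
    if x[0] == y[0] and overlaps(x[2], y[2])
]
print(conflicts)  # prints [('projA', 'projB', 'perf-lab')]
```

In practice the coordinator resolves each detected conflict by negotiation with the affected partners, which is why the schedule must be published and clearly understood rather than silently maintained.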
Environment specialist (lab support) responsibilities
- Administer hardware, software and networks (applications and systems)
- Perform technical capacity planning
- Support technical testing (performance, stress, capacity, etc.)
- Provide primary interface to technical support groups
- Provide technical consulting to project teams
- Manage reusable technical components of testing
- Monitor technical environment during testing
The role of the environment specialist (lab "rat") is to ensure the hardware -- and to a certain extent the software -- of the TCE will support all planned testing activities. The environment specialist is analogous to a chief mechanic -- he keeps it all running.
This preliminary introduction into the roles and responsibilities of the TCE team will be followed by articles on each of the roles, starting with the role of a test automation engineer.
About the author: David W. Johnson is a senior computer systems analyst with over 20 years of experience in IT across several industries. He has played key roles in business needs analysis, software design, software development, testing, training, implementation, organizational assessments and support of business solutions. David has developed specific expertise over the past 10 years on implementing "Testware," including test strategies, test planning, test automation and test management solutions. You may contact David at DavidWJohnson@Eastlink.ca.