Moving to a service-oriented architecture (SOA) not only impacts how an application is developed, but also how it is tested. A composite or orchestrated application has a lot of moving parts and interconnections, making complexity a key challenge. To address the challenges of SOA testing, industry experts recommend more collaboration across project teams, earlier involvement by testers, the use of automated tools, and good contract management for third-party components.
The biggest difference from traditional application testing is complexity, said Madeline Bayliss, vice president of strategic partnerships and marketing at Green Hat Software, a provider of automated SOA testing software in Claymont, Del.
"Composite applications are not just about functionality, but orchestration and architecture and business processes; there are messaging, [Enterprise Service Bus], [Business Process Management] platforms, and governance systems to address. The other difference is there is a much greater need for simulation," she said, particularly for complex business processes that use external or third-party services.
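The simulation Bayliss describes can be sketched in code. In the hypothetical example below (the credit-check service, its responses, and the order rule are all invented for illustration), an external third-party service is replaced by a stub that returns canned responses, so the business process that depends on it can be tested without the real dependency being available:

```python
# Minimal sketch of service simulation: a stub stands in for an
# external credit-check service. All names and data are hypothetical.

CANNED_RESPONSES = {
    "good-customer": {"approved": True, "limit": 5000},
    "bad-customer": {"approved": False, "limit": 0},
}

def stub_credit_check(customer_id):
    """Stand-in for the external credit-check service."""
    # Unknown customers get a conservative default, mimicking a
    # real service's fallback behavior.
    return CANNED_RESPONSES.get(customer_id, {"approved": False, "limit": 0})

def place_order(customer_id, amount, credit_check=stub_credit_check):
    """Business process under test: order placement gated by a credit check."""
    result = credit_check(customer_id)
    if result["approved"] and amount <= result["limit"]:
        return "accepted"
    return "rejected"
```

Because the credit check is injected as a parameter, the same process code can later run against the real service; the test simply swaps in the stub.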
The need to deal with machine interfaces is also a key difference, according to John Michelsen, chief scientist and co-founder of SOA testing software provider iTKO in Dallas.
"You have to understand testing at the component level, which means dealing with machine interfaces vs. human ones," he said. "Most traditional testing is geared to test human interfaces."
This causes QA to become more a part of the development cycle, Michelsen said. "Developers did testing in the middle tier in the past." With an SOA-type application, whether it's a Web service or an ESB, "it's a bunch of plumbing," he said. "They have to wrestle with what it means to test that stuff."
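Michelsen's distinction can be illustrated with a sketch: instead of driving a user interface, a machine-interface test constructs a request message and asserts on the response message directly. The getQuote operation, message shapes, and test data below are hypothetical; a real test would target an HTTP or ESB endpoint rather than an in-process function, but the message-in, message-out contract is the same:

```python
import xml.etree.ElementTree as ET

# Hypothetical service handler standing in for a deployed endpoint.
def get_quote_service(request_xml: str) -> str:
    symbol = ET.fromstring(request_xml).findtext("symbol")
    prices = {"ACME": "12.50"}  # invented test data
    price = prices.get(symbol, "0.00")
    return (f"<getQuoteResponse><symbol>{symbol}</symbol>"
            f"<price>{price}</price></getQuoteResponse>")

def test_get_quote():
    # Build the request message by hand -- no UI involved.
    request = "<getQuoteRequest><symbol>ACME</symbol></getQuoteRequest>"
    # Parse and assert on the response message, not on rendered output.
    response = ET.fromstring(get_quote_service(request))
    assert response.findtext("symbol") == "ACME"
    assert response.findtext("price") == "12.50"
```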
Multiple test teams required

SOA testing also involves multiple test teams, according to Rizwan Mallal, managing director of Crosscheck Networks, an SOA testing tools provider in Newton, Mass.
"Since SOA applications are reusable and highly distributed, multiple teams are involved in the testing process. For example, testing teams from the producing side of SOA applications have to test the application before deployment, while the testing teams from the consuming side also have to test the SOA application before integration," he said.
Traditional black box and white box testing still apply to SOA, but because "enterprise SOA typically exposes a schema to the testers, it gives them better visibility into the structure of an SOA application. This enables them to enter the domain of gray box testing," Mallal added.
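Mallal's gray-box point can be sketched as follows. Because the service publishes a schema, a tester can check responses against the promised structure rather than only the visible output. Python's standard library cannot validate XSD, so this hypothetical sketch hand-codes the required elements and types; real SOA test tools would validate against the WSDL/XSD itself:

```python
import xml.etree.ElementTree as ET

# Structure the (hypothetical) schema promises for an order response:
# required child elements, each with a simple type check.
REQUIRED_FIELDS = {"orderId": str, "status": str, "total": float}

def conforms_to_schema(xml_text: str) -> bool:
    """Gray-box check: does the response match the published structure?"""
    root = ET.fromstring(xml_text)
    for field, ftype in REQUIRED_FIELDS.items():
        text = root.findtext(field)
        if text is None:
            return False  # required element missing
        try:
            ftype(text)   # element present but value of the wrong type
        except ValueError:
            return False
    return True
```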
SOA testing isn't new
Scott Barber, chief technologist of testing services company PerfTestPlus in Palm Bay, Fla., and executive director of the Association for Software Testing, said SOA testing really isn't new.
"People have been doing the same thing in-house ever since the word 'middleware' came out, the difference being we were inventing our own protocols and architecture from the developer side," he said. "I think the buzz around SOA in general has brought SOA into organizations and test groups that have never done this type of testing before."
Michelsen said, however, that there are some unique challenges to SOA testing. First, QA's role changes from that of a surrogate for the end user to that of a surrogate for the consumer of a service. Second, QA now has to deal with all the technologies and protocols involved, rather than just the user interface. And third, moving toward an agile development methodology is typically a goal of SOA, which means the testing timeframe is compressed.
"In the past you could manually test and get away with it," he said, "but now you can't possibly keep up being purely manual."
Automated tools vendors are capitalizing on this need, Barber said. "Vendors have built tools that put a nice front end on top and make it easy to build data and pump it through a tool. Doing that by hand for most testers is a skill they don't have."
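The core of what such tools automate -- building test data and pumping it through a service interface -- can be sketched as a data-driven loop. The discount service and its rule below are invented for illustration:

```python
# Data-driven sketch: a table of inputs and expected outputs is run
# against a service call. The service and its rule are hypothetical.

def discount_service(order_total: float) -> float:
    """Stand-in for a deployed service: 10% off orders of 100 or more."""
    return round(order_total * 0.9, 2) if order_total >= 100 else order_total

# Each row: (input, expected output) -- the "data" a tool would build.
TEST_DATA = [
    (50.0, 50.0),
    (100.0, 90.0),
    (250.0, 225.0),
]

def run_data_driven_tests():
    """Return the rows that failed; an empty list means all passed."""
    return [(x, expected, discount_service(x))
            for x, expected in TEST_DATA
            if discount_service(x) != expected]
```

Growing coverage then means adding rows, not writing new test code, which is the leverage Barber attributes to the tools.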
The role of QA also changes to some extent in an SOA environment.
"Corporations struggle to define the demarcation of SOA testing responsibility between developers and QA teams," Mallal said. "In other words, what is that clean handoff where developers can safely hand over test cases to QA for further testing?"
Instead, he said, "developers have to treat their SOA QA teams as part of the design process and not bring them in as an afterthought. Each SOA design team or committee should have a representative from the QA team present. Not a figurehead but a person who has clout in the decision making process of the architecture."
Also, said Bayliss, organizations that adopt the test-driven development (TDD) concept get the most value from SOA testing. However, she said, "We'd change the definition of TDD to say it starts at the business requirements, not the testing requirements. That said, people will on-ramp to SOA testing at different junctures," and it's OK to take smaller steps.
Finally, Barber said, SOA testers face a mind shift -- and organizational shift -- regarding the use of third-party services.
"It's the notion that some part of what we are used to testing and validating is being transferred to people signing the contracts," he said. "The test team should be advising them about the kinds of things they're concerned about so they can write a complete contract. People managing those relationships should be going to senior members of test teams and listening to what they have to say."
Story continues with "The consequences of overlooking SOA testing blind spots."