There's no wall at SmartSignal -- that is, there's no wall between the developers and quality assurance (QA) engineers.
Because if there were, the company wouldn't be able to accomplish what it does so effectively.
"We are part of the development organization -- since day one," said George Cerny, quality assurance manager at SmartSignal Corp., which makes software that provides real-time analytics on instrumented industrial equipment.
"Usually in companies there's a wall where you throw the code over. We [QA] get into the code and try to diagnose when failure happens, why and what component is causing it, so developers can get at the root of what's going on," Cerny said. "We work closely with the application and technology team. Only through communication can we develop this stuff. If you change one tiny setting it can change results. We wouldn't survive without that close communication."
Developers at the Lisle, Ill.-based SmartSignal also follow agile software development methodologies, which means iterative development and daily builds and validation. To keep pace, as well as to ensure the quality of the company's complex software, Cerny's QA group relies on the SilkTest automated functional and regression testing tool from Borland Software Corp.
SmartSignal serves the power generation, oil and gas, pulp and paper, and aviation industries with its EPI*Center software. EPI*Center is used to monitor different types of assets, such as a fuel pump, and to predict when those assets are going to fail, before they fail, according to Cerny. The privately held software company was founded by the University of Chicago to commercialize its Similarity-Based Modeling (SBM) technology.
SmartSignal's application engineering group analyzes data from pieces of equipment, such as an oil pressure sensor or vibration sensor, determines the optimal operating state and how the sensors relate to each other, and develops models of the equipment operating optimally. Then live feeds are pulled from the sensors as they operate and the data is run against the models. If there are deviations, the software can provide alerts or rules can trigger an incident that the customer can drill into to start diagnosing the problem, Cerny said.
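The flow Cerny describes -- capture healthy operating states, then compare live sensor feeds against a model of them and alert on deviations -- can be sketched in a few lines. This is a toy illustration only: the distance-weighted estimate below is a generic stand-in, not SmartSignal's proprietary SBM algorithm, and all sensor values and thresholds are invented.

```python
import math

def estimate(reading, memory):
    """Estimate the expected sensor vector for a live reading by a
    similarity-weighted blend of remembered healthy states.
    (Illustrative stand-in for SmartSignal's actual SBM.)"""
    # Similarity = inverse Euclidean distance to each healthy state.
    dists = [math.dist(reading, state) for state in memory]
    weights = [1.0 / (d + 1e-9) for d in dists]
    total = sum(weights)
    weights = [w / total for w in weights]
    return [sum(w * state[i] for w, state in zip(weights, memory))
            for i in range(len(reading))]

def check_deviation(reading, memory, thresholds):
    """Flag each sensor whose residual against the model's estimate
    exceeds its per-sensor threshold."""
    expected = estimate(reading, memory)
    return [abs(r - e) > t for r, e, t in zip(reading, expected, thresholds)]

# Healthy states captured during modeling: (oil pressure, vibration)
memory = [(50.0, 0.10), (52.0, 0.12), (48.0, 0.09)]

# A live reading with normal pressure but abnormal vibration
alerts = check_deviation((50.0, 0.45), memory, thresholds=(5.0, 0.05))
print(alerts)  # [False, True] -- only the vibration channel deviates
```

In a real deployment, a flagged channel would feed the rules engine that raises the incident the customer drills into.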
How automated testing helps
Cerny was an avid user of automated testing before coming to SmartSignal six years ago. When he joined the company, he was the only tester on a team of 15 developers.
"To keep up with the developers, we had to implement automated testing," he said. "A lot of the stuff we validate is complex math algorithms. It would be impossible to eyeball this stuff; automated testing allows you to refine baselines and what you're looking at to machine-level precision. We are able to detect very slight changes in functionality, which would resonate through the system."
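The baseline validation Cerny describes -- detecting very slight numeric drift to machine-level precision -- might look like the following in outline. The helper name and tolerances are hypothetical, not SmartSignal's.

```python
import math

def validate_against_baseline(actual, baseline, rel_tol=1e-12):
    """Compare each computed value against its stored baseline to
    near machine precision, returning the indices that drifted.
    (Hypothetical helper; tolerance values are illustrative.)"""
    return [i for i, (a, b) in enumerate(zip(actual, baseline))
            if not math.isclose(a, b, rel_tol=rel_tol, abs_tol=1e-15)]

baseline = [0.123456789012345, 2.718281828459045]
actual   = [0.123456789012345, 2.718281828479045]  # tiny drift in 2nd value

drifted = validate_against_baseline(actual, baseline)
print(drifted)  # [1]
```

A drift of a few parts in 10^11, invisible to eyeball inspection, is flagged automatically; this is the kind of "very slight change in functionality" the quote refers to.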
When Cerny came on board, SmartSignal was working on the earlier version of its product. Cerny brought in Borland's SilkTest for automated functional and regression testing, a tool he first began using before Borland acquired it from Segue.
"I've used all the major tools out there; each has its own powers," he said. "SilkTest's power is in the flexibility that allows developers to go in and extend the tool in limitless ways. There are countless examples where other tools couldn't support what we're doing with SilkTest. I can develop my own framework and support for third-party technologies that aren't supported by any of the tools."
Other products, Cerny said, are geared more toward manual testers doing record/playback, or toward getting intermediate developers up and running with test suites. SmartSignal takes a different approach: its QA engineers are developers building object-oriented test automation, he said.
For an agile shop, automated testing is essential to keep up with a constant stream of features, enhancements and defects, Cerny said.
"On a daily basis we do builds and test cycles, so developers are constantly up to date on the state of the build. Without automated tools that would be impossible," he said. "We could do unit tests, but we have several data stores, like SQL, a G2 rules engine [from Gensym], local access data stores, file data stores, so unit tests can't properly test the modules across the application. You may test one module, but rarely do bugs show up like that, so you can't get a true integration test."
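Cerny's point is that a module can pass its unit tests in isolation while the integration across data stores fails. A toy illustration, with in-memory dicts standing in for the SQL, G2 and file data stores (every name and value here is invented):

```python
# Toy end-to-end check: data written through the ingest module must
# survive the rules step and be readable from the incident store.
sensor_store = {}    # stand-in for the SQL data store
incident_store = {}  # stand-in for the incident/file data store

def ingest(sensor_id, value):
    sensor_store[sensor_id] = value

def run_rules():
    # Rule: vibration above 0.3 raises an incident.
    for sensor_id, value in sensor_store.items():
        if value > 0.3:
            incident_store[sensor_id] = f"deviation: {value}"

def integration_test():
    ingest("vibration-01", 0.45)
    run_rules()
    # A unit test of ingest() alone would pass even if run_rules()
    # never saw the data; only the end-to-end check catches that.
    assert incident_store.get("vibration-01") == "deviation: 0.45"
    return "ok"

print(integration_test())  # ok
```

The bug classes Cerny describes live in the seams between modules, which is why the nightly automated runs exercise the whole application rather than one module at a time.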
With agile, the lines between QA and developers are blurred, added Brad Johnson, product marketing director for Lifecycle Quality Management at Borland in Cupertino, Calif.
"The agile shops are challenging everything, and the people most challenged by agile methodology are the testers," Johnson said. "Testers now have to figure out how to do short testing cycles. If you're going agile and you're not adapting your processes, you're heading for trouble. Automation has really shined in the agile environment. You can't have manual testers running these larger regression suites in a highly iterative environment. You need to run ten thousand tests automatically, very often."
SmartSignal also benefited from the efficiency of using automated testing when it started developing its current EPI*Center product.
"The real benefit of automation was we were able to port all the test cases and plans to EPI*Center from ECM," Cerny said. "The test cases were testing the same algorithms; just the user interface was different. We were able to take the existing test plans and business processes behind them and port them over."
SmartSignal has implemented automated testing for more than 400,000 lines of code. There is just one test plan for each of SmartSignal's products. Each test plan includes up to 4,000 test cases and runs overnight, unattended, across multiple Borland SilkTest execution machines to validate daily builds. Defects are imported into Seapine TestTrack Pro for defect tracking and cross-referenced back to the test cases that identified them in each test plan, according to the company.
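Running a 4,000-case plan overnight across several execution machines implies partitioning the plan. A minimal round-robin sketch of that idea (the scheduling here is illustrative only; it is not SilkTest's own agent-distribution mechanism, and the machine names are invented):

```python
def partition_tests(test_cases, machines):
    """Round-robin a test plan across execution machines so the
    overnight run proceeds in parallel."""
    buckets = {m: [] for m in machines}
    for i, case in enumerate(test_cases):
        buckets[machines[i % len(machines)]].append(case)
    return buckets

# A 4,000-case plan spread over four hypothetical execution machines
cases = [f"TC-{n:04d}" for n in range(1, 4001)]
buckets = partition_tests(cases, ["exec-1", "exec-2", "exec-3", "exec-4"])
print({m: len(b) for m, b in buckets.items()})  # 1,000 cases per machine
```

Whatever the actual scheduling, the payoff is the same: a suite far too large for a manual nightly pass finishes before developers arrive in the morning.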
Improved application delivery
A second way SmartSignal uses SilkTest is unusual: to automate the structure creation process and drive the application to the customer end point more efficiently and accurately, Cerny said.
"We had developed an interface to MATLAB, which takes data from clients, analyzes the data, and runs a series of automated algorithms to set up particular instances. The problem was there were thousands of settings; if you missed one, it would produce false readings. We were able to take that information from MATLAB and use SilkTest to drive the application to the end point. By doing so, the quality of how we delivered all these models to customers was 100% -- there was not one error."
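The apply-and-verify pattern behind that 100% figure can be sketched as follows. Here `apply_setting` and `read_setting` are hypothetical stand-ins for the automated actions that drive the application's interface; the setting names and values are invented.

```python
# Sketch of "driving the application to the end point": take the
# settings exported from an analysis step, apply each one through
# the application's interface, and read it back to verify.
app_state = {}  # stand-in for the configured application

def apply_setting(name, value):
    app_state[name] = value

def read_setting(name):
    return app_state.get(name)

def deploy(settings):
    """Apply every exported setting and verify it round-trips.
    With thousands of settings, one missed value means false readings,
    so every setting is checked rather than spot-sampled."""
    errors = []
    for name, value in settings.items():
        apply_setting(name, value)
        if read_setting(name) != value:
            errors.append(name)
    return errors

exported = {"pump.rpm.limit": 3600, "oil.pressure.min": 42.5}
print(deploy(exported))  # [] -> every setting applied and verified
```

Because the machine both applies and verifies each value, the human error of missing one setting among thousands is taken out of the loop.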
In addition to saving hours of work, Cerny said, the approach allows the company's high-level engineers to apply their knowledge to the application rather than to delivery.
Johnson credited SmartSignal's creativity in integrating with other third-party products.
"Test tools are typically too expensive to buy for just data loading, but when you've got it, it's nice to have," he said. "These guys are using SilkTest effectively in a traditional testing manner, but because their systems are configured for each customer, they're using SilkTest scripts to configure the system they provide to the customer. So, they're using it as a tool to help load the system with the right configuration, leveraging a more open licensing model we have. They're a great example of a smaller company that can afford to be nimble. They're small and growing and stretching tools to the max."