Does limited integration among tools in an application lifecycle management suite prevent teams from getting the most out of ALM?
To paraphrase: It's the process, stupid! Like all engineering projects, software or otherwise, application lifecycle management is about process. There are steps we take and tools we use for each part of the journey. We need those tools integrated with one another so that we can ensure the integrity of the information as it flows through the ALM process.
For decades, we have been integrating business systems so that there is seamless linkage from order, to distribution, to invoice, to payment. But we have never fully applied the concept of seamless integration to our development tools. Instead, we live in a world of point solutions that are integrated in such a fragile manner that even the smallest change limits our ability to work. Would finance accept such poor tools from us? Would we let them? Heck, no!
To make up for these fragile integrations, we develop workarounds, bridges, scripts, conversion routines and all the other paraphernalia that holds our infrastructure together. All of this adds up to a lot of rework. Figures vary, but data I have collected and studies I have seen from industry analysts suggest that this rework can represent as much as 50% of total IT effort. Rework involves things like cutting and pasting information from the requirements tool into a specification, redoing the build script for the user acceptance testing environment, and manually mapping requirements to test cases. These are steps that ought to be fully integrated but rarely are.
Even when integrations between tools exist, they are often little more than an automated cut-and-paste. What we need is integration at the process level, so that work we do in one phase of the lifecycle in one tool connects to the next. Events in the ALM process should inform and guide how the integration behaves. For example, if a build fails, a ticket should open in the developer's inbox. How ironic is it that we manually run the automated test suite? Shouldn't it run when the developer checks in the code, and run only the scripts that exercise the code that has changed? And if the tests fail, shouldn't they put tickets in the developer's inbox too?
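To make the idea concrete, here is a minimal sketch of process-level event handling. Every name here is hypothetical -- the event shapes, the ticket queue and the file-to-test mapping are invented for illustration, not taken from any particular ALM product:

```python
# Hypothetical sketch of process-level ALM integration: events from one
# tool (the build server, the test runner) drive actions in another
# (the ticketing system), with no manual cut-and-paste in between.

def handle_event(event, ticket_queue, test_map):
    """Route an ALM event to the next step in the process.

    event: dict with a "type" key and supporting fields (hypothetical shape).
    ticket_queue: list standing in for the developer's ticket inbox.
    test_map: maps source files to the test scripts that exercise them.
    Returns the list of test scripts to run, if any.
    """
    if event["type"] == "build_failed":
        # A failed build opens a ticket in the committer's inbox.
        ticket_queue.append({
            "assignee": event["committer"],
            "summary": f"Build {event['build_id']} failed",
        })
        return []
    if event["type"] == "code_checked_in":
        # A check-in triggers only the tests covering the changed files.
        return sorted({test
                       for path in event["changed_files"]
                       for test in test_map.get(path, [])})
    if event["type"] == "test_failed":
        # A failed test also lands in the developer's inbox.
        ticket_queue.append({
            "assignee": event["committer"],
            "summary": f"Test {event['test_name']} failed",
        })
        return []
    return []
```

The point is not the code itself but the shape of it: each event carries enough context (who committed, what changed) for the next tool in the chain to act without a human re-keying the information.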
We are barely scratching the surface of ALM integration today. I know the mega-ALM vendors will say, "Look, all of our tools are integrated together." But vendors that make that claim typically do not offer best-in-class tools for each stage of the lifecycle. Depending on your risk profile, your time-to-market pressures, your compliance and audit needs, you should select point tools that meet your exact needs rather than opt for generic one-size-fits-no-one solutions.
Demand that your vendors expose their application programming interfaces (APIs) as Web services and take control of the integration through process-centric tools that let you organize your development infrastructure based on process.
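When a vendor does expose its tools as Web services, taking control of the integration can be as simple as constructing an HTTP call from your own process engine. The sketch below builds (but does not send) such a request; the endpoint and payload shape are invented for illustration and will differ for any real vendor's API:

```python
# Hypothetical sketch: driving a vendor ALM tool through a Web-service
# API from your own process-centric tooling. The /tickets endpoint and
# the JSON payload shape are assumptions, not a real vendor's API.
import json
import urllib.request

def build_ticket_request(base_url, assignee, summary):
    """Construct (but do not send) an HTTP request that opens a ticket."""
    payload = json.dumps({"assignee": assignee, "summary": summary}).encode()
    return urllib.request.Request(
        url=f"{base_url}/tickets",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

With the vendor's API in hand, it is your process definition, not the vendor's packaging, that decides when such a call fires.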