A lifecycle describes a sequence of phases or major events and activities that leads to success in some endeavor. Lifecycles typically go from cradle to grave, whether you're a person or a software project.
The software development lifecycle (SDLC) provides a framework for IT professionals to follow when they work with an application or software. The SDLC iterative model, which explains how software goes from concept to deployment, closely aligns with the project management lifecycle.
The diagram in Figure 1 shows both together.
Lifecycles are defined to help projects succeed, not to create a mass of bureaucratic busywork. Use the SDLC iterative model to guide development, test, deployment and support work in an informed and flexible manner to increase the chances of success.
Requirements, design and development
An IT professional spends the bulk of a project identifying requirements, designing the system or product and then testing that it meets design requirements.
Correspondingly, project managers spend most of their time directing and controlling the execution of these three phases, which they ordinarily iterate several times within one or more increments. The team uses each iteration to identify and elaborate upon additional requirements, add design elements to satisfy them and then develop the additional design elements.
Use requirements analysis to discover the detailed REAL business requirements that provide value when delivered. Start with a product, system or software definition, which is often referred to as a functional requirement even though it's actually a form of high-level design. This high-level design often focuses on features, each of which provides functionality that is then typically described in more detailed designs. Low-level designs lead to more detailed engineering and technical designs, which developers implement in hardware and software.
Roll testing into iterative design
The SDLC iterative model includes reviews of each of the development deliverables: requirements, designs and code. Reviews can catch and fix errors before they affect subsequent artifacts. QA should combine unit, integration and system testing with the development of code to further help catch errors early in the software development lifecycle when they are easier and cheaper to fix.
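The point about pairing unit tests with development can be made concrete with a minimal sketch. The discount rule and its threshold here are hypothetical, invented purely for illustration; the idea is that the tests are written in the same iteration as the code they exercise, not deferred to a separate end-of-increment phase.

```python
def apply_discount(total: float) -> float:
    """Apply a 10% discount to orders over $100 (hypothetical business rule)."""
    if total > 100:
        return round(total * 0.9, 2)
    return total

# Unit tests written alongside the code, in the same iteration:
assert apply_discount(200.0) == 180.0   # discount applied above the threshold
assert apply_discount(100.0) == 100.0   # boundary case: no discount at exactly $100
assert apply_discount(50.0) == 50.0     # no discount below the threshold
```

Catching a boundary error here, during development, costs minutes; catching it in system testing at the end of an increment costs far more.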
Many organizations treat testing as a separate phase performed at the end of each major increment, but this approach is ineffective because it detects errors late in the SDLC iterative model. Testing is more effective when the QA team plans and designs its strategy as much as possible prior to incremental development.
All development methodologies define requirements, design solutions and develop the design -- in that order. Few methodologies explicitly identify test planning and design as important lifecycle activities, especially not as part of system design prior to development stages. Methodologies mainly differ in how they follow the three phases and in the number of times they repeat this sequence. Typically, the more times phases repeat, the smaller the size of functionality in each repetition.
At one extreme is the Waterfall methodology, which many people erroneously believe must be slavishly inflexible: Define all the requirements, then design all the system before developing all the design and never have feedback or go back to a prior phase of the SDLC iterative model. Waterfall projects often require that the team define, design and develop large pieces of functionality at a time. Agile development, at the other extreme, focuses on small pieces of functionality, with revisits and reworking of much of what already has been done.
Implementation, operations and maintenance
An increment in an SDLC represents a major piece of added functionality, which is implemented into production as it becomes workable, often in the form of a build or release. Frequently, each increment must pass user acceptance testing before deployment into production.
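The acceptance-testing gate described above can be sketched as a simple check: an increment ships only when every acceptance criterion passes. The criteria below are hypothetical examples, not a prescribed checklist.

```python
# Hypothetical acceptance-test results for one increment (assumed for illustration):
acceptance_results = {
    "core workflow passes UAT": True,
    "performance within agreed service levels": True,
    "no open severity-1 defects": False,  # hypothetical blocker
}

def ready_to_deploy(results: dict) -> bool:
    """An increment is deployable only when every acceptance criterion passes."""
    return all(results.values())

print(ready_to_deploy(acceptance_results))  # prints False: one criterion failed
```

In practice such a gate is usually enforced by a release process or CI pipeline rather than a script, but the logic is the same: deployment is conditional on acceptance, not on the calendar.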
Once implemented, the organization must operate and maintain each increment until it is eventually retired or replaced. Most organizations fail to include the operations and maintenance phase in the SDLC iterative model. Instead, they characterize implementation as the end of development stages, which places a premium on hitting a deadline, without taking into account the effect on costs to support the finished product.
Closure occurs when the development stages finish, typically after the team has implemented all the key required functionality. A project can comprise multiple increments, although many projects have only one. Conversely, a big project can be divided into several subprojects, each of which is treated as a separate project, follows the SDLC iterative model and can have its own iterations and increments.
After the end of a project, many teams conduct a post-implementation, or a post-mortem, review. The purpose of this type of review is to learn lessons from the project experience to improve the development and project processes on subsequent projects. However, most organizations find that their lessons learned remain the same project after project. In other words, they haven't really learned any lessons, because they don't apply them. Effective project management learns and applies lessons throughout the project, as well as from project to project.
Feasibility analysis is an important methodology differentiator that teams often omit from the SDLC iterative model. This type of evaluation compares major alternative approaches to determine whether each approach could achieve desired results -- and, ultimately, which approach is most economically effective. Most projects calculate ROI only once -- if at all -- at the very beginning during feasibility analysis, instead of at various points throughout the project.
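The ROI calculation itself is simple; the discipline the article calls for is recomputing it as projected benefits and costs change. A minimal sketch, with checkpoint figures that are entirely hypothetical:

```python
def roi(benefit: float, cost: float) -> float:
    """Return ROI as a fraction: (benefit - cost) / cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (benefit - cost) / cost

# Hypothetical (projected benefit, projected cost) at successive checkpoints:
checkpoints = {
    "feasibility analysis": (500_000, 200_000),
    "mid-project review":   (450_000, 260_000),
    "pre-deployment":       (430_000, 300_000),
}

for phase, (benefit, cost) in checkpoints.items():
    print(f"{phase}: ROI = {roi(benefit, cost):.0%}")
```

Re-running the calculation at each checkpoint exposes the common pattern of eroding returns -- benefits slip while costs grow -- early enough to change course or cancel, rather than discovering it after deployment.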
Whether or not an organization conducts feasibility analysis explicitly, every project needs this phase, because that's where initiation, planning and organization occur. Feasibility analysis defines the project and sets its budget and schedule. Ineffective feasibility analysis can lead to impossible budgets and schedules, which can destine a project to fail.
One project-dooming cause of inadequate feasibility analysis stems from the failure to discover the top-level REAL business requirements that provide value when met. Evaluate the feasibility of each alternative approach with respect to its ability, and cost, to meet those requirements. During later requirements analysis, the top-level business requirements become more detailed and form the basis for high- and low-level system designs.
However, most projects fail to adequately identify the business requirements. Instead, they start with the high-level design of the product, system or software they expect to create and thus have no meaningful basis for assuring it will provide value -- let alone reliably measuring its financial ROI. These projects invariably suffer extensive creep in order to fix inevitable functional shortcomings, which, in turn, lead to budget and schedule overruns.
Measure twice, iterate once
Wielded effectively, feasibility analysis helps product planners identify and sequence requirements, iterative design and development work.
In contrast, most iterative development teams simply plunge into coding pieces of the product before they determine what pieces they need, how those pieces will fit together, and what the most appropriate sequence will be to create and integrate them. When an organization doesn't take advantage of the feasibility analysis phase, it often needs time-consuming and expensive iterations to correct architectural shortcomings.