Beating the odds: Managing a successful software project

In Managing the Black Hole: The executive's guide to software project risk, author Gary Gack has stated that less than one third of software projects are considered fully successful. In this interview, SearchSoftwareQuality.com editor Yvette Francino asks Gack some probing questions about that claim in order to understand more about how teams can beat the odds.

At least two-thirds of software projects fail, and development teams are often unable to determine why, according to software consultant Gary Gack, author of the book Managing the Black Hole: The executive's guide to software project risk. In this interview, Gack shares his observations about the causes of project failure; whether methodology choices matter; quality assurance (QA); and defects' role in project failure. He also suggests ways software development teams can beat the odds and deliver applications successfully.

SSQ: In your book, Managing the Black Hole, you start by giving us some facts about the software industry. One of those is "Less than one-third of software projects are fully successful – delivered on time, on budget, with all promised functionality." Can you give us some more background on these statistics? How big were the projects? What type of software was being deployed? What were the consequences of missing the goals?


Why software projects fail:
Software industry research firms' surveys show that development projects fail in these ways:
  • Underestimated costs and overestimated benefits contribute to an 80 percent failure rate. (Mercer Consulting)
  • Poor requirements gathering and management are directly responsible for 60-80 percent of failures. (Meta Group)
  • Incorrect requirements – a result of poor communication among business, IT and development – are largely responsible for a two-thirds software project failure rate. (Forrester Research)
  • Identifying and fixing defects often account for 80 percent of project costs, causing project-breaking cost overruns. (National Institute of Standards and Technology)

The stats I cited were from the Standish Group, and unfortunately they do not provide any information about project size or other attributes. Certainly there are some in the industry who are in denial about the rate of failure, but very similar results are reported by the Gartner Group and by Capers Jones. Jones' data clearly show that failure rates are dramatically higher for larger projects, and larger projects, while relatively few in number, account for something like 80 percent of all software spending. I believe these data, while not precisely correct, reasonably reflect the real state of the industry.

SSQ: In Chapter 2 you mention the wide array of standards (ISO, IEEE, SEI, PMI, ITIL) and methodologies (waterfall, agile, iterative, hybrid models). Do you notice trends or commonalities in standards or methodologies in the projects that are considered fully successful?


In his book "Software Engineering Best Practices" (2010) Capers Jones identifies 14 Best Practices (p.24) – the only items in that list that are methodologies are TSP and PSP. The only item in the list that is the subject of a standard is software inspection (IEEE 1028-2008). Everything else in the list might better be called a "practice" that could be incorporated into virtually any methodology. Outcomes are greatly influenced by practices – i.e., what people actually do – rather than by the "methodology" they claim to be using. In practice there is enormous variability in what practices are actually used within any given method. All of the most important practices focus on quality (defect containment) at every stage of the development process. How does quality factor in when measuring a project's success? If a project has the functionality requested and needs to be delivered with little or no testing, would it still be considered a success? Would the team that delayed the project to test be considered unsuccessful for missing time and budget goals.


There's an old saying in the software business – "if it doesn't have to work, we can deliver it really quickly." Unfortunately, not many customers assign business value to things that don't work. It's hard to imagine how anything more than a trivial project could deliver software that works with little or no "testing" – but conceivably a project that did very thorough inspections and used a static analysis tool MIGHT be able to deliver acceptable quality without testing. However, if there's no time to test, it's unlikely any other defect removal methods would be used instead.

SSQ: I've always had a problem with statistics that deal with "defects" because they don't distinguish between small defects and large defects. The charts show that more than 40% of defects originate in Requirements and Design. How does this translate to agile environments where requirements and design are revisited with each short iteration?

Actually, all of the defect statistics I cite do make that distinction – all refer to "major" defects, i.e., ones that are severity 1 (software does not run) or severity 2 (major function disabled). When we think about "defects" in an agile context, it's easy to slip into semantic distinctions that may not be meaningful from a customer's perspective. Suppose, for example, we find in a current iteration (sprint) that the requirements we are working on now are in some way in conflict with work we did earlier. We must make changes to work we previously considered "complete" – this is rework, and it costs time and money. Does the customer care whether or not we call that a defect? Hypothetically, at least, it may have been possible to foresee the conflict – if we could have foreseen the problem, but did not, is that a defect? Does it matter? The importance of "defects" is that they entail rework that costs money and time for both developers and customers. In any case, I see VERY few agile projects that do any defect tracking.

SSQ: What ALM tools, if any, do you recommend for software development? What measurements are most important during a software development cycle? Which types of tools are most important for success?

On the whole, I fear our industry is over-focused on tools. Sometimes tools become the tail that wags the dog. In short, if it is clear that a tool reduces effort and/or improves delivered quality at a cost not otherwise achievable, by all means use it. In my view, two specific measures are essential to real improvement: (1) defect containment – i.e., what percentage of defects are found and fixed before software delivery, by appraisal process; and (2) value-added effort – i.e., what percentage of total project effort (summed across all iterations, if iterative) is NOT appraisal (finding defects) or rework (fixing defects), INCLUDING BOTH PRE- AND POST-RELEASE EFFORT. Most organizations today incur post-release costs that are at least 50% higher than their pre-release costs, and sometimes 100% higher. Most groups have their heads in the sand about this and don't realize how much waste is due to poor delivered quality. I've gone into this in considerable detail in my book – it's a bit too involved to fully explore here.
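To make those two measures concrete, here is a minimal sketch in Python (all names and figures are hypothetical illustrations, not data from Gack's book) showing how a team could compute defect containment and value-added effort from its own project records:

```python
# Hypothetical worked example of the two measures described above:
# (1) defect containment: % of defects found and fixed before release
# (2) value-added effort: % of total effort that is neither appraisal
#     (finding defects) nor rework (fixing defects), pre- and post-release

# Illustrative defect counts
defects_found_pre_release = 180
defects_found_post_release = 20

# Illustrative project effort in person-hours
effort_hours = {
    "value_added": 4000,            # requirements, design, coding, documentation
    "appraisal_pre_release": 1000,  # inspections, static analysis, testing before delivery
    "rework_pre_release": 600,      # fixing defects found before delivery
    "appraisal_post_release": 400,  # triage and regression testing after delivery
    "rework_post_release": 2000,    # fixing defects found after delivery
}

total_defects = defects_found_pre_release + defects_found_post_release
defect_containment = defects_found_pre_release / total_defects

total_effort = sum(effort_hours.values())
value_added_pct = effort_hours["value_added"] / total_effort

print(f"Defect containment: {defect_containment:.0%}")  # 90%
print(f"Value-added effort: {value_added_pct:.0%}")     # 50%
```

In this made-up example, 90% of defects are contained before release, yet only half of total effort is value-added; the rest is appraisal and rework, most of it incurred after release.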

SSQ: What would be your single biggest piece of advice to software managers for managing a successful project?

It's my understanding that the genie always grants three wishes, so I'll not limit myself to a single piece of advice. My top three are: (1) measure defect containment – as I've established in my book, defects are by far the largest cost driver; (2) measure value-added vs. non-value-added effort to get a meaningful understanding of "productivity" in aggregate across complete projects and across projects within organizations, independent of domain and technology factors; and (3) manage with facts and data – as Deming put it, "In God we trust, all others bring data."
