
Avoiding software failures: Lessons from HealthCare.gov

What could have been done differently? Jenn Lent discusses three software development lessons the U.S. government learned the hard way.

With the spotlight on the recent HealthCare.gov software failures, the government is learning some hard lessons that software pros have known for a long time:

  • Performance is not an afterthought.
  • Integration testing is central to any software development effort, and until it's complete, the application isn't ready to go live.
  • Finger-pointing doesn't fix broken software -- and it's a sign of a dysfunctional organization.

In this edition of Quality Time, I elaborate on these three lessons.

Software failure lesson No. 1: Performance is not an afterthought

Six weeks after the launch of HealthCare.gov, the troubled website still suffers from sluggish performance and unacceptable response times. An estimated 50,000 Americans have successfully signed up for a health insurance policy, but that number falls far short of the government's earlier estimates, which predicted 500,000 people would enroll in the first month alone.

The government, it seems, is learning a lesson the hard way: Failure to conduct performance testing -- and provide sufficient server resources -- can result in software failures and public embarrassment that damages the reputation of the organization behind the application. Retail businesses learned that same lesson in the early days of e-commerce, when their websites failed to accommodate holiday shoppers, who had flocked online in unanticipated numbers.
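The kind of performance testing this lesson calls for can be sketched in a few lines. The example below is a minimal, hypothetical load check, not a real tool: `handle_request` stands in for an actual HTTP call to the site under test, and the simulated latencies are invented for illustration. The point is the shape of the exercise: drive many concurrent users before launch and gate on latency percentiles, rather than discovering response times after go-live.

```python
import concurrent.futures
import random
import time

def handle_request() -> float:
    """Hypothetical stand-in for one real HTTP request to the site
    under test; returns the observed response time in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # simulated server work
    return time.perf_counter() - start

def load_test(concurrent_users: int, requests_per_user: int) -> dict:
    """Fire requests from many simulated users at once and report the
    latency percentiles a performance budget would gate on."""
    timings = []
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        futures = [pool.submit(handle_request)
                   for _ in range(concurrent_users * requests_per_user)]
        for f in concurrent.futures.as_completed(futures):
            timings.append(f.result())
    timings.sort()
    return {
        "requests": len(timings),
        "p50_seconds": timings[len(timings) // 2],
        "p95_seconds": timings[int(len(timings) * 0.95)],
    }

result = load_test(concurrent_users=20, requests_per_user=5)
print(result["requests"])  # 100
```

A real test would replace the stand-in with actual requests and run at traffic levels matching enrollment forecasts; the 500,000-in-a-month estimate cited above is exactly the number such a test should have been sized against.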

Public embarrassments are always tough to weather. But things are tougher now than they were when the early e-commerce disasters unfolded. At that time, social media as we now know it did not exist. Today, when customers suffer a bad experience with a high-profile website, they can instantly spread the word via Twitter or Facebook.

Software failure lesson No. 2: Integration testing matters


Integration testing -- where individual units of a larger system are brought together to see if they work as a group -- is a given for complex applications. And virtually all systems under development today are complex systems. They pull information from other applications and databases created by disparate parties at different times. They are hosted at a wide range of locations, often in the cloud. And most of them depend heavily on third-party services. 

Those could be offerings like Google Analytics or financial services that carry out such tasks as checking credit scores. The collective operation of these services, applications and databases can have a huge impact on performance. If the application under development has not been subjected to continuous integration testing, it simply isn't ready to go live.
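A toy example shows what an integration test at one of these seams looks like. Everything here is hypothetical: `EnrollmentService` and the credit-check dependency are invented names, and the external service is stubbed with a mock. The point is that the test exercises two pieces working as a group, including the call across the third-party boundary, rather than either piece in isolation.

```python
import unittest
from unittest.mock import Mock

class EnrollmentService:
    """Hypothetical component that depends on a separate credit-check
    service, standing in for the third-party dependencies described above."""
    def __init__(self, credit_checker):
        self.credit_checker = credit_checker

    def enroll(self, applicant: str) -> str:
        # The seam under test: a call out to another party's service.
        score = self.credit_checker.check(applicant)
        return "enrolled" if score >= 600 else "manual-review"

class EnrollmentIntegrationTest(unittest.TestCase):
    def test_components_work_as_a_group(self):
        # Stub the external credit service so the test is repeatable,
        # then verify the pieces cooperate across the boundary.
        checker = Mock()
        checker.check.return_value = 640
        service = EnrollmentService(checker)
        self.assertEqual(service.enroll("alice"), "enrolled")
        checker.check.assert_called_once_with("alice")

unittest.main(argv=["integration-sketch"], exit=False)
```

In a continuous integration pipeline, tests like this run on every build, so a broken seam surfaces weeks before launch instead of on day one.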

This notion was apparently lost on CGI Federal, one of the leading software development providers for HealthCare.gov. The company's spokespeople have said publicly that CGI was not responsible for the end-to-end testing necessary to ensure the software worked with all elements of the site. Instead, its spokespeople pointed the finger at other developers that had helped build HealthCare.gov.

Software failure lesson No. 3: Finger-pointing doesn't fix broken software

Solving performance problems -- figuring out what went wrong and how to fix it -- requires strong working relationships with all players involved in the software development process. Like performance and integration testing, it seems as if this idea was news to the providers that worked on HealthCare.gov.

Looking ahead, fixing the finger-pointing problem is a big challenge, because it's about a lot more than simply not laying the blame elsewhere.

The whole-team approach in Agile development provides a useful model for understanding what's really at stake. In Agile projects, all team members are accountable for the quality of the software under development. If testing finds a bug, it's not the other guy's fault; responsibility is shared jointly. That concept forces members to act in one another's best interest -- and the interests of the larger team -- right from the start of the project. That's very different from waiting until software fails and saying, "Hey, it's not my fault."

The prevalence of finger-pointing -- in government or in business -- is an indication of a far bigger concern: an organizational culture that allows employees at every level to say, "It's not my problem."

The spectacle of major software development players pointing fingers at each other in a project of this magnitude undermines the credibility of the software industry as a whole. That's the lesson for all software professionals: If the application doesn't work, take ownership. Say it doesn't work -- then work hard to fix it. Anything less is unprofessional.

How do you deal with software failures? Do you think HealthCare.gov could have been handled differently? Let us know.


UPDATE - An earlier version of this story stated that 50 million people had already signed up for healthcare. That number has been corrected to the actual estimate of 50,000.
