
The state of software quality, part 1: Problems remain, but all is not doomed

Many experts say software quality hasn't improved much over the years, despite the increased recognition and attention paid to quality issues, but there are some bright spots, such as better tooling and the influence of Agile development methods.

Quality, like beauty, is in the eye of the beholder -- the user, in the case of software development.

While quality is indeed subjective, there are metrics the software industry has traditionally used to benchmark quality: Is the project or product on time and within budget? Does it meet user requirements? How much of the functionality is actually getting used? Is the code well-written? Is it free of serious defects and security vulnerabilities? Does it deliver value to the organization? To the user?

To date, the metrics have shown that software has not been measuring up, and problems with poor quality have plagued the industry. Many experts say things haven't improved much over the last several years, despite the increased recognition and attention paid to quality issues.

"Over the last 50 years there has been very little improvement," said Watts S. Humphrey, who founded the Software Process Program of the Software Engineering Institute (SEI) at Carnegie Mellon University, and was the director of programming at IBM.

"I'd just say it's terrible, and not getting better," said Scrum co-creator and evangelist Ken Schwaber. "You can build new stuff fast, but the problem is putting it into the old stuff. We have to attach it to the old stuff because it makes us competitive."

But there are quality problems with the core infrastructure of many software products, he said, which makes modifying functionality difficult. And there may be few people left in the company who worked on the program originally.

Both commercial independent software vendors (ISVs) and large enterprise IT organizations are grappling with quality issues, according to Joe Niski, an analyst at Midvale, Utah-based Burton Group.

"The quality of most packaged commercial software doesn't seem to be going anywhere fast," he said. "And in the enterprise, whether it's custom software developed in-house or outsourced, from what I've seen [quality] remains a big problem."

Development changes often difficult
Although some large enterprises have moved toward lighter-weight, iterative methodologies and Agile development over the past 10 years, Niski said it's hard for large organizations to shift how they develop software.

"Development methodologies are tied to company culture and funding mechanisms," he said. "Despite all the work on TQM [Total Quality Management] and the management fads of the '80s and '90s, many still have a command-and-control, top-down management style. That bubbles into typical enterprise architectural practices and IT governance practices. They keep a grip on the tools that can be used, which is a little bit at odds with a more bottom-up approach that allows teams the flexibility in what procedures they want to use for specific projects."

And commercial ISVs face "the old business model of having to make money on upgrades, and new releases having to lay on features," Niski added.

Cem Kaner, professor of software engineering at Florida Institute of Technology and director of Florida Tech's Center for Software Testing Education & Research, added that there is no great incentive for ISVs to change this model.

"The ultimate driver of much higher-quality software will be public accountability for bad software," Kaner said. "At this point we have extremely little competition in the marketplace, and we have a federal government that has no interest in enforcing the antitrust laws, which means the software industry gets more concentrated and less competitive every year."

But the problem is not just legacy software, Humphrey said. "Today when everyone is online and you can just bang in stuff, they [programmers] are very sloppy. They don't have the personal habits of going back and reading it over. It's what programmers are taught -- bang out code as fast as you can," he said.

But along with time-to-market pressures, organizations face increased exposure through the Internet, as well as new regulatory and compliance issues -- all of which are upping the visibility of quality issues. "The business side is becoming much more aware of quality, so the pressure to improve quality across the organization is increasing," said Arnon Moscona, director of quality management at Borland Software Corp. in Cupertino, Calif. "The majority are still groping for a solution."

Part of the problem, said Humphrey, is that many organizations are focusing on tools. "People think if you want higher-quality stuff, you've got to get better tools," he said.

Rather, the biggest impact on quality comes from the people using the tools and the people they report to, Humphrey and other experts interviewed for this story said. "To get a quality product you need two things: programmers personally committed to producing high-quality stuff and managers and customers that demand high-quality work," Humphrey said.

The time-pressure problem
It's not that individual developers and their C-level executives don't care about quality; rather, organizations face time and scope pressures and so often don't create an environment that actually helps programmers write better code.

Schwaber has trained more than 9,000 programmers in the Scrum methodology and puts them through an exercise in which they are faced with retaining quality but missing a deadline, or dropping quality and hitting the deadline. All but 120 of them, he said, "were willing to drop quality. It's just a knee-jerk habit."

"I think most people care about the quality of their work," Kaner said. "If a decent hiring decision is made, that person probably cares. Does that person believe he can do a good job in this environment is a different question."

For example, Kaner said, his first-year Java programming students are learning test-first programming. "It takes my students longer to write programs test-first than it takes others to write the same program without. As my students get more complicated projects, the habits they're learning today will probably make it possible to write code that is much more reliable and maintainable, but if I run them on a schedule that gives them barely enough time to write and no time to write test code, I haven't enabled them to do it. And as they go through all that extra work, I have to find a way to persuade them that eventually it will be worth their investment."
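Test-first simply means the check is written before the code it exercises. A minimal sketch of the habit Kaner describes, assuming JUnit 4 and a hypothetical StringUtils class (neither appears in the article), might look like this:

```java
// Step 1: write the test first, pinning down the behavior we want.
// Hypothetical example for illustration only.
import org.junit.Test;
import static org.junit.Assert.*;

public class ReverseTest {

    @Test
    public void reversesAWord() {
        assertEquals("olleh", StringUtils.reverse("hello"));
    }

    @Test
    public void handlesEmptyString() {
        assertEquals("", StringUtils.reverse(""));
    }
}

// Step 2: write only enough production code to make the tests pass.
class StringUtils {
    static String reverse(String s) {
        return new StringBuilder(s).reverse().toString();
    }
}
```

The extra work up front is what Kaner is asking his students to accept on faith: the tests stay behind as a safety net when the program later grows more complicated.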

That's why it's important to get executive buy-in for quality improvements, Humphrey said. "At the lower level, people are under enormous pressure; they're not thinking about the overall benefits," he said. "In most organizations, the things you have to do to improve quality cost one group money and save money for other groups." For example, he said, an added effort in development can cut service costs.

The good news
But it's not all bad news. Planes are not falling out of the sky, business is still being conducted, and critical medical systems are still saving lives. "On the engineering side we've made tremendous progress," Kaner said. "We have programs that run millions of lines of code that don't seem any less reliable than programs that used to run thousands of lines of code. They're still annoying, but there's a whole lot more stuff that could go wrong that's not going wrong."

According to 2006 data from The Standish Group, a West Yarmouth, Mass.-based consultancy on IT projects and value performance, 35% of all projects succeeded (were delivered on time, on budget, with required features and functions); 46% were challenged (late, over budget, and/or with less than the required features and functions); and 19% failed (were cancelled prior to completion or delivered and never used).

While 35% may seem low, this is a major uptick in the success rates from the previous study, according to Standish. The low point in the last five periods Standish studied was in 1998, with only 26% of projects succeeding and 28% failing.

The Standish Group study also shows a substantial decrease in both cost and time overruns for 2006. Cost overruns have gone down to 47% from a high of 69% in 1998. Time overruns also have gone down to 72% from a high of 84% in 2004.

Software that's improving
Another bright spot is open source software, said Burton Group's Niski. The quality of open source software "has gotten better and continues to do so," he said. He attributes much of that to "the sheer amount of eyeballs on the software. A lot of people are developing it and using it, so it's got the potential for a worldwide code review."

In addition, he said, open source projects do not face the time/budget pressures that commercial or in-house projects do. "You see bug fixes, but you don't see major releases until the features are well-defined and agreed upon and pass muster. It's not done until it's done," Niski said.

Schwaber said some large commercial ISVs are getting better quality results, too. "They have the biggest reasons to do so," he said. "Microsoft is getting results, Yahoo, Google -- their quality is diminishing less than it was, even large companies like Motorola with their phone OS. They're all working on it, but [the problem] is really entrenched."

Another bright spot Niski is seeing among Burton Group clients is that the successful adoption of service-oriented architecture (SOA) and business process management is leading to more reuse, which is enabling the use of lighter-weight development methodologies.

"Once you've identified an opportunity for reuse, there is often enough guidance to let teams use lighter-weight, less documentation-oriented methodologies -- stripped down versions of unified process or Agile," he said.

The use of Agile methodologies is also helping some organizations improve quality, although they have "a pretty wide range in how many steps in the life cycle, what the recommended docs and artifacts are, how much modeling goes into development vs. refactoring. But [there are] core themes -- the focus on delivering software and the concept of tuning the methodology for characteristics of the projects," Niski said. "Companies using Agile successfully are tuning for the culture and the kinds of projects they're doing."

In addition, Niski said, there are developments in the tooling space that have the potential to improve software quality, such as static code analysis tools that scan code as it's being developed for security flaws, code defects and intellectual property concerns. "But having a tool isn't enough; it has to be deployed in a smart way," he said.
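To make that concrete, here is a hypothetical Java fragment (not taken from any tool or code mentioned in the article) showing two defect classes such analyzers routinely flag, an unclosed resource and an unchecked null, along with the kind of fix they typically suggest:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ConfigLoader {

    // A typical static analysis finding: the reader leaks if readLine()
    // throws, and the return value is dereferenced without a null check.
    static String firstLine(String path) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(path));
        String line = in.readLine();   // returns null on an empty file
        in.close();                    // never reached if readLine() throws
        return line.trim();            // possible NullPointerException
    }

    // The suggested fix: try-with-resources plus a null guard.
    static String firstLineSafe(String path) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line = in.readLine();
            return line == null ? "" : line.trim();
        }
    }
}
```

Analyzers catch this sort of problem mechanically; Niski's caveat is that the findings still have to be acted on by the team, which is where deploying the tool "in a smart way" comes in.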

Niski also said most IDEs come with a unit testing framework, which is "a big quality enhancing tool." Code formatting templates and code templates also have big potential to improve quality, he added. And there are more tools available for doing continuous integration and build, and "a number of commercial tools being built around open source to integrate with version control systems and to do automated testing. That space is heating up a lot right now."

Clearly, even with better tools, there's no simple answer to how to improve software quality, and Kaner points out how broad the challenge is. "Very few people in the quality field define quality as freedom from bugs. A product has quality if it has enough of what makes people want to use it, and doesn't have too much of what makes those same people not want to use it."

Schwaber is hopeful that the market will help drive change, but with a caveat: "I think what will happen is some places will really get it and will be so competitively compelling that others will have to rapidly change or go out of business. As an offset to that, consider that Ford has known for 40 years how Toyota builds cars."
