Dr. Bill Curtis, co-founder of CMM and CMMI
(Interview with Bill Curtis continued from CMMI:
Good process doesn't always lead to good quality)
How has the approach to quality evolved over the years?
Think of the history of software and application development as a series of waves, in terms of how we tried to improve it. Back in the 1970s everybody was focused on languages — software is going to get a lot better because we have these higher-order languages. That did make it better, but it didn't solve all the problems. Then in the '80s we were going to solve all the problems with new design methods and much better tools, so everybody spent lots of money on CASE tools. That certainly helped, but again, it didn't solve the problem. Then we realized the real problem was giving people commitments they can't meet. We realized the process is out of control. So the third wave, in the 1990s, was getting the process under control.
Now the next phase, the phase we're seeing develop in the 2000s, is to focus on the product: really being concerned about the overall quality of the architecture and the way the system is constructed so that we reduce the cost of ownership. And, just as important to the business, that we make the system sufficiently agile that you can make changes to it very rapidly. That's what the business wants. They need to be able to rework a business process rapidly to meet the competition in the market. If you've got a big, clunky system that's not architected well, it's extremely difficult to make enhancements or modifications.

How does this next phase of quality improvement fit with your coming to CAST?
One of the big focuses we have in CAST is to help identify the architectural/quality attributes of a system that make it cheaper to own over its lifecycle and allow you to enhance it, modify it, grow it so you can do business at the speed of the competition.
The position we take at CAST is that [process] is one issue in quality, but there are many others. If the system is extremely complex, it will be expensive to maintain; that's a quality problem. Has the system been architected in ways that will degrade performance under certain workloads? That's a quality issue.
We see quality as the entire architecture and construction of the system: the entire end-to-end system, with all its various components, and how they come together to deliver what the customer wanted. That's the quality challenge. Our technology digs into this to give feedback about the quality a system has and what to expect in terms of performance. We've heard people complain, 'I've had a Level 5 contractor; how can I possibly have defects?' You can have them because that's not a quality standard; it's a process standard.
CAST was exciting to me because the technology really gives feedback to the development team -- about the quality of the work they're doing, about how the decisions they're making may have an impact on other parts of the system, and about where things are too complex or may hurt performance. I see that as the next wave in trying to help the business take full advantage of its investment in IT. With a better-architected system you can support a more agile business, because you can modify the system so much faster to meet business challenges.
The real issue is, let's look at how this house got built to make sure it can withstand the tornado. It's like doing architectural analysis on a physical structure: what kinds of stresses can it withstand?

What are your plans to work toward a software quality standard?
We have some international standards that provide definitions for quality metrics that look at various aspects of a system, and we have ways to agree on how we count and evaluate the attributes of systems. But there's really no standard that says, "If it's beyond this number, you've got a problem." We don't have a quality standard to certify or evaluate software against. One of the things I want to do is bring together some leading experts from around the world to pull that knowledge together and set up some structure. It could be a standard -- it could be a number of things -- that would represent something you could benchmark or certify software against, to say this piece of software has the attributes of a well-constructed piece of software based on the following principles.
We're just getting started. We'll start assessing the interest of folks I know around the world who have opinions and data. And we've got to find a neutral body that's an appropriate place to house such a publicly available standard or benchmark.

How would you gauge the progress the industry has made in terms of quality?
There's no question we're better off. There's more known about software engineering and how to do good testing. There are better tools out there. But the problem is, as we get better the demands on the system are greater, the requirements get more extensive, the size of the systems gets larger, and more and more systems are hooked together and interacting with each other. So the potential for defects that we've never even conceived of is much greater. Every time we make a gain there's some kind of challenge that swallows up some of that gain.
You could look at that and say we haven't made much progress, but that's not fair. I think we've made a lot of progress, but we've also got a lot of challenges that we didn't have years ago. Now we've got these mega-million-line systems that are interacting with these other mega-million-line systems. Think of the problems a company like Microsoft has trying to test an operating system that will support software that they don't even know is going to be out there yet, but they're going to have to be able to support it. The same thing with telecommunications. You can build something and have no idea all the stuff it has to interact with, and that's just growing constantly. In application development the technology is turning over so fast, and the supply of people who know this stuff is not good. So there are a lot more challenges, but as a field we've made dramatic progress.
Dr. Bill Curtis, a globally recognized expert in software process and quality, recently joined CAST, an automated application intelligence vendor, as senior vice president and chief scientist. Curtis co-authored the Capability Maturity Model (CMM), the People CMM, and the Business Process Maturity Model. He was co-founder and chief scientist of TeraQuest, a provider of CMM-based services acquired by Borland. And he is a former director of the Software Process Program in the Software Engineering Institute (SEI) at Carnegie Mellon University.