Do you have a sense of where developers are, in their natural state, in terms of security?
There are two major buckets, but on the whole, developers in the industry need a bit of help. The actual level of security expertise in the marketplace is abysmally low. I was talking with some academics recently about this -- we hire really bright people out of school, and basically they don't know how to build secure software. And it's really scary. With that said, we recognize that, and that's why we have the SDL [Security Development Lifecycle] process. It is really an education. We have to fill that gap.
I can sum it up with a story. When we had the first major round of security education, we had vice presidents kick it off to show how important it was. One was Rob Short. And he made a comment that has stuck with me for years now. He said: "There is nothing special about security. It's no different than reliability. It is no different than performance. And we have to get beyond the point where we consider it something only the high priests understand." It's part of getting the job done. It's part of the process.
How crucial are tools in this regard?
We use a heck of a lot of tools. I love tools, and I am also very wary of tools. I have heard people say the industry will get better when we have better tools. I think that is folly. The industry will get better when we have more developers and engineers in general who understand this stuff. There is no replacement for human insight here.
But for me, tools serve three purposes.
One, they help scale, which humans don't do very well. If you have a million lines of code, you can scan that really quickly with a tool. Let's be honest: Reviewing code is slow and tedious, and it's mind-numbingly boring.
Two, tools help preserve policy. With Vista [development] we implemented the Security Quality Gate. [This can, for example, perform static analysis, search for unmanaged code, and verify certain string buffers.] It's a bunch of tools that run automatically when you check the code in.
Three, tools help enforce policy. The tools look for the bugs, but really what the tools represent is the policies. They help you understand a new bug type and how prevalent that bug might be in the code.
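[Editor's note: the idea of a check-in gate can be sketched in a few lines. This is only an illustrative sketch -- the banned-function list and the function names here are hypothetical, not Microsoft's actual Security Quality Gate tooling -- but it shows the shape of a tool that scans checked-in code and blocks the check-in on a policy violation, in this case a call to an unsafe C string function:]

```python
import re

# Hypothetical banned-API list, in the spirit of the SDL's banned C runtime
# functions; a real gate runs many more checks (static analysis, unmanaged
# code detection, and so on) than this single pattern scan.
BANNED = ("strcpy", "strcat", "sprintf", "gets")

_PATTERN = re.compile(r"\b(" + "|".join(BANNED) + r")\s*\(")

def scan_source(text):
    """Return (function_name, line_number) pairs for banned API calls."""
    return [(m.group(1), lineno)
            for lineno, line in enumerate(text.splitlines(), start=1)
            for m in _PATTERN.finditer(line)]

def check_in_allowed(text):
    """The gate runs at check-in time and rejects code with any hits."""
    return len(scan_source(text)) == 0
```

Because the gate runs automatically on every check-in, the policy is applied uniformly instead of depending on each developer remembering it.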
So, how do you ensure your process is tight with security?
There are two golden rules. Number one, admit you have a problem, which I don't see many people doing. I see a lot of security bugs every day across the industry. Number two, get the senior executives to agree to do this, because if you don't get the senior execs to sign off, you are never going to make real progress. If you can't get over those two hurdles, you are never really going to make any headway.
You've talked about developers. I was wondering about the architect -- the person setting the policy.
Well, they still have to get past admitting there is a problem. And that can be a real bitter pill to swallow. How we started here was that a small group of architect types recognized we had to do this. But we really made our big strides when we got Bill Gates to agree to it.
We've done a lot of engineering around improving the design of the code. There is also a bunch of defenses we've added to the operating system. But we have to get everyone writing applications to adopt those defenses or the application will be a weak link.
How do you get going in reviewing your present designs?
If you are new to this, I would do a minimum of four things, with the caveat that everything in the SDL is there for a reason; it's not just there because we thought it was a cool idea. With that said, if I was getting going with security stuff, here is what I would do:
Number 1 -- Education. You have to be sure everyone understands the basics.
Number 2 -- I would adopt some kind of analysis tool. I would not rely on it exclusively, but it is a good indicator of where trouble spots can be.
Number 3 -- Threat modeling. You have to decompose your application to understand where potential threats are.
Number 4 -- Fuzz testing. It's all about building malformed data and seeing how apps respond to it. If you have never done it before, you will find bugs. Absolutely.
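[Editor's note: conceptually, a minimal mutation fuzzer is just a loop that corrupts a valid sample and watches for unhandled failures. The sketch below is illustrative -- the names and the toy length-prefixed parser are invented for this example, not Microsoft's fuzzing tooling -- but it demonstrates the "build malformed data and see how the app responds" idea:]

```python
import random

def mutate(data, n_flips=4, rng=random):
    """Randomly corrupt a few bytes of a valid input sample."""
    buf = bytearray(data)
    for _ in range(n_flips):
        pos = rng.randrange(len(buf))
        buf[pos] = rng.randrange(256)
    return bytes(buf)

def fuzz(parse, seed_input, iterations=1000, rng=None):
    """Feed mutated inputs to `parse` and collect the inputs that crash it."""
    rng = rng or random.Random(0)  # fixed seed so failures are reproducible
    crashers = []
    for _ in range(iterations):
        sample = mutate(seed_input, rng=rng)
        try:
            parse(sample)
        except Exception:
            crashers.append(sample)  # save the input for later debugging
    return crashers

# Toy target: a length-prefixed record parser with a latent bug -- it trusts
# the declared length field without validating it against the payload size.
def toy_parse(blob):
    length = blob[0]
    payload = blob[1:1 + length]
    if len(payload) != length:
        raise ValueError("truncated record")  # fuzzing hits this quickly
    return payload
```

Running `fuzz(toy_parse, b"\x05hello")` quickly turns up inputs whose mutated length byte no longer matches the payload, exactly the kind of bug Howard says first-time fuzzers always find.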
Any last words?
One thing I have noticed is that more people are asking about the security stuff. This needs to be part of everyone's applications. If you build software and hook it up to the Internet, you will get whacked. It is a hostile environment out there.
Michael Howard, senior security program manager at Microsoft, is one of the individuals charged with spearheading the company's multiyear effort to become a paramount example of secure software development. He is coauthor of Writing Secure Code, Second Edition [Microsoft Press, 2002], 19 Deadly Sins of Software Security [McGraw-Hill Osborne Media, 2005] and Writing Secure Code for Windows Vista [Microsoft Press, 2007]. He blogs most eloquently on Michael Howard's Web Log, which carries the subtitle: "A Simple Software Security Guy at Microsoft!"