The expression "everything old is new again" could apply to security vulnerabilities in the world of Web 2.0 technologies and service-oriented architecture (SOA). According to a number of security experts, Web services and Ajax applications have not given rise to new classes of security vulnerabilities, but rather new ways to attack applications and a larger attack surface, creating challenges for both developers and testers.
"Look at Web 2.0 technologies, like Ajax, Web services, news feeds -- we're not seeing those technologies themselves create new classifications of vulnerabilities. It's the same old cross-site scripting, etc., but there are entirely new ways to attack," said Michael Sutton, security evangelist at SPI Dynamics in Atlanta.
Web 2.0 technologies "greatly increase the attack surface of the application," he said. "Historically there were only so many inputs to an application. Now there are all these input vectors."
On top of that, the newness of these technologies is causing some development organizations to forget what they know about security, said Brian Chess, chief scientist at Fortify Software in Palo Alto, Calif. For example, he said, while SQL injection vulnerabilities are decreasing in commercial-grade Web code "because they're pretty easy to prevent, I see more in SOA/Web services code because people haven't woken up to the problem yet."
Since Web services are newer, people are less familiar with what can go wrong, Chess said, but it also "seems to cause people to forget all the lessons they learned. They think if it isn't a Web browser on the other end, it must be friendly. You don't see that basic mistake in a traditional Web application. It really goes back to all the same stuff we had in Web 1.0 -- the number one mistake is trusting the input they receive, which turns into a SQL injection, buffer overflow, cross-site scripting, etc."
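The "number one mistake" Chess describes can be sketched in a few lines. This is a minimal illustration, not any vendor's code: a hypothetical service handler that splices caller input into SQL, next to the parameterized version that treats the same input as data.

```python
import sqlite3

def lookup_user_unsafe(conn, username):
    # VULNERABLE: assumes the caller is "friendly" and splices the
    # input directly into the SQL string.
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % username).fetchall()

def lookup_user_safe(conn, username):
    # SAFE: a parameterized query binds the input as a value, so it can
    # never be interpreted as SQL syntax.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

payload = "' OR '1'='1"  # classic injection string
print(lookup_user_unsafe(conn, payload))  # matches every row: [(1,)]
print(lookup_user_safe(conn, payload))    # matches nothing: []
```

The same payload that dumps the whole table through the unsafe path returns nothing through the parameterized one, which is why the fix is "pretty easy" once developers remember the lesson applies to service endpoints too.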
From a developer's point of view, Chess said, "If you're looking at code, it's just code." So in terms of using a source code analysis tool such as Fortify's, very little changes, he said.
"The tools have to know about libraries and APIs; those are different in Web services applications, so it wouldn't be fair to say you do things exactly the same, but it's more like supporting a new interface vs. a new programming language. So there are small changes, not big ones," Chess said.
Who owns the code?
Another issue in Web services applications is who "owns" the code. It may include components that belong to a business partner or third-party source. And therein lies a potential problem.
"You can't scan what you don't own," Chess said. "It's a question of doing good input validation, even with someone else's component. Even though you partnered with them, you still have to assume that one day they may be the vector the attack is taking to get to you. More often than not, developers are painfully aware they write bugs and their code is imperfect, but a lot of times they assume the code somebody else has written is perfect, and in reality it's just as fallible."
Both Chess and Sutton agree that even in a Web 2.0 world, at the simplest level good application security boils down to input validation.
"The whole concept behind SOA is loosely coupled, where many entities are talking to one another," Sutton said. "Say I'm drawing a news feed from a third-party site -- it's still untrusted content unless you make it trusted. What if that third party was compromised? So you have to validate input everywhere, even from third parties. In a traditional application you own every piece, so you don't have that challenge."
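Sutton's point about treating a partner's feed as untrusted comes down to encoding or validating its content before use. A minimal sketch (the feed-rendering function is hypothetical): escape third-party text before embedding it in a page, so markup from a compromised partner renders as inert text instead of executing as cross-site scripting.

```python
import html

def render_feed_item(title):
    # Treat feed content from a third party as untrusted: HTML-escape
    # it before embedding, so any injected markup is neutralized.
    return "<li>%s</li>" % html.escape(title)

# A feed entry from a compromised partner carrying a script payload:
hostile = '<script>alert("xss")</script>'
print(render_feed_item(hostile))
# <li>&lt;script&gt;alert(&quot;xss&quot;)&lt;/script&gt;</li>
```

The escaped output still displays the original text to the reader; it just can no longer run as script in the consuming application.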
Challenges for software testers
While there may not be any "new" security vulnerabilities per se, "it's a totally different world," Sutton said, particularly for testers. "If the tools and people don't understand the architecture, they're not going to be able to test it. From a QA perspective, for example, the first challenge is there's no UI."
Sutton said the concept of inter-application testing in a loosely coupled architecture also raises testing challenges. "The whole beast of the application exists in many locations -- where does mine end and others begin? Can I adequately test this? Maybe one piece of the architecture has no vulnerabilities, but maybe small weaknesses in two different places make one big weakness."
And situations will eventually arise where organizations are going to need to test applications in conjunction with partners, "which opens a new ball of wax for the QA tester," Sutton added.
Legacy applications also pose a potential security challenge for testers, Sutton said. "Typically SOA involves not starting from scratch; typically you take some legacy application and layer an interface over it so you can interface with it in real time over the Web. That brings another challenge -- Web vulnerabilities like cross-site scripting and SQL injection weren't even worried about when that [legacy] application was built."
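One common mitigation for the legacy scenario Sutton describes is to validate at the new web-facing wrapper, since the legacy code underneath predates these attacks and does no checking of its own. A hedged sketch, with a hypothetical account-ID format and a stand-in for the legacy routine:

```python
import re

# Hypothetical legacy identifier format: two letters, six digits.
ACCOUNT_ID = re.compile(r"^[A-Z]{2}\d{6}$")

def legacy_lookup(account_id):
    # Stand-in for a pre-web routine that performs no input checking.
    return {"account": account_id}

def web_lookup(account_id):
    # The wrapper enforces a strict allowlist before anything reaches
    # the legacy layer, rejecting injection and script payloads that
    # the old code was never built to expect.
    if not ACCOUNT_ID.fullmatch(account_id):
        raise ValueError("rejected malformed account id")
    return legacy_lookup(account_id)

print(web_lookup("AB123456"))  # well-formed input passes through
```

Allowlist validation at the boundary is usually safer than trying to retrofit checks into the legacy code itself, which may be impractical to modify or even to read.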
To meet the new testing challenges created by Web 2.0 technologies, SPI Dynamics rearchitected its product line to have more intelligence to see the entire application. SPI's Web application scanning tool, WebInspect 7, for example, now features testing innovations such as simultaneous crawl and audit (SCA) and concurrent application scanning.
WhiteHat Security, a vulnerability assessment and management service for Web sites based in Santa Clara, Calif., also adjusted its service and methodology to support Web 2.0, specifically Ajax and Flash applications.
WhiteHat Chief Technology Officer Jeremiah Grossman said there has not yet been enough demand to support the assessment of Web services. He said very few Web sites have publicly deployed Web services, and those organizations that have deployed Web services internally or in a business-to-business application consider the risk of attack lower than on a public Web site.