In 2010, the Stuxnet virus gained much attention. To date, Stuxnet has been one of the most complex, advanced and destructive computer viruses ever seen. It bypassed traditional security controls including network segregation, code signing and anti-virus software. Stuxnet also carries political connotations; in this article, we will set the politics aside and focus on four lessons testers can learn from the virus.
It’s impossible to describe Stuxnet fully in a short article, but a high-level background is necessary in order to understand the attack. Stuxnet spread via social engineering as well as by exploiting numerous Windows vulnerabilities, both known flaws and unknown zero-days. The virus performed its task by using stolen certificates and modified DLLs, and it evaded detection with many of the same techniques.
The Stuxnet virus teaches testers the importance of physical security, up-to-date patching (and test strategies to minimize the expense of patch compatibility testing), secure design including the “defense in-depth” layered approach to security and secure implementation.
The first lesson of Stuxnet is the importance of physical security. Stuxnet modified various trusted binaries on target machines in order to spread itself and to falsify readings from sensitive instrumentation. Modification of these DLLs was possible because the attackers physically stole digital certificates, which they used to sign the malicious code so that it appeared trusted.
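Tampering with trusted binaries can often be caught by comparing a file against a known-good fingerprint recorded at build time. The following is a minimal sketch of that idea; the file contents and function names are illustrative assumptions, not details from the Stuxnet case.

```python
import hashlib


def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()


def verify_binary(data: bytes, expected_hash: str) -> bool:
    """Compare a binary's hash against a known-good value recorded at build time."""
    return sha256_of(data) == expected_hash


# Record the fingerprint of a trusted build, then detect tampering later.
trusted = b"original DLL contents"
baseline = sha256_of(trusted)

assert verify_binary(trusted, baseline)                      # untouched binary passes
assert not verify_binary(b"patched by attacker", baseline)   # modified binary fails
```

A hash check like this only helps if the baseline itself is protected; stolen signing certificates defeat signature checks for exactly the same reason.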
Testers also need to understand and advocate for the importance of patching. Stuxnet spread in part by exploiting known OS vulnerabilities, including the vulnerability made famous by the Conficker virus. That vulnerability had been patched months earlier, but many machines remained unpatched. Organizations are often reluctant to accept OS-level patches because they worry about incompatibilities. Testers need to develop and implement test strategies which reduce the expense of compatibility testing, removing organizational barriers to continuous patching. They need to be advocates for customers’ need to operate in up-to-date, patched environments, and hold the line for quality.
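One way to reduce the cost of patch compatibility testing is a small, fast smoke suite run after every patch, instead of a full regression pass. The sketch below assumes two hypothetical checks; in practice each would exercise a real application function.

```python
def check_app_starts() -> bool:
    # Hypothetical: launch the application and confirm the process comes up.
    return True


def check_network_reachable() -> bool:
    # Hypothetical: confirm a known service endpoint still answers.
    return True


SMOKE_CHECKS = [check_app_starts, check_network_reachable]


def run_smoke_suite() -> dict:
    """Run all checks; report results so a patch can be accepted or rolled back."""
    results = {check.__name__: check() for check in SMOKE_CHECKS}
    results["patch_accepted"] = all(results.values())
    return results
```

The point is not the checks themselves but the workflow: a cheap, repeatable gate makes it easier for an organization to say yes to every patch.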
An underlying principle of secure software is the concept of security via layers. Stuxnet is an example of what happens when security is not layered. The virus was introduced to victim organizations via a number of unprotected layers. The initial distribution is assumed to have been accomplished via physical USB devices which were scattered around laboratories. As these infected devices were plugged in, unsecured operating systems granted users too much privilege, ignored known security vulnerabilities, and supported the virus’ spread on internal networks. The lack of appropriate data loss prevention technologies allowed the virus to jump across a network “moat” (a logical separation between outward-facing networks and a more secure internal network where lab equipment resided). The virus spread further due to insufficient patching strategies.
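The layered approach described above can be sketched as a series of independent admission checks, where a file must pass every layer before it runs. The layer names and rules here are invented for illustration; they stand in for real controls such as removable-media policy, signature validation, and least privilege.

```python
def layer_media_policy(source: str) -> bool:
    # Layer 1: block executables arriving from removable media.
    return source != "usb"


def layer_signature(signed: bool) -> bool:
    # Layer 2: require a valid digital signature.
    return signed


def layer_least_privilege(user_is_admin: bool) -> bool:
    # Layer 3: require explicit administrative action to install software.
    return user_is_admin


def admit(source: str, signed: bool, user_is_admin: bool) -> bool:
    """Every layer must pass; one failed layer is enough to block."""
    return all([
        layer_media_policy(source),
        layer_signature(signed),
        layer_least_privilege(user_is_admin),
    ])


# A signed package from the network, installed by an admin, passes:
assert admit("network", signed=True, user_is_admin=True)
# The same signed package arriving on USB is still blocked by layer 1:
assert not admit("usb", signed=True, user_is_admin=True)
```

Because each layer is evaluated independently, defeating one control (a stolen certificate, say) does not automatically defeat the others.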
At each layer in the virus’ lifecycle, appropriate security could have prevented its spread. Many organizations are satisfied with what I call “single layer security strategy” where, once an initial mitigation is implemented, no further work is performed. This single layer approach simplifies the job of an attacker – once the first layer has been defeated, subsequent layers are easily navigated. Testers have a responsibility to lobby for a layered security approach, thereby helping to assure confidentiality, integrity and availability of data and systems produced by their organizations.
Finally, testers need to lobby for and ensure the secure implementation of application code and features. One of the most damaging aspects of Stuxnet was how it exploited a known vulnerability in Siemens’ “Step 7” software. The software contains functionality allowing it to execute scripts when a file is opened. A similar auto-execution vulnerability existed in Microsoft Office products and was patched in the late 1990s, but Siemens failed to learn from Microsoft’s mistakes. This scripting capability was exploited to deliver the virus and execute attacks. Siemens’ software was implemented insecurely.
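A secure implementation of such a feature refuses to execute embedded scripts by default when a file is opened. The sketch below is a hypothetical check; the file format and script marker are invented for illustration and are not how Step 7 project files actually work.

```python
# Hypothetical marker for embedded script content in a project file format.
SCRIPT_MARKER = b"<script>"


def contains_embedded_script(file_bytes: bytes) -> bool:
    """Flag files that carry executable script content."""
    return SCRIPT_MARKER in file_bytes


def open_project(file_bytes: bytes) -> str:
    """Secure default: load data only; never run scripts on open."""
    if contains_embedded_script(file_bytes):
        return "opened (script blocked)"
    return "opened"


assert open_project(b"plain project data") == "opened"
assert open_project(b"data <script>payload</script>") == "opened (script blocked)"
```

The design choice being illustrated is “secure by default”: executing file-embedded code should require an explicit, informed opt-in, never happen silently on open.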
Software testers are the voice for quality, and thereby the voice for security, in an engineering organization. It is partly their responsibility to drive the conversation around secure implementation, reviewing features, attack surface area and code to ensure common security errors are not included in their company’s software. Numerous resources exist to help organizations learn from the past and avoid others’ mistakes (most prominent among these is the Open Web Application Security Project, or OWASP, found at http://www.owasp.org). By following the strategies and lessons learned from such resources, testers can contribute to and help lead the efforts to ensure secure implementations.
By doing their part to protect their company and its software from physical threats, to ensure software is properly patched, and to advocate for secure design and secure implementation, testers contribute to a company’s security stance. Stuxnet is an effective example to learn from and to use in convincing management and team members of the importance of these security lessons.
This was first published in September 2011