For years, embedded software testing was a world unto its own. The same manufacturer that designed and built the hardware also wrote and tested the software that ran on the embedded device or system.
Intended for use in the avionics, automotive and medical industries where safety is critical, traditional embedded systems were anything but trivial. But from another standpoint, they were simple. They were closed systems, designed to operate in isolation, with no need to share information with other applications. Testing mattered, of course. But because the company that built the hardware also wrote the software -- and because outside variables were unlikely to enter into the mix -- embedded software testing was a reasonably straightforward, predictable process.
But now embedded devices and systems -- from the fitness trackers on our wrists to the sensors in our cars that tell us the tire pressure is low -- are everywhere. No longer operating in isolation, they are in constant communication with other devices and systems. Fitness trackers wirelessly transmit information, telling apps on our smartphones how many steps we have taken. Dozens of sensors and computers within a single car communicate with one another and with the car's central computer system.
This ubiquity has profound implications for embedded software testing. "The sheer volume of potentially sensitive data stored and communicated by embedded devices has begun to draw the interest of hackers and other criminal entities," said Chris Rommel, an executive vice president at VDC Research in Natick, Massachusetts.
Security is not the only issue QA pros face when confronting embedded software testing for the first time. In this article, Rommel and another expert discuss the ways in which embedded software testing differs from testing Web, mobile and desktop applications, and explain what testers need to know before they take on embedded software testing projects.
Internet of Things: Inherently insecure
The so-called Internet of Things, where objects, people -- even animals -- rely on unique identifiers to transfer data without human intervention, is an inherently insecure place. "In the past, embedded software wasn't impervious to attack; it just didn't have much of an attack surface," said Arthur Hicken, evangelist for Parasoft, a Monrovia, California-based maker of testing tools. "Each device was on its own. But now these devices are all talking to each other." A typical car may have 100 subsystems that communicate with one another, he said. "If enough sensors give off enough data, the airbag will trigger."
Rommel said that a wide range of embedded devices have already been targets of malicious attacks: "An alarming amount of corporate, consumer and personal data has been exposed due to insufficient security precautions." To counteract this risk, he recommends that testers subject embedded software applications to three types of security testing: static analysis, penetration testing and fuzzing.
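Of the three techniques Rommel names, fuzzing is the most mechanical to illustrate: feed a parser large volumes of malformed input and confirm it rejects them gracefully instead of crashing. The sketch below is a minimal, hypothetical example -- `parse_packet` is a toy stand-in for an embedded message parser, not any real protocol -- showing the basic shape of a fuzz loop.

```python
import random

def parse_packet(data: bytes) -> int:
    """Toy stand-in for an embedded message parser (hypothetical).
    First byte declares the payload length; raises on malformed input."""
    if len(data) < 2:
        raise ValueError("packet too short")
    length = data[0]
    payload = data[1:1 + length]
    if len(payload) != length:
        raise ValueError("truncated payload")
    return length

def fuzz(iterations: int = 1000, seed: int = 0) -> int:
    """Feed random byte strings to the parser and count graceful
    rejections. Any *other* exception escaping the try block would
    signal a bug worth investigating."""
    rng = random.Random(seed)  # seeded so failing inputs are reproducible
    rejected = 0
    for _ in range(iterations):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(32)))
        try:
            parse_packet(data)
        except ValueError:
            rejected += 1  # graceful rejection is the desired behavior
    return rejected

print(fuzz())
```

Real fuzzers (coverage-guided tools such as AFL or libFuzzer) mutate inputs far more intelligently, but the principle is the same: hostile input should never produce undefined behavior.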
Embedded software testing: A higher standard
One way that embedded software testing projects differ from other test projects is that they demand a higher level of code coverage. The deciding factor for most software releases is date, not the degree of quality, Parasoft's Hicken said: "When the release date arrives, we're happy with 70% code coverage." Embedded software test projects require code coverage of 90% to 95%. And, of course, if the embedded application is safety-critical, 100% code coverage is needed, he said. "Context is everything."
Why is higher code coverage important in embedded testing? Compared with mobile and desktop applications, embedded applications are painful to update because of their limited user interfaces. For example, adding new maps to a GPS system in a car involves downloading the maps from the Internet onto a USB device, then plugging that USB device into the car's navigation system. "It's easy to patch a mobile or desktop application, and we all do it weekly," Hicken said. "But fixing stuff in embedded is harder and the stakes are higher. You have to get it right the first time."
The ability to update an embedded application like a navigation system is obviously important and is part of what makes it useful. But this very ability is also what makes the application and the systems it talks to vulnerable to attack. Hicken recommends using digital signatures to ensure that code coming in through USB ports is legitimate. "The patch needs to be signed, secure and encrypted," he said. "Many, many embedded applications do no checking at all. They assume that if the patch executes, it is valid." But that is risky because an attacker could insert malicious code through a USB port, which in theory could cause the car to fail.
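The check Hicken describes can be sketched in a few lines. The example below uses a shared-key HMAC purely for illustration; real firmware signing uses asymmetric signatures (e.g. RSA or Ed25519) so the device stores only a public key and cannot be used to forge patches. The key and patch bytes here are placeholders.

```python
import hashlib
import hmac

# Hypothetical shared key for the sketch only; production systems use
# asymmetric signatures so no signing secret lives on the device.
SIGNING_KEY = b"example-key-not-for-production"

def sign_patch(patch: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the patch image."""
    return hmac.new(SIGNING_KEY, patch, hashlib.sha256).digest()

def verify_patch(patch: bytes, tag: bytes) -> bool:
    """Constant-time check that the tag matches before installing.
    A patch that fails this check is refused, never executed."""
    return hmac.compare_digest(sign_patch(patch), tag)

patch = b"\x00firmware-update-image"
tag = sign_patch(patch)
assert verify_patch(patch, tag)            # untampered image accepted
assert not verify_patch(patch + b"\xff", tag)  # tampered image rejected
```

The essential point matches Hicken's: verification happens before execution, so "if the patch executes, it is valid" is replaced by "the patch executes only if it is valid."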
No tolerance for errors
In safety-critical embedded applications, such as a car's braking system or fuel system, the tolerance for software errors is, of course, zero. "What if you're traveling at 70 miles an hour and the fuel system fails? A really bad crash could occur," Hicken said. "What is the cost of killing someone?" Automobile makers know that cost, and now maybe software testers need to know it too, he said.