Embedded software test: Attack of the killer robots

Embedded software can be found in all devices from planes to pacemakers, but how do we test this kind of software? What are the differences between testing embedded software and traditional application software? In this tip, site editor Yvette Francino talks about a presentation and hands-on challenge given by embedded software quality guru Jon Hagar at a recent SQuAD (Software Quality Association of Denver) meeting.

What is embedded software? What differences or considerations do we have to take into account when we’re testing embedded software rather than traditional application software? Those were some of the questions that Jon Hagar answered at a recent SQuAD (Software Quality Association of Denver) meeting. And to demonstrate his points, he allowed for some fun by giving the group a hands-on “attack” assignment used to test embedded software... within a robot.

What is embedded software?

Hagar started his presentation with some definitions. He defined “embedded software systems” as those that “interact with unique hardware/systems to solve specialized problems interacting with and/or controlling the ‘real world.’”

The difference between traditional IT software and embedded software is that embedded software usually runs on a specific device rather than on “generic” hardware. Because of this, there are often significant hardware interface issues, including problems with initialization, noise, power-up, power-down, timers and sensors. Resource constraints such as RAM, ROM, stack, power, speed and time must also be taken into account. Typically, there is no “human” user interface (UI), and the software may be more difficult to update or change. Embedded software can also involve risks, hazards, safety concerns, specialized domain knowledge, and logic or algorithms that control hardware.
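
To make the resource-constraint point concrete, here is a minimal sketch of the kind of check a tester might attack at its boundary. The queue_push() routine and the QUEUE_DEPTH budget are invented for illustration and are not taken from Hagar’s material.

    /* Hypothetical sketch: attack a fixed-size message queue at and just
     * past its capacity. queue_push() and QUEUE_DEPTH are invented names. */
    #include <stdio.h>
    #include <stdbool.h>

    #define QUEUE_DEPTH 8                /* assumed RAM budget: 8 pending messages */

    static int queue[QUEUE_DEPTH];
    static unsigned count = 0;

    /* Refuses the message, rather than overrunning memory, when full. */
    static bool queue_push(int msg) {
        if (count >= QUEUE_DEPTH)
            return false;
        queue[count++] = msg;
        return true;
    }

    int main(void) {
        /* Boundary attack: the (QUEUE_DEPTH + 1)th push must be rejected. */
        for (unsigned i = 0; i < QUEUE_DEPTH + 1; i++)
            printf("push %u -> %s\n", i + 1,
                   queue_push((int)i) ? "accepted" : "rejected (queue full)");
        return 0;
    }

Running it shows eight accepted pushes and a ninth rejected; on a real target, the interesting question is what the device does when that limit is crossed.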

Where do we find embedded devices? Everywhere! Hagar’s presentation listed the following categories:

  • Avionic systems: planes, cars, rockets, military
  • Telecom: switch, routers, phones, cell devices
  • Transportation: traffic control, railroad, trucking
  • Industrial control: lighting, machines, HVAC
  • Medical: pacemakers, dispensers
  • Home and office systems: control, entertainment, TV
  • Smart devices and mobile applications

This is not intended to be a comprehensive list. More uses for embedded software are cropping up continually. Basically, any device that can house a chip is a candidate for embedded software.

Testing embedded software using “attacks”

Hagar described the testing of embedded software as creating “attacks.” He said that embedded system testing included “the process of attempting to demonstrate that a system (hardware, software and operations) does not meet requirements, nor functional and non-functional objectives.”

An “attack” is used to look for common modes of failure and bugs which would show that requirements were not met. He described defects as the “enemy” which we can defeat using “tools, levels, attacks, and techniques.”

Attacks used by software testers are those that are learned by understanding common modes of failure. Through experience with a particular domain, testers begin to know target areas that are prone to failure and can use these to plan their attacks.

With embedded software, two areas that need to be attacked are timing and the use and control of the unique hardware on which the software is running.
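
As a rough illustration of a timing attack, the sketch below measures whether a single control step finishes inside an assumed 1 ms deadline. The control_step() workload and DEADLINE_US value are invented stand-ins; a real attack would drive the actual control loop on its target hardware.

    /* Hypothetical timing attack: check one control cycle against a deadline. */
    #include <stdio.h>
    #include <time.h>

    #define DEADLINE_US 1000.0           /* assumed spec: 1 ms per control cycle */

    /* Stand-in workload; a real attack would exercise the real control code. */
    static void control_step(void) {
        volatile long x = 0;
        for (long i = 0; i < 200000; i++)
            x += i;
    }

    int main(void) {
        clock_t t0 = clock();
        control_step();
        clock_t t1 = clock();
        double elapsed_us = (double)(t1 - t0) * 1000000.0 / CLOCKS_PER_SEC;
        printf("cycle took %.0f us (%s the %.0f us deadline)\n", elapsed_us,
               elapsed_us <= DEADLINE_US ? "within" : "MISSED", DEADLINE_US);
        return 0;
    }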

Of course, whether you are working with traditional software or embedded software, there are common attacks that can be applied. James Whittaker’s book, “How to Break Software,” describes 23 attacks, four of which Hagar suggests starting with for embedded software (the second is sketched after the list):

  1. User interface attacks
  2. Data and computation attacks
  3. File system interface attacks
  4. Software/OS interface attacks
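
For instance, a data and computation attack probes a calculation with legal boundary values and illegal values just outside them. The sketch below does this against a hypothetical ADC-to-percentage routine; scale_percent() and its 10-bit input range are invented for illustration.

    /* Hypothetical data and computation attack on a scaling routine that
     * lacks a range check; scale_percent() is an invented example target. */
    #include <stdio.h>

    /* Converts a raw 10-bit reading (0..1023) to a percentage. */
    static int scale_percent(int raw) {
        return raw * 100 / 1023;         /* note: no range check on raw */
    }

    int main(void) {
        /* Probe the legal boundaries (0, 1023) and illegal neighbors (-1, 1024). */
        int probes[] = { 0, 1, 1022, 1023, -1, 1024 };
        for (unsigned i = 0; i < sizeof probes / sizeof probes[0]; i++)
            printf("raw=%5d -> %d%%\n", probes[i], scale_percent(probes[i]));
        return 0;
    }

The out-of-range readings are silently accepted and scaled, which is exactly the kind of finding this attack is designed to surface.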

Additionally, Hagar suggested categorizing embedded attacks into the following “divisions”: developer implementation level, time, hardware, operating system, software, data, UI/GUI, combinations and applications.

The challenge: Attack the robot

Hagar finished his presentation by giving the group some hands-on experience with creating attacks for a robot which ran on embedded software.

He first told us the requirements, or what he referred to as the “laws and flow” for the robot. It needed to travel in a straight line (+/-5%), move 10 feet, and sound an alarm before moving.
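
A simple way to turn the straight-line requirement into a pass/fail check is to compare measured lateral drift against 5% of the distance traveled. The sketch below does this with made-up sample measurements; the figures illustrate the check and are not readings from the session.

    /* Hypothetical check of the straight-line law: lateral drift must stay
     * within +/-5% of the distance traveled over the 10-foot run. The sample
     * measurements are made up for illustration.                             */
    #include <stdio.h>
    #include <math.h>

    #define DRIFT_TOLERANCE 0.05         /* +/-5% of distance traveled */

    int main(void) {
        /* { distance traveled, lateral drift } pairs in feet -- invented data. */
        double samples[][2] = { {2.5, 0.05}, {5.0, 0.20}, {7.5, 0.45}, {10.0, 0.70} };
        for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++) {
            double traveled = samples[i][0];
            double drift    = fabs(samples[i][1]);
            double limit    = traveled * DRIFT_TOLERANCE;
            printf("at %4.1f ft: drift %.2f ft, limit %.2f ft -> %s\n",
                   traveled, drift, limit, drift <= limit ? "PASS" : "FAIL");
        }
        return 0;
    }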

The group was tasked with defining data ranges, designing an attack, remembering to test boundary conditions, filling out a charter, running the attack, analyzing the ‘laws’ for stability and data problems, repeating as necessary, and finally, holding a retrospective.

The charter was a document that asked what we were testing, what our success criteria were and what support items were needed, and it provided space to document our steps and results. Results were to include any bugs, observations, lessons learned, positives, issues, concerns or risks.

Hagar warned us to take environmental factors into account. Unlike software running on general-purpose computers, the robot required us to consider wheel alignment, flooring and other external conditions that might influence its ability to operate. We needed tools and resources to be able to measure results. Our test efforts could no longer be simply “programmed” and automated; we were working with a combination of hardware and software.

The retrospective allowed us to discuss our observations and findings. The team brainstormed about the attack and how well it worked to test the system. My personal observation was that I would rather have seen the software than work with a “black-box” system (one in which the code is not exposed). I would have liked to design tests that exercised the various code paths, and without visibility into the code, I didn’t feel satisfied that we were thoroughly testing the system. However, the little testing we did showed that the robot veered away from a straight line well before the required 10 feet.

Conclusions

Testing embedded software, like any software, requires a plan of attack. By understanding the problem areas of embedded software, such as timing and hardware interface issues, testers will be better prepared to create the attacks that defeat the “enemy,” helping to rid your embedded software of bugs and keep your devices operating with high quality.

This was first published in March 2011