How much foresight must engineers have? At what point does a threat become absurdly remote? The questions arise as I look at an item that recently crossed my desk, one that offers a view of a future in which application security will endlessly enter uncharted territory.
"Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses" describes a study that University of Washington and University of Massachusetts researchers undertook to measure the security and privacy properties of implantable defibrillators that support radio-based reprogramming.
The story got some coverage. Why not? A new generation of pacemakers that can be reprogrammed without invasive surgery is found to be a ticking time bomb.
Open source software played a role. The intrepid researchers reverse-engineered a radio-programmer's transmissions and built an eavesdropper using the Universal Software Radio Peripheral (USRP). They added GNU Radio libraries to capture and store received signals.
The researchers were able to intercept and analyze heart data, among other information, and they were able to launch attacks on the medical devices. Don't worry; the attacks were launched against devices embedded in bags of hamburger and bacon, not in living humans.
As a friend of mine pointed out, encryption should help protect the enhanced heart system against radio-based attacks. Memory and CPU costs being what they are these days, that should not be a big deal.
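The study itself does not prescribe a fix, but my friend's point can be sketched in a few lines. The following is a minimal illustration, not the researchers' design, assuming a hypothetical shared key provisioned at implant time: the programmer attaches an HMAC-SHA256 tag to each reprogramming command, and the device rejects any command whose tag fails to verify. The whole thing fits comfortably in the memory and CPU budget of a modern embedded processor.

```python
import hmac
import hashlib
import secrets

# Hypothetical example: key provisioned out of band at implant time.
SHARED_KEY = secrets.token_bytes(32)

TAG_LEN = 32  # SHA-256 digest length in bytes


def sign_command(key: bytes, command: bytes) -> bytes:
    """Programmer side: append an HMAC tag to a command."""
    tag = hmac.new(key, command, hashlib.sha256).digest()
    return command + tag


def verify_command(key: bytes, message: bytes):
    """Device side: return the command if the tag verifies, else None."""
    command, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(key, command, hashlib.sha256).digest()
    # compare_digest guards against timing side channels.
    return command if hmac.compare_digest(tag, expected) else None


# A legitimately signed command verifies; a forged one is rejected.
wire = sign_command(SHARED_KEY, b"SET_RATE:70")
assert verify_command(SHARED_KEY, wire) == b"SET_RATE:70"
assert verify_command(SHARED_KEY, b"SET_RATE:200" + b"\x00" * TAG_LEN) is None
```

A real design would also need replay protection (counters or nonces) and a story for key management and battery cost, but the sketch shows why "radio reprogramming" and "unauthenticated reprogramming" need not be the same thing.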
Software engineering is still a young profession. I don't know of any case in which a software engineer has served time for dereliction of duty, as I assume has happened in more mature professions such as civil engineering, where the law sets some of the project requirements.
Still, embedded system engineers have been held to high standards for a long time, and the challenges of building safe computer-centered systems are not alien to them. Like civil engineers, by and large, they hold safety as the overriding system requirement.
The question of how much engineering foresight is reasonable was raised in the aftermath of the Sept. 11, 2001, airplane attacks on the World Trade Center in New York. The topic is not without controversy. Fire and heat raged through two modern buildings and caused catastrophic collapse in a way few had anticipated.
But my feeling is that the designers of the Twin Towers showed a lot of foresight. They worried about plane crashes; the towers were built to withstand the impact of a jet airliner. Still, it seems fair to me to say the buildings' architects should not have had to anticipate the impact of fully fueled, cross-country-bound jets with terrorists at the controls. Today's architects must, however, design to avert such catastrophe.
Like humankind, application security engineering will continue to evolve. Is the typical pacemaker recipient likely to be a target of a GNU Radio hacker? Probably not. If not, how much consideration does such a possibility warrant? How much should software engineers worry? What do you think?
You can send your comments to me at Editor@SearchSoftwareQuality.com.