Glitch author seeks mandated software quality controls

In Part 2 of this SSQ interview with Glitch author Jeff Papows, we learn more about Papows' proposal for an IT Governance Manifesto, which would mandate higher standards of quality for software whose failures can threaten lives. Papows warns of the dangers of not taking quality seriously, saying we could be on the precipice of a digital Pearl Harbor.

Read part one.

In chapter two, you talk about a proposal for an "IT Governance Manifesto" to lobby government to pass legislation mandating higher standards of quality for certain types of software. Will you tell us more about that?
Not that I'm generally advocating more government intervention or control over our lives, because God knows we already have plenty, but to the point of one of your first questions, we're reaching the point where the consequences of some of these "glitches" are not just life-threatening. Technology has so penetrated the fabric of our social and cultural existence that the problems that result are of such serious consequence to both the economy and our lifestyle that the notion we won't ultimately have to mandate a level of quality is probably optimistic. And the more of this that appears in the broadcast news cycle, situations like Toyota where the brand damage is so visible, the more likely it is that the government chooses to get more proactive.

One of my suggestions in the book is that the industry police itself and create a broad fabric of best practices and policy management that would reduce the propensity and the rate and pace of the damage being done by these technology snafus, before the federal government steps in and does it, which ultimately will happen. There's always this disconnect between Internet time, which I tend to think of as a multiple of clock time, and government time, which tends to lag behind clock time, so it's hard for the federal government to move at a rate and pace consistent with the Internet. But I wouldn't put it past our legislators to ultimately feel required to deal with these kinds of Toyota-scale disasters.

Aren't there already a lot of regulations in place for life-or-death types of devices? For example, I know there are a lot of regulations around medical devices.
There are a lot of regulations in place as they relate to manufacturing and whatnot. There are not a lot of regulations in place that relate to the underlying software. Take the example of the Varian glitch that I talk about in the book, where people with fairly minor variants of tongue cancer were being killed because a software glitch was mis-targeting the area of the body that people thought the machines were radiating, and then multiplying the dose in a tragic way. There are no federal regulations that require that particular company, in this case Varian, or competitors like them, to pass any sort of prescribed quality standard, and the fact that that went on in this day and age, in this country, on that scale, for the better part of a year before the software glitch was corrected is ample evidence that there are safeguards that need to be put in place. Ironically, we have a federal agency that mandates quality controls on our food supply, but if you think about the implications of technology gone awry, I think you can logically ask why the same safeguards don't exist. Part of the answer is complexity, but one has simply not caught up with the other yet.

What would you say are the biggest takeaways readers will get from the book?
I think the first thing is, you've heard the expression that we can't see the forest for the trees. We're so immersed in technology, not just those of us inside the industry but consumers as a whole, that we still tend to have this mythical notion that computers are infallible and to think of these things as non-problematic until the headlines say otherwise. So the first thing I'm trying to do is provide a wake-up call to make it clear that we could be on the precipice of a digital Pearl Harbor if we don't take this problem more seriously. The second thing I'd like IT professionals to understand is that the problem, partly because of the ubiquity of technology now, has a scale and magnitude that we're not going to solve by just working harder or throwing more bodies at it. It's going to require the same kind of innovation and automation that fueled the growth of the IT industry as a whole. We can't put the genie back in the bottle and make things less automated; in fact, we have to make them more automated to have the quality controls to avoid these kinds of disasters.
