Developers Held Liable for Software Bugs?

Hold software developers liable for security defects in their products!

Well, that’s what former White House and Microsoft security advisor Howard Schmidt says, anyway.

For sure, I’ve seen so many buffer overflow bugs that resulted in the remote execution of arbitrary code — the ultimate software security defect — that I’ve wanted to scream. But hold the developers liable? That’s a small word with huge ramifications in these United States.
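
To make the defect class concrete, here is a minimal sketch of the pattern I keep seeing. The function and buffer are hypothetical (a real exploit involves considerably more than this), but the unchecked copy of attacker-controlled input into a fixed-size stack buffer is the classic mistake:

    #include <string.h>

    /* Hypothetical request handler: a minimal sketch of the classic
     * stack buffer overflow. The 64-byte buffer lives on the stack,
     * and strcpy() copies until it finds a NUL byte in the input,
     * with no regard for the destination's size. */
    void handle_request(const char *input)  /* input is attacker-controlled */
    {
        char buf[64];
        strcpy(buf, input);  /* writes past buf when input exceeds 63
                                characters, clobbering the saved return
                                address -- the road to arbitrary code
                                execution */
        /* ... process buf ... */
    }

The fix costs two lines: copy with strncpy(buf, input, sizeof(buf) - 1), then NUL-terminate the buffer explicitly.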

No, I don’t believe we’re anywhere near ready to take such a drastic step. Let’s make sure we point the car in the right direction before we hit the gas.

Allow me to explain.

I’ll bet Mr. Schmidt was thinking about other engineering disciplines when he made the suggestion at ISC2’s SecureLondon conference recently. And it’s true that in some engineering disciplines we do hold professional engineers liable for their design failures, particularly when public safety is involved. However, we mustn’t forget there’s a world of difference between the practices in use in software engineering and those in, say, civil engineering.

When a civil engineer sets out to design a bridge, he calculates the loads the bridge is likely to have to withstand (e.g., cars, trucks, pedestrians, wind, temperature changes). At the end of the analysis, a factor of safety is applied to the estimate, and he looks up what size beams and such to use for the structure. Neglecting to use the minimum-strength beams exposes the approving professional engineer to liability for the structure’s failure.

In that engineering world, the beam sizes and such are published in the form of structural codes — standards on which engineers base their designs. These tables were developed over decades of use and analysis, as well as trial and error. What physics student can forget the film footage of the famous Tacoma Narrows Bridge, which collapsed in 1940 due to harmonic loading generated by wind?

Even if one looks at the latest advances in software security best practices — and there are several that are worthy of note — we’re a far cry away from any sort of published standards that can hold a candle to what civil engineers use.

And yes, as I said, there has been significant work done in the best practices arena for software engineers to learn from. This includes the Department of Homeland Security’s Build Security In effort and the Open Web Application Security Project. Each of these useful projects includes actionable guidance, and I believe they’ll go a long way toward improving the overall state of software security as they’re adopted.

But make no mistake about it: these are not standards, but best practices. And there’s a vast difference between the two.

Let’s also not forget that civilization was building bridges for (no doubt) centuries before it got to a point that it could hold engineers liable for their failures. The best practice efforts cited here need to be tested, used, and refined for at least several years before they’re remotely ready for that sort of thing.

I also should add that it would be a mistake to interpret what I’m saying here as defending shoddy software development practices in any way. Indeed, I find it staggeringly frustrating to see the same mistakes made time and time again, year after year. The 1988 Internet worm that Robert Morris wrote and launched exploited, among other things, a buffer overflow in the Berkeley UNIX finger daemon. From my perspective, that greatly analyzed and publicized security failure should have marked the end of the buffer overflow, but a quick glance at the headlines is all that’s needed to correct that misguided expectation.
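
In case that seems abstract: the fingerd hole was the same class of mistake as the strcpy() sketch above, only with an unbounded read. In rough outline (this is a simplified sketch, not the actual Berkeley source, and the buffer size is illustrative), the daemon did something like this:

    #include <stdio.h>

    static char line[512];

    /* Simplified sketch of the fingerd flaw: gets() has no way to
     * know how large the destination buffer is, so a sufficiently
     * long request runs right off the end of it and onto the stack. */
    void read_request_unsafe(void)
    {
        gets(line);
    }

    /* The bounded equivalent, fgets(), stops after sizeof(line) - 1
     * bytes -- the one-line difference between a footnote and a worm. */
    void read_request_safer(void)
    {
        fgets(line, sizeof(line), stdin);
    }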

But perhaps these repeated mistakes are also the fault of our community’s reluctance to truly learn from its mistakes.

Here, I like to cite the transportation industry as a prime example. They do a spectacular job of studying their failures in painstaking detail and publishing the results for all to learn from. The software world needs to do a much better job of emulating this practice.

To be sure, shoddy software security is pervasive, and we’ve got to demand better from our product developers. It’s also a huge factor in preventing us from really getting the most out of the incredible technologies that have been developed recently. Secure software should be perceived as a business enabler, not an inhibitor, in much the same way that a car’s brakes enable us to drive fast.

But we’re still a long way from even being able to seriously consider holding developers liable. Instead, we should be considering other measures, such as public embarrassment, ostracizing, and the like. Heck, some bugs might even warrant public caning.

No, I’m just kidding… sort of. My patience buffer must have overflowed.

Kenneth van Wyk
Kenneth R. van Wyk is an internationally recognized information security expert and author of the O'Reilly and Associates books Incident Response and Secure Coding. In addition to providing consulting and training services through his company, KRvW Associates, LLC, he currently holds numerous positions: Director of Cigital's Research Labs, monthly columnist for the online security portal eSecurityPlanet, and Visiting Scientist at Carnegie Mellon University's Software Engineering Institute.
