It's somewhat like 'machines gone wild'.
What we've designed to make our lives easier has somehow turned against us. Though most would expect it, application security is definitely not built in. When people think of application vulnerabilities, they often think of Microsoft. But the truth is that application vulnerabilities have been present since the beginning in all software. We only notice them in Microsoft's applications because Microsoft's software is everywhere.
We're buying insecure software, installing it onto our networks, selling it to our customers, and patching it time and again. With each new security incarnation, we run the risk of introducing a new vulnerability to the network. Someone has to take the blame for what we've done to ourselves.
Is it their fault or ours?
''I call it a haystack-full-of-needles problem. It's hard to understand the magnitude of the problem until you understand the millions of lines of code we have as a nation and as a world in all of our collective applications,'' says Jeff Williams, CEO of Aspect Security, a security company based in Columbia, Md. ''You can't tell a good piece of software from a bad one very easily. There's a limited number of people in the world who can pull apart source code and say, 'Yeah, this is secure,' or not.''
So can we blame the companies that build it?
If we do, then we need to look at the developers of the applications themselves. The problem with that is most developers of the end product aren't even part of the companies we're trying to blame. Most are anonymous coders in far-flung locations with a set of specs and a deadline. If they're building what we've asked them to, how are they at fault?
''You can't really blame the developers,'' says Bill Leavy, vice president of marketing at Parasoft, a Monrovia, Calif.-based company that specializes in tools to prevent software errors. ''It's a lack of definition. If you haven't defined what your expectations are from your outsourcer or your internal development team, then it's really the corporation's fault for not having an established security policy.
''Bringing in material from an outside vendor requires you to have standards that the vendor needs to meet,'' he adds. ''Security standards are criteria for accepting the application.''
Then are we back to pointing fingers at the vendors?
''I do think the folks ultimately responsible are the folks producing the vulnerable product, and that would be the software community,'' says Mike Armistead, co-founder of Fortify, a software solution company based in Palo Alto, Calif. ''I believe they're the owners because a worm or a virus can't attack a flaw that's not in the software. If you outsource your software, you should have criteria about its security when you do the acceptance test. Today, that doesn't happen.''
An economic promise, whether true or not, is that first to market gets the market share. In our highly competitive landscape, the lure of being first on the shelf overpowers the desire to create something that works -- not merely functioning, but functioning securely with our best interest in mind.
''Application security is more of an ongoing lifecycle,'' says Vikram Desais, CEO of Kavado, an application security company based in Stamford, Conn. ''There are thousands of applications written every day and applications are rewritten, improved and expanded. Every time a human being touches one of these applications, because we're not perfect ourselves, we inadvertently unleash new vulnerabilities. It might work better from a performance perspective but it might be much more hackable.''
OK, so is it the consumer's job to step up to the plate and start fixing things?
''What we need is the buyers of software -- mass consumers, people who buy from outsourcers, anyone who is buying it -- to take responsibility and say, 'Here's what I need the software to have in it from a security perspective,''' says Aspect Security's Williams. ''At the broadest level, the buyers and the sellers need to have a conversation and maybe the buyers need to say, 'I don't want any buffer overflows,' and the sellers need to say, 'Well, that's going to cost you,' or whatever it is they say. But that's how we're going to fix the market. That conversation needs to start working.''
Yet it's even more basic than that.
Application security isn't a mystery. There's a deep knowledge base of vulnerabilities and a history of how they're making it into our networks.
''We have around 40 different categories of vulnerabilities identified, like the rock stars of our era -- SQL injection, buffer overflows, etc.,'' says Armistead. ''If you just took care of the first two, that's a great baseline to start from. We really do have a list to work from on things that we can fix. Yet hackers will figure that out too, just like when we shut off their ability to telnet onto machines. If we shut down buffer overflows, they're going to move on. We have to be ready for that. But, absolutely, there's enough there that everyone can start fixing things today.''
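To make the first of those ''rock stars'' concrete, here is a minimal sketch of how a SQL injection works and how a parameterized query closes the hole. It uses Python's standard sqlite3 module; the `users` table, the attack payload, and the function names are illustrative, not taken from any product discussed above.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # VULNERABLE: building the query by string concatenation lets
    # attacker-controlled input rewrite the query itself.
    query = "SELECT id FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # FIX: a parameterized query passes the input strictly as data,
    # never as SQL syntax.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

# Set up a throwaway in-memory database with two rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

payload = "nobody' OR '1'='1"  # classic injection payload

print(len(find_user_unsafe(conn, payload)))  # 2 -- every row leaks
print(len(find_user_safe(conn, payload)))    # 0 -- no user has that name
```

The point is Williams' and Armistead's: this class of flaw is well understood, and the fix is a known, mechanical coding standard a buyer could reasonably demand in an acceptance test.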
So it's actually a communal problem. We need to ask why, with a known set of vulnerabilities widely available for anyone to examine, companies aren't requiring these holes to be fixed.
As mass consumers, we need to set security criteria as a requirement. And as a community, we need to refuse to buy software riddled with holes that, at the very least, expose our very livelihoods.