Patch as Patch Can: All Software Is Flawed


By Michelle Drolet

Founder & CEO


Many IT departments have weak patching processes – especially on the client side. And it’s no wonder – patching is tough. Across all industries and platforms, the Window of Exploit (WOE) – the time lag between the announced discovery of a web-based vulnerability and the availability of a patch – is a whopping 233 days, according to WhiteHat Security. That leaves your organization exposed for an unacceptably long time.
It may not be glamorous, but a meticulous patching program is necessary to prevent server- and client-side exploits. HP’s DVLabs and other research based on Open Source Vulnerability Database (OSVDB) data found that several of today’s most successfully exploited “Top Ten” vulnerabilities were discovered – and patched – in the mid-2000s. Yet attackers continue to exploit them. Can you say with certainty that none of those vulnerabilities linger in your organization? How do you know?
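One concrete way to start answering that question is to audit each machine against a required patch baseline. The following is a minimal sketch, assuming a Windows host with Python installed: it shells out to the built-in wmic qfe command to enumerate installed hotfixes and reports any required KB numbers that are missing. The REQUIRED_KBS baseline here is a hypothetical example (KB958644 is the fix for the flaw Conficker exploits, discussed below); in practice you would populate it from your vulnerability-management tooling.

    # Minimal patch-audit sketch for a single Windows host (illustrative only).
    # Assumes the built-in "wmic qfe" command is available; REQUIRED_KBS is a
    # hypothetical baseline you would maintain yourself.
    import subprocess

    REQUIRED_KBS = {"KB958644"}  # e.g., the MS08-067 fix exploited by Conficker

    def installed_hotfixes():
        """Return the set of hotfix IDs reported by 'wmic qfe get HotFixID'."""
        output = subprocess.check_output(
            ["wmic", "qfe", "get", "HotFixID"], text=True
        )
        return {line.strip() for line in output.splitlines()
                if line.strip().startswith("KB")}

    def missing_patches():
        """Return required hotfixes not present on this host."""
        return REQUIRED_KBS - installed_hotfixes()

    if __name__ == "__main__":
        gaps = missing_patches()
        if gaps:
            print("Missing required patches:", ", ".join(sorted(gaps)))
        else:
            print("All required patches present.")

Run on a schedule, a simple check like this also catches patches that quietly disappear – for example, when a machine is re-imaged from an old build – which is exactly the re-introduction problem discussed below.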

While many software publishers don’t bother to release patches at all, the two vendors most religious about patching are Microsoft and Adobe. Ironically, their products still account for the majority of client-side vulnerabilities, with the Microsoft Office suite, Adobe Flash Player, and Adobe Reader topping the list.
Even if you have the world’s best patching process, your organization must strictly enforce policies to prevent re-introduction of vulnerabilities into your environment.
Case in point: Conficker, which has infected millions of unpatched systems since 2008. Three years after Microsoft issued a patch for the underlying flaw (MS08-067), the worm remains the most commonly encountered piece of malicious software, representing 15% of all infection attempts seen by Sophos customers over the last six months.
Plenty of infected PCs continue to spread the contagion because too many of us are not patching. Apply patches consistently and you shut this particular door. But the constant noise of Conficker rebounding off network defenses can drown out quieter, more targeted threats.
“By the end of 2011, Conficker was still the largest network threat in the world,” says the most recent Sophos Security Threat Report.
Give Microsoft credit for taking responsibility and for its transparency: on its own TechNet blog, the company admits without an iota of ambiguity that “software itself is never completely secure.”
It makes a case we have all heard before but that is worth repeating: security management is a strategy, and it must be pursued persistently. There is no complete solution, and the work is never finished. There is no gauge that tells you whether your network or systems are secure. And it doesn’t help to simply add more solutions to the stack.
Researchers at Kaspersky Lab’s SecureList agree that the average PC has at least 12 vulnerabilities at any given time. No matter how well your organization manages patching – particularly on the client side – and enforces policies, you are likely to see common vulnerabilities reintroduced into your IT environment. You are never totally secure; there is never a point at which you can declare the infrastructure secure and walk away. The TechNet post asks, “Why can’t you be 100% secure?” and gives the following reasons:

  • Because people are involved
  • Because users make mistakes
  • Because administrators also make mistakes
  • Because systems don’t always get updated when they should
  • Because software itself is never completely secure

This is a fundamental concept to internalize – there are too many variables and too many dependencies. The takeaway lesson is this: a false sense of security can be your worst enemy.
By Michelle Drolet, founder and CEO, Towerwall

This article was recently published in SYS-CON Media.