The Center for Education and Research in Information Assurance and Security (CERIAS)

Patching is Not Security

I have long argued that the ability to patch something is not a security “feature” — whatever caused the need to patch is a failure. The only proper path to better security is to build the item so it doesn’t need patching — so the failure doesn’t occur, or has some built-in alternative protection.

This is, by the way, one of the reasons that open source is not “more secure” simply because the source is available for patching — the flaws are still there, and often the systems don’t get patched because they aren’t connected to any official patching and support regime. Others may be in locations or circumstances where they simply cannot be patched quickly — or perhaps not patched at all. That is also an argument against disclosure of some vulnerabilities unless they are known to be in play — if the vulnerability is disclosed but cannot be patched on critical systems, it simply endangers those systems. Heartbleed is an example of this, especially as it is being found in embedded systems that may not be easily patched.

But there is another problem with relying on patching: sometimes the responsible parties are unable or unwilling to provide a patch. That is especially troubling when the vulnerability is being actively exploited.

In late January, a network worm was discovered exploiting a vulnerability in Linksys routers. The worm was reported to the vendor and to some CERT teams. Analysts at the Internet Storm Center examined the worm and named it TheMoon. They identified vulnerabilities in scripts associated with Linksys E-series and N-series routers that allowed the worm to propagate and the compromised devices to be misused.
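As background for readers who want to check their own equipment: published analyses of TheMoon report that the worm first requests a router's /HNAP1/ device-information endpoint to fingerprint vulnerable models before trying to spread. The same request can be issued defensively. Below is a minimal sketch in Python; the address, port, and the crude response test are illustrative assumptions, not details from this post.

```python
# Minimal sketch (not from the original post): ask a device for its /HNAP1/
# device-information document, the same fingerprinting request TheMoon is
# reported to send first. A response suggests the endpoint is exposed and
# the device deserves a closer look. Port and substring test are assumptions.
import urllib.request

def exposes_hnap(host: str, port: int = 8080, timeout: float = 5.0) -> bool:
    """Return True if the device answers an HNAP1 device-information request."""
    url = f"http://{host}:{port}/HNAP1/"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(4096).decode("utf-8", errors="replace")
            # Affected firmware is reported to return an XML document naming
            # the model and firmware version; any HNAP reply at all is notable.
            return "HNAP" in body
    except OSError:
        return False

if __name__ == "__main__":
    router = "192.168.1.1"  # replace with your own router's LAN address
    print(router, "exposes /HNAP1/:", exposes_hnap(router))
```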

Linksys published instructions on their website to reduce the threat, but according to reports from affected users it is not a fix — especially for those who want to use remote administration. At the time, a posting at Linksys claimed a firmware fix would be published “in the coming weeks.”
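Since the published workaround evidently involves turning off remote administration (hence its unattractiveness to users who rely on that feature), one sanity check after applying it is to confirm from an outside host that the management ports no longer answer. A minimal sketch follows; the port numbers are common defaults and the address is a placeholder, both assumptions on my part.

```python
# Hypothetical post-workaround check: probe the router's public address from
# outside the LAN and confirm the remote-management ports are closed.
import socket

REMOTE_MGMT_PORTS = (8080, 443)  # assumed defaults; adjust to your configuration

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    wan_addr = "203.0.113.1"  # replace with your router's public address
    for p in REMOTE_MGMT_PORTS:
        state = "OPEN — workaround may not be in effect" if port_open(wan_addr, p) else "closed"
        print(f"{wan_addr}:{p} {state}")
```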

Fast forward to today, three months later, and a fix has yet to be published, according to Brett Glass, the discoverer of the original worm.

Complicating the fix may be the fact that Belkin acquired Linksys. Belkin does not have a spotless reputation for customer relations; this certainly doesn’t help. I have been copied on several emails from Mr. Glass to personnel at Belkin, and none have received replies. It may well be that they have decided that it is not worth the cost of building, testing, and distributing a fix.

I have heard that some users are replacing their vulnerable systems with ones from vendors that are more responsive to their customers’ security concerns. However, replacement requires a capital outlay, and not all customers are in a position to make one. Smaller users may prefer to keep using their equipment despite the compromise (it doesn’t obviously endanger them — as yet), and naive users may simply not know about the problem (or may believe it has been fixed).

At this point we have vulnerable systems, the vendor is not providing a fix, the vulnerability is being exploited and is widely known, and the system involved is in widespread use. Of what use is patching in such a circumstance? How is patching better than having properly designed and tested the product in the first place?

Of course, that isn’t the only question that comes to mind. For instance, who is responsible for fixing the situation — either by getting a patch out and installed, or replacing the vulnerable infrastructure? And who pays? Fixing problems is not free.

Ultimately, we all pay because we do not appropriately value security from the start. That conclusion can be drawn from incidents small (an individual machine) to medium (e.g., the Target thefts) to very large (government-sponsored thefts). One wonders what it will take to change that. How do we patch people’s bad attitudes about security — or better yet, how do we build in a better attitude?

Comments

Posted by richardstevenhack
on Sunday, May 25, 2014 at 10:34 PM

The software and telecom industries will not change adequately to address the problem until corporations and end users are being BURIED under an avalanche of cybercrime such that government is forced to threaten regulation. And that is coming, both the crime and the threat.

And even then, until those industries start to develop TRUE “engineering” methods - instead of the pseudo-engineering and simple craft art they use today to develop systems - they will be unable to respond to the problem effectively, regardless of regulations.

Posted by cloud
on Thursday, June 5, 2014 at 10:24 PM

Why has the worm been dubbed TheMoon? Because of a number of lunar references in code strings that could be part of a command-and-control channel.
