The Center for Education and Research in Information Assurance and Security (CERIAS)


CERIAS Blog - March 2007


The Vulnerability Protection Racket

TippingPoint’s Zero Day Initiative (ZDI) provides some interesting data.  On August 28, 2006, ZDI made its “disclosure pipeline” public.  As of today, it lists 49 vulnerabilities from independent researchers, which have been waiting on average 114 days for a fix, plus 12 more from TippingPoint’s own researchers.  With those included, the average wait for a fix is 122 days, or about 4 months!  Moreover, 56 of the 61 are high-severity vulnerabilities, and they come from high-profile vendors: Microsoft, HP, Novell, Apple, IBM Tivoli, Symantec, Computer Associates, Oracle…  Some high-severity issues have been languishing for more than 9 months.
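As an aside, the aggregate figures imply that TippingPoint’s own 12 submissions have waited even longer than the independent ones.  A quick back-of-the-envelope check (only the aggregate numbers come from ZDI’s published pipeline; the breakdown below is simple arithmetic on them):

```python
# Back-of-the-envelope check of the averages quoted above.
independent = 49        # vulnerabilities from independent researchers
indep_avg_wait = 114    # their average wait for a fix, in days
total = 61              # all pipeline entries, including TippingPoint's own
total_avg_wait = 122    # average wait across all 61, in days

tp_count = total - independent
# Total wait-days minus the independents' share leaves TippingPoint's share.
tp_avg_wait = (total * total_avg_wait - independent * indep_avg_wait) / tp_count
print(tp_count, round(tp_avg_wait))  # 12 vulnerabilities, ~155 days on average
```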

Hmm.  ZDI is supposed to be a “best-of-breed model for rewarding security researchers for responsibly disclosing discovered vulnerabilities.”  How is it responsible to take 9 months to fix a known but secret high-severity vulnerability?  It’s not directly ZDI’s fault that the vendors take so long, but it isn’t providing the vendors with much incentive either.  This suggests that programs like ZDI’s have a pernicious effect.  They buy the information from researchers, who are then forbidden from disclosing the vulnerabilities.  More vulnerabilities are found because of the monetary incentive, but only people paying for protection services gain any peace of mind.  The software vendors don’t care much, as the vulnerabilities remain secret.  The rest of us are worse off than before, because more vulnerabilities remain secret for an unreasonable length of time.

Interestingly, this is what was predicted several years ago in Kannan, K. and Telang, R., “Market for Software Vulnerabilities?  Think Again,” Management Science 51 (2005), pp. 726–740.  The model predicted worse social consequences from these programs than from no vulnerability handling at all, due to races with crackers, increased vulnerability volume, and unequal protection of targets.  This makes another conclusion of the paper interesting and likely valid:  CERT/CC offering rewards to vulnerability discoverers should provide the best outcomes, because information would be shared systematically and equally.  I would add that CERT/CC is also in a good position to find out if a vulnerability is being exploited in the wild, in which case it can release an advisory and make vulnerability information public sooner.  A vendor like TippingPoint has a conflict of interest in doing so, because it decreases the value of their protection services.

I tip my hat to TippingPoint for making their pipeline information public.  However, because they provide vendors with no deadlines or incentives to patch the vulnerabilities responsibly, the very existence of their service, and of similar ones from other vendors, is hurting those who don’t subscribe.  That’s what makes vulnerability protection services a racket. 

 

On standard configurations

[tags]monocultures, compliance, standard configurations, desktops, OMB[/tags]

Another set of news items, and another set of “nyah nyah” emails to me.  This time, the press has been covering a memo out of the OMB directing all Federal agencies to adopt a mandatory baseline configuration for Windows machines.  My correspondents have misinterpreted the import of this announcement to mean that the government is mandating a standard implementation of Windows on all Federal machines.  To the contrary, it is mandating a baseline security configuration for only those machines that are running Windows.  Other systems can still be used (and should be).

What’s the difference? Quite a bit. The OMB memo is about ensuring that a standard, secure baseline is the norm on any machine running Windows.  Because there are so many possible configuration options that can be set (and set poorly for secure operation), and because there are so many security add-ons, it has not been uncommon for attacks to succeed because of weak configurations.  As noted in the memo, the Air Force pioneered some of this work by decreeing security baseline configurations.  By requiring that certain minimum security settings were in place on every Windows machine, it saw a reduction in incidents.

From this, and other studies, including some great work at NIST to articulate useful policies, we get the OMB memo.

This is actually an excellent idea.  Unfortunately, the minimum is perhaps a bit too “minimum.”  For instance, replacing IE 6 under XP with Firefox would probably be a step up in security.  However, to support common applications and uses, the mandated configuration can only go so far without requiring lots of extra (costly) work or simply breaking things.  And if too many things get broken, people will find ways around the secure configuration—after all, they need to get their work done!  (This is often overlooked by novice managers focused on “fixing” security.)

Considering the historical problems with Linux and some other systems, and the complexity of their configuration, minimum configurations for those platforms might not be a bad idea, either.  However, they are not yet used in large enough numbers to prompt such a policy.  Any mechanism or configuration where the complexity is beyond the ken of the average user should have a set, minimum, safe configuration. 

Note my use of the term “minimum” repeatedly.  If the people in charge of enforcing this new policy prevent clueful people from setting stronger configurations, then that is a huge problem.  Furthermore, if there are no provisions for understanding when the minimum configuration might lead to weakness or problems and needs to be changed, that would also be awful.  As with any policy, implementation can be good or be terrible.
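To make the “minimum, not maximum” point concrete, a baseline audit should flag settings weaker than the floor while happily accepting stricter ones.  Here is a small sketch of that logic; the setting names and values are hypothetical illustrations, not the actual OMB/NIST baseline:

```python
# Hypothetical mandated *minimums*; real baselines cover far more settings.
BASELINE_MIN = {
    "min_password_length": 8,
    "screen_lock_timeout_max_min": 15,  # lock after at most 15 idle minutes
    "auto_update_enabled": True,
}

def audit(host_config):
    """Return findings; settings stricter than the baseline pass cleanly."""
    findings = []
    if host_config.get("min_password_length", 0) < BASELINE_MIN["min_password_length"]:
        findings.append("password length below baseline")
    if host_config.get("screen_lock_timeout_max_min", 999) > BASELINE_MIN["screen_lock_timeout_max_min"]:
        findings.append("screen lock timeout too long")
    if not host_config.get("auto_update_enabled", False):
        findings.append("automatic updates disabled")
    return findings

# A host configured *stricter* than the baseline produces no findings.
print(audit({"min_password_length": 12,
             "screen_lock_timeout_max_min": 10,
             "auto_update_enabled": True}))  # prints []
```

A policy implemented as a ceiling instead of a floor would reject the stricter configuration above, which is exactly the failure mode to avoid.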

Of course, mandating the use of Windows (2000, XP, Vista or otherwise) on all desktops would not be a good idea for anyone other than Microsoft and those who know no other system.  In fact, mandating the use of ANY OS would be a bad idea.  Promoting diversity and heterogeneity is valuable for many reasons, not least of which are:

  1. limit the damage possible from attacks targeting a new or unpatched vulnerability
  2. limit the damage possible from a planted vulnerability
  3. limit the spread of automated attacks (malware)
  4. increase likelihood of detection of attacks of all kinds
  5. provide incentive in the marketplace for competition and innovation among vendors & solutions
  6. enhance capability to quickly switch to another platform in the event a vendor takes a turn harmful to local interests
  7. encourage innovation and competition in design and structure of 3rd-party solutions
  8. support agility—allow testing and use of new tools and technologies that may be developed for other platforms

These advantages are not offset by savings in training or bulk purchasing, as some people claim.  They are second-order effects and difficult to measure directly, but their absence is noted, usually too late.

But what about interoperability?  That is where standards and market pressure come to bear.  If we have a heterogeneous environment, then the market should help ensure that standards are developed and adhered to so as to support different solutions.  That supports competition, which is good for the consumer and the marketplace.

And security with innovation and choice should really be the minimum configuration we all seek.

[posted with ecto]

Surprise, Microsoft Listed as Most Secure OS

It is well-known that I am a long-time user of Apple Macintosh computers, and I am very leery of Microsoft Windows and Linux because of the many security problems that continue to plague them.  (However, I use Windows, and Linux, and Solaris, and a number of other systems for some things—I believe in using the right tool for each task.)  Thus, it is perhaps no surprise that a few people have written to me with a “Nyah, nyah” message after reading a recent article claiming that Windows is the most secure OS over the last six months. However, any such attitude evidences a certain lack of knowledge of statistics, history, and the underlying Symantec report itself.  It is possible to lie with statistics—or, at the least, be significantly misled, if one is not careful.

First of all, the news article reported that—in the reporting period—Microsoft software had 12 serious vulnerabilities plus another 27 less serious ones.  This was compared with 1 serious vulnerability among Apple’s total of 43.  To say that this confirms the point because fewer vulnerabilities were reported in MS software (39 vs. 43), without noting the difference in severity, is clearly misleading.  After all, there were 12 times as many severe vulnerabilities in MS software as in Apple’s (and more than in some or all of the other systems, too—see the full report).
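One way to see the distortion is to assign even a crude weight to severity.  The 10:1 weighting below is an arbitrary assumption, purely for illustration, but with any weighting that takes severity seriously, the raw totals flip:

```python
# Crude severity-weighted tally of the reported counts; the 10:1
# weighting is an arbitrary illustration, not part of the report.
SEVERE_WEIGHT, MINOR_WEIGHT = 10, 1

ms_score = 12 * SEVERE_WEIGHT + 27 * MINOR_WEIGHT     # 39 flaws total
apple_score = 1 * SEVERE_WEIGHT + 42 * MINOR_WEIGHT   # 43 flaws total
print(ms_score, apple_score)  # 147 vs. 52: the "fewer flaws" vendor scores far worse
```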

Imagine reading a report in the newspaper on crime statistics.  The report says that Portland saw one killing and 42 instances of littering, while Detroit had 27 instances of jaywalking and 12 instances of rape and homicide.  If the reporter concluded that Detroit was the safer place to live and work, would you agree?  Where do you think you would feel safer?  Where would you be safer (assuming the population sizes were similar; in reality, Portland is about 2/3 the population of Detroit)?

From a more stochastic point of view, if we assume that the identification of flaws is a more or less random process with some independence, then it is not surprising that there are intervals where the relative performance in that period does not match the overall behavior.  So we should not jump to overall conclusions when there are one or two observation periods in which one system dominates another, in contrast to previous behavior.  Any critical decisions we might wish to make about quality and safety should be based on a longer baseline; on that basis, the Microsoft products continue to compare poorly with some other systems, including Apple’s.  We might also want to factor in the size of the exposed population, the actual amount of damage, and other such issues.
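The short-window point can be illustrated with a toy simulation: draw six-month flaw counts for two systems whose true discovery rates differ by a factor of two, and count how often the worse system happens to look better in a single window.  The rates are illustrative assumptions only:

```python
import math
import random

random.seed(1)
RATE_WORSE, RATE_BETTER = 20, 10   # expected flaws found per six-month window

def poisson(lam):
    """Draw a Poisson-distributed count (Knuth's multiplication method)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while p > threshold:
        k += 1
        p *= random.random()
    return k - 1

TRIALS = 10_000
# Windows in which the system with twice the flaw rate shows no more flaws.
lucky = sum(poisson(RATE_WORSE) <= poisson(RATE_BETTER) for _ in range(TRIALS))
print(lucky / TRIALS)  # a small but real fraction of windows invert the true ordering
```

With these rates, a few percent of six-month windows make the worse system look at least as good, which is exactly why a single reporting period is a weak basis for conclusions.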

By analogy, imagine you are betting on horses.  One horse you have been tracking, named Redmond, has not been performing well.  In nearly every race that horse has come in at or below the middle of the pack, and often comes in last, despite being a crowd favorite.  The horse looks good, and lots of people bet on it, but it never wins.  Then, one day, in a close heat, Redmond wins!  In a solid but unexciting race, Redmond comes in ahead of multiple-race winner #2 (Cupertino) by a stride.  Some long-time bettors crow about the victory, and say they knew that Redmond was the champ.  So, you have money to gamble with.  Are you going to bet on Redmond to win or place in each of the next 5 races?

Last of all, I could not find a spot in the actual Symantec report where it was stated that any one system is more secure than another—that is something stated by the reporter (Andy Patrizio) who wrote the article.  Any claim that ANY system with critical flaws is “secure” or “more secure” is an abuse of the term.  That is akin to saying that a cocktail with only one poison is more “healthful” than a cocktail with six poisons.  Both are lethal, and neither is healthful under any sane interpretation of the words.

So, in conclusion, let me note that any serious flaws reported are not a good thing, and none of the vendors listed (and there are more than simply Microsoft and Apple) should take pride in the stated results.  I also want to note that although I would not necessarily pick a MS platform for an application environment where I have a strong need for security, neither would I automatically veto it.  Properly configure and protect any system and it may be a good candidate in a medium or low threat environment. As well, the people at Microsoft are certainly devoting lots of resources to try to make their products better (although I think they are trapped by some very poor choices made in the past).

Dr. Dan Geer made a riveting and thought-provoking presentation on cyber security trends and statistics as the closing keynote address of this year’s annual CERIAS Security Symposium.  His presentation materials will shortly be linked into the symposium WWW site, and a video of his talk is here.  I recommend that you check that out as additional material, if you are interested in the topic.
