CERIAS Blog

Challenging Conventional Wisdom

In IT security ("cybersecurity") today, there is a powerful herd mentality. In part, this is because it is driven by an interest in shiny new things. We see this with the massive pile-on to new technologies when they gain buzzword status: e.g., threat intelligence, big data, blockchain/bitcoin, AI, zero trust. The more they are talked about, the more others think they need to be adopted, or at least considered. Startups and some vendors add to the momentum with heavy marketing of their products in that space. Vendor conferences such as the yearly RSA conference are often built around the latest buzzwords. And sadly, too few people with in-depth knowledge of computing and real security are listened to about the associated potential drawbacks. The result is usually additional complexity in the enterprise without significant new benefits — and often with other vulnerabilities, plus expenses to maintain them.

Managers are often particularly victimized by these fads as a result of long-standing deficiencies in the security space: we have no sound definition of security that encompasses desired security properties, and we, therefore, have no metrics to measure them. If a manager cannot get some numeric value or comparison of how new technology may make things better vs. its cost, the decision is often made on "best practice." Unfortunately, "best practice" is also challenging to define, especially when there is so much talk and excitement from people vending the next shiny new thing. Additionally, enterprise needs are seldom identical, so “best” may not be uniform. If the additional siren call is heard about "See how it will save you money!" then it is nearly impossible to resist, even if the "savings" are only near-term or downright illusory.

This situation is complicated because so much of what we use is defective, broken, or based on improperly understood principles. Thus, to attempt to secure it (really, to gain greater confidence in it), solutions that sprinkle magic pixie dust on top are preferred because they don't mean sacrificing the sunk cost inherent in all the machines and software already in use. Magic pixie dust is shiny, too, and usually available at a lower (initial) cost than actually fixing the underlying problems. That is why we have containers on VMs on systems with multiple levels of hypervisor behind firewalls and IPS (turtles all the way down) while the sunk costs keep getting larger. This is also why patching and pen testing are seen as central security practices: they are the flying buttresses of security architecture these days.

The lack of a proper definition and metrics has been known for a while. In part, the old Rainbow series from the NCSC (NSA) was about this. The authors realized the difficulty of defining "secure" and instead spoke of "trusted." The series established a set of features and levels of trust assurance in products to meet DOD needs. However, that was with a DOD notion of security at the time, so issues of resilience and availability (among others) weren't really addressed. That is one reason why the Rainbow Series was eventually deprecated: the commercial marketplace found it didn't apply to their needs.

Defining security principles is a hard problem, and is really in the grand challenge space for security research. It was actually stated as such 16 years ago in the CRA security Grand Challenges report (see #3). Defining accompanying metrics is not likely to be simple either, but we really need to do it or continue to come up against problems. If the only qualities we can reliably measure for systems are speed and cost, the decisions are going to be heavily weighted towards solutions that provide those at the expense of maintainability, security, reliability, and even correctness. Corporations and governments are heavily biased towards solutions that promise financial results in the next year (or next quarter) simply because that is easily measured and understood.

I've written and spoken about this topic before (see here and here for instance). But it has come to the forefront of my thinking over the last year, as I have been on sabbatical. Two recent issues have reinforced that:

  • I was cleaning up my computer storage and came across some old presentations from 10-20 years ago. With minor updating, they could be given today. Actually, I have been giving a slightly updated version of one from 11 years ago, and the audiences view it as "fresh." The theme? How we don't define or value security appropriately. (Let me know if you’d like me to present it to your group; you can also view a video of the talk given at Sandia National Laboratories.)
  • I was asked by people associated with a large entity with significant computing presence to provide some advice on cloud computing. They have been getting a strong push from management to move everything to the cloud, which they know to be a mistake, but their management is countering their concerns about security with "it will cost less." I have heard this before from other places and given informal feedback to the parties involved. This time, I provided more organized feedback, now also available as a CERIAS tech report (here). In summary, moving to the cloud is not always the best idea, nor is it necessarily going to save money in the long term.

I hope to write some more on the issues around defining security and bucking the "conventional wisdom" once I am fully recovered from my sabbatical. There should be no shortage of material. In the meantime, I invite you to look at the cloud paper cited above and provide your comments below.


An Anniversary of Continuing Excellence

In February of 1997, I provided testimony to a Congressional committee about the state of cyber security education. I noted that there were only four major academic programs, with limited resources, in information security at that time. I outlined some steps that could be taken to improve our national posture in the field. Subsequently, I was involved in discussions with staffers of some Congressional committees, with staff at NSF, with National Security Council staff (notably, Richard Clarke), and people at the Department of Defense. These discussions eventually helped produce1 the Scholarship for Service program at NSF, the NSF CyberTrust program (now known as Secure and Trustworthy Cyberspace, SaTC), and the Centers of Academic Excellence program.

On 11 May 1999, 20 years ago, Purdue University 2 was recognized by the NSA as one of the initial Centers of Academic Excellence (CAE).3 There were some notable advocates of enhanced cyber security at each institution, and they had taken steps to institute courses and research to improve the field—notably including Corey Schou (recently inducted into the Cybersecurity Hall of Fame), Matt Bishop, Deborah Frincke, and Doug Jacobson, to name a few.4 As I recall, Dick Clarke was one of the prime movers to get the CAE program established under PDD-63; Dr. Vic Maconachy (then) at NSA became the director of the CAE program.

Over the years, the CAE program has continued to expand, to now encompass several hundred institutions around the US. DHS has become involved as a co-sponsor with the NSA. The main certification has bifurcated into a designation for cyber defense research (CAE-R) and a designation for cyber defense education (CAE-CDE). There is also a designation for Centers of Academic Excellence in Cyber Operations. The NSA, as a member of the US intelligence community (IC), also helps support a program for IC Centers of Academic Excellence. In addition to the formal external evaluation process to be designated as a CAE, the program has resulted in creation of curricular guidelines and recommended best practices for educational programs. A number of leaders in education in the field have also grown out of this process, creating various resources for the community (some of which are hosted at the CLARK website for public use).

I have been critical of the overall CAE program in the past (cf. here and here). I believe most of the criticisms I made are still valid, particularly the ones concerning the designation of "excellence" and the burden of the application process. Nonetheless, there is no denying that the listed institutions have made strides to improve and standardize their programs towards much-needed common goals. There is also continuing (and growing) synergy with efforts such as the NIST National Initiative for Cybersecurity Education (NICE) program and the National Colloquium on Information Systems Security Education (NISSE). Additionally, there has been real progress towards establishing standardized undergraduate curricula in the field, which now includes the potential for ABET accreditation.

Those of us at Purdue recently received notice that Purdue has been recertified as a CAE-R through 2024. This is a result—in large part—of efforts by Dr. James Lerums, one of our recent Ph.D. grads. He volunteered his time to sift through all the documentation, gathered the necessary information, and completed the application process. It was a significant effort, and kudos to Jim for taking it on soon after completing a Ph.D. dissertation!

Despite some of my "grumpy old dude" criticisms, I am glad to see Purdue continue to be recognized for the continued excellence of its programs. CERIAS continues to be a focal point for the "R" aspect of the CAE-R as Purdue's designated research institute in the field: that's the "R" in CERIAS. However, it has also been Purdue's center for education for most of its existence: the "E" in CERIAS is for Education. That history includes the establishment of the first designated degree in information security in 2000, still offered as an interdisciplinary MS and PhD (which is the program Jim Lerums completed, btw).

As for the CAE program itself, and for the 5 (out of 6) other programs receiving that initial CAE designation that are still listed as CAEs, congratulations: we've come a long way, but there is still a long way to go!



Footnotes

  1. I always note that I cannot claim sole or primary credit for these initiatives; nonetheless, I was the first to publicly advocate for programs such as these, and was involved in many of the discussions. Dick Clarke deserves a good deal of credit for his active advocacy for the area at the time, as does Lt. General (ret.) Ken Minihan (also a recent CSHOF inductee) for his support.
  2. Via CERIAS, one year old at the time.
  3. Also in that group were James Madison University, George Mason University, Idaho State University, Iowa State University, the University of California at Davis, and the University of Idaho.
  4. My apologies to others whose names I omitted.

Cyber Security Hall of Fame 2019 Inductees

[Update May 1: The CSHOF pages have been updated with bios on the 2019 inductees.]

The 2019 inductees into the CSHOF were announced last week.

The Hall of Fame was created as a way to honor and memorialize individuals who have had a particularly notable impact on cyber security as a field.

There are five criteria considered for potential honorees: Technology, Policy, Public Awareness, Education, and Business. Nominated individuals can reside and work anywhere in the United States. Submissions to a public call for nominations are compiled and ranked, and a senior board reviews them to select each year’s honorees.

This year’s honorees are:

  • Brian Snow, a former technical director with the National Security Agency, known for his work in cryptography, and in seeking to bridge the gap between government and the commercial sector, helping to foster transparency and cooperation.
  • Sheila Brand, the defining person behind the production of the Trusted Computer System Evaluation Criteria, known as the “Orange Book”, a standard set of requirements for assessing and effecting cybersecurity controls in a computer system.
  • Corey Schou, a University Professor, Associate Dean, published author, and the director of the National Information Assurance Training and Education Center. He led the development of the college curricula which underpins the Centers of Academic Excellence in Information Systems Security (Cybersecurity) program.
  • Virgil Gligor, a professor at Carnegie Mellon University who has made fundamental contributions in applied cryptography, distributed systems, and cybersecurity. Other subjects that he has worked on include covert channel analysis, access control mechanisms, intrusion detection, and DoS protection.
  • Ken Minihan, a former United States Air Force officer known for his service as the Director of the NSA, and prior to that, the Director of the Defense Intelligence Agency under the Clinton administration. He operationalized NSA’s Information Systems Security mission, promoting engagement with industry, academia, and U.S. allies.

Inducted in memoriam:

  • Rebecca “Becky” Bace, who was the major force behind building the computer misuse and detection (CMAD) community, starting in the 1990s. She has been widely recognized for her many efforts in industry and government to increase participation by women and minorities in cyber security, and to bring useful technology to market. Becky is fondly remembered by many as the “Den mother of Cyber Security.”
  • Howard Schmidt , who, among many other activities, was Vice Chair of the President's Critical Infrastructure Protection Board and the special adviser for cyberspace security for the Bush White House directly following 9/11. While in that position, he was a leader in developing the U.S. National Strategy to Secure CyberSpace. He later served in the Obama administration as the White House Cybersecurity Coordinator. He also served as the CISO and CSO of Microsoft, among other roles.

The formal induction will be held at an event on April 25, at the annual Hall of Fame Dinner at the Hotel at Arundel Preserve.

Congratulations to all the new inductees!

March Sadness

March is a month of changes. We see winter beginning to recede (we hope!) and spring beginning to show. The vernal equinox is around March 20 and heralds a return to more light than dark.

March is the month I was born (or hatched, depending on your mythos). My wonderful sister was also born in March. So were several dear friends. March is a month of beginnings.

Unfortunately, March is also a month of endings. Two years ago, I blogged about the untimely passing of three security pioneers, all good friends of mine: Kevin Ziese, Howard Schmidt, and Becky Bace. As I noted, Becky’s passing was a particularly cruel shock, as her death unexpectedly occurred only a few days after spending time with her.

I was reminded of the three of them (and my friends Ken and Wyatt and Gene, to name a few more) as I attended this year’s RSA Conference. As I walked the exhibit floor, I had a sense that I might look up and see one of them, as I did nearly every year, walking between sessions or stopping at booths. We’d compare notes about what we thought was particularly good or particularly awful — our comparisons were usually fairly well in sync. We would have had a lot to compare this year!

I don’t mean to be maudlin; I long ago did my grieving. Plus, I still have too many things to do, including burning the rest of my sabbatical, and getting some papers finished. However, I am reminded that the friends and families of those dear friends set up memorials for them. Rather than having spent the money attending this year's RSAC, I wish I had put those funds into these worthwhile causes, to which I normally contribute each year.

If you remember any of them, below are reminders of how you can do some good in their memories, and maybe help bring a little springlike cheer to others. And if you don’t remember them, maybe you should investigate a little — too many people working in cyber security have no grasp of the rich history of the field.



BTW, and on another topic entirely, I hope to see some of you at the 20th annual CERIAS Symposium in early April. It’s a great transition into spring, and a wonderful celebration of education and research. As the emeritus director, I don’t have anything to do this year other than mingle and enjoy the presentations. That’s some change after 20 years! Please consider mingling along with me, and enjoying the hospitality of the great group at CERIAS!


Kevin

If you want to make a donation in his memory, please send it to one or more of:

Howard

If you wish to make a donation in the memory of Howard Schmidt, send it to:

Brain Tumor Research Program
℅ Dr. Connelly
9200 W. Wisconsin Ave
Milwaukee, WI 53226

Becky

ACSA's top scholarship in the Scholarship for Women Studying Information Security (SWSIS.org) has been renamed as the Rebecca Gurley Bace Scholarship. Contributions to help support this scholarship are welcomed by sending a check (sorry, no online contributions) to:

Applied Computer Security Associates, Inc.
2906 Covington Road
Silver Spring, MD 20910

Checks should be made payable to Applied Computer Security Associates, and note SWSIS Rebecca Gurley Bace Scholarship on the memo line.

Ken

Ken's family has indicated that memorial contributions may be given to the American Heart Association.

Gene

The ISSA Foundation has a scholarship fund in Gene's honor. Donate to:

E. Eugene Schultz Scholarship Fund
c/o Steve Haydostian
President, ISSAEF
18770 Maplewood Lane
Porter Ranch, CA 91326




All of the above are non-profit, charitable organizations, and your contributions will likely be tax-deductible, depending on your tax circumstances.

The RSA 2019 Conference

I have now attended 13 of the last 18 RSA Conferences (see some of my comments for 2016, 2015, and 2014). Before there were RSA conferences, there were the Joint National Computer Security Conferences, and I went to those, too. I’ve been going to these conferences for about 30 years now.

As I’ve noted from previous years, the deep content simply isn’t here. I no longer attend to learn about anything new and innovative — if I encounter such a thing, I view it as a pleasant surprise. Instead, this is basically a time and place where I can catch up with many friends and former students, see some industry trends, and maybe score a few new T-shirts. It also is a good intro to my spring workout schedule — I do about 20 miles of walking over 5 days, and I don’t eat many full meals.

Here are some of my random takes on this year’s conference:

The Program

  • The program is far too full, with all sorts of concurrent workshops and sessions. Most of them are simply people spouting obvious maxims and recounting basics as seen through the lens of the company they represent. It is difficult to pick out ahead of time the ones that aren’t really a waste of time if you know something about the field already.
  • Major talks seem to fall into two categories: executives speaking in slots their companies paid for, and “celebrities” who end up speaking nearly every year. Some of the latter are quite talented, but there is a déjà vu element at play.
  • Most of what is presented in sessions would not be a surprise to my students (at least, not the ones who stayed awake in class). I ran into about 15 former students here, and some basically repeated that to me. Apparently, there is a demand for being told unsurprising, basic information at conferences.

The Exhibitors

The Moscone Center was packed again. It took well over 2 days to walk all the booths, asking questions at some and skipping others. Overall, I was not impressed.

  • Once again it seemed that about 20% of the booths were new companies we had not seen before…and may not see again. For many new startups, the VC money is spent to create a booth here, and if the company doesn’t achieve a certain level of notoriety (and sales), it may not last a full year.
  • Many more non-US companies were exhibiting here this year. I recognized players from the UK, Canada, China, Taiwan, Germany, Korea, and the Netherlands.
  • The consolidation trend is more obvious: M&A activity has been integrating smaller companies into bigger players to provide more of a “full suite” solution to customers. Bigger companies tend not to take risks to innovate internally anymore. Instead, they let small companies do the innovation, and if they survive, they get gobbled up.
  • No apparent buzzword trend this year. Big data and threat intelligence were prominent a year ago. I was afraid that this year I would be overwhelmed with some combination of “blockchain,” “AI,” and “data science.” Thankfully, that didn’t happen. Maybe next year?
  • Over half the booths had no words or diagrams on the walls to indicate what the company actually did or why I would want to stop to talk to the people there. A majority made claims with adjectives such as “leader,” “complete,” and “new” that were clearly false or unverifiable.
  • Conference management has been good about keeping the vendors from employing “booth babes” (see my links to the 2014/2015 conferences, above). To bring people into the booths, the leading contenders seemed to be participatory video games, contests to win drones, and people in white lab coats. One vendor was even raffling off a car. If the companies did a better job conveying what they were doing, perhaps they wouldn’t need these gimmicks?
  • Sideshow-style, loud 15-minute presentations in big booths were more prevalent — and still obnoxious. Several of these presented a traffic hazard when trying to walk by them.
  • At some locations the personnel were especially obnoxious about trying to scan every badge of every person even walking in the aisle. Most were polite, however, and a few were even friendly. I enjoyed talking with many people.
  • Socks seem to have replaced T-shirts as the predominant clothing giveaway. There were still some good shirts to be obtained, however. One vendor rep was joking that next year it will be branded underwear.
  • I got the sense budgets were leaner at many companies — fewer people, fewer giveaways.
  • I noted two companies had commitments to donate to non-profits when people visited their booths: Tripwire and Tinfoil Security. Kudos to them. I’d definitely rather have that than a fidget spinner or a box of mints.

More generally

I had a few people recognize me and say hello. That happens less each year. I am not so vain that I expect people to recognize me, but I do feel somewhat the dinosaur to be wandering the aisles when people don’t know my name even with prompting. My wife (who wandered the floor with me) found it particularly amusing when they tried to argue security concepts with me, or teach me history. One fun example was when a couple of people tried to explain the history and operation of the Internet Worm to me. Another fun time was had at a booth when a sales guy tried to dismiss my comments about his product with my “The only secure computer is one encased in concrete…” meme without knowing it was my original quote or who I was; I first uttered that years before he was born! (See #8 here.) He was annoyed I started laughing.

Despite GDPR coming into force in the EU (and the rest of the world, for large companies), privacy was hardly mentioned at any booth. Apparently, that isn’t of interest to this crowd.

There were some really questionable decorations. One booth was highly illuminated in bright green light. It actually made me feel a little nauseous; what were they thinking? Others had bright flashing lights (distracting, annoying, and probably a trigger for people with migraines or epilepsy). Word salad was the norm on too many booths. Few appeared to be accessible to the mobility impaired, although I only saw 3 such people on the floor in 3 days.

I saw a few vendors who effectively claimed they supported customers keeping longer audit logs that could be examined to find evidence once a breach was discovered. Think about that — the assumption is that assembled products can’t protect an enterprise well enough, or respond quickly, so that a months-long record is needed to find out when and why the failure occurred. Furthermore, that idea is normalized enough that there are companies that can sell products & services around it. Crazy.

There seem to be more advertised products/services around metrics. They don’t agree with each other on what they should be measuring or how they do it, but they claim to measure “security.” In many cases, I conjecture throwing dice would be cheaper and about as useful.

I was disappointed by the expertise and horizons of some of these people. I talked to the “CTO” at more than a half-dozen of the vendors, and their knowledge of some basic terms and history seems to reach back only about 5-6 years. This contributed to the claims of “brand new!” for several of them — they had no idea what was done before. (This is a problem rampant in academia, too — if something occurred before Google was able to index it, it never happened, apparently.) After failing to find any reasonably-aware person in my first half-dozen attempts, I stopped looking.

Sadly, the lack of foundations for the people at most of the booths mirrored the lack of a solid foundation for the products. There are some good, useful products and services present on the market. But the vast majority are intended to apply bandaids (or another layer of virtualization) on top of broken software and hardware that was never adequately designed for security. Each time one of those bandaids fails, another company springs up to slap another on over the top. That then leads to acquisition and integration into security suites. No one is really attacking the poor underlying assumptions and broken architectures. (See my last two blog posts here for more on this: here and here.)

This is related to why I don’t submit proposals to talk at the conference — I tried a few years ago and the message conveyed to me was that it was out of step with what the sponsors wanted presented. The industry is primarily based on selling the illusion that vendors' products can — in the right combination and with enough money spent — completely protect target systems. Someone pointing out that this is fundamentally flawed is not a welcome addition. I get that a lot — it is probably why I don’t get asked to be a company advisor, either. People would rather believe they can find a unicorn to grant them immortality rather than hear the dreary truth that they will die someday, and probably sooner than they expect. Instead of hearing that, let there be bread and circuses!

I am giving serious thought to this being my last RSA Conference — the expense is getting to be too great for value received. The years have accumulated and I find myself increasingly out of step here. I want to do what is right — safe, secure, ensuring privacy — but so much of this industry is built around the idea that “right” means creating a startup and retiring rich in 5 years after an M&A event. I don’t believe that having piles of money is how to measure what is right. I will never retire rich; actually, because I will never be rich, I probably can’t afford to retire! I am also saddened by the lack of even basic awareness of what so many people worked so hard to accomplish as foundations for others to build on. We have a rich history as a field, and a great deal of knowledge. It is sad to see that so much of it is forgotten and ignored.

Oh, and I wish those damn kids would stay off my lawn.