Thirty-five years ago today (November 2nd), the Internet Worm program was set loose to propagate on the Internet. Mentioning that now to the computing public (and to cybersecurity professionals, specifically) often generates an "Oh, really?" response akin to noting that November 2nd is also the anniversary of the inaugural broadcast of the first BBC TV channel (1936) and of the launch of Sputnik 2 with Laika aboard (1957). That is, to many, it is ho-hum, ancient history.
Perhaps that is to be expected after 35 years -- approximately the length of a human generation. (As an aside, I have been teaching at Purdue for 36 years. I have already taught students whose parents took one of my classes; in five or so years, I may see students whose grandparents did!) In 1988, fewer than 100,000 machines were likely connected to the Internet; thus, only a few thousand people were involved in systems administration and security. For us, the events were more profound, but we are vastly outnumbered by today's user population; many of us have retired from the field...and more than a few have passed on. Thus, events of decades ago have become ancient history for current users.
Nonetheless, the event and its aftermath were profound for those who lived through it. No major security incident had ever occurred on such a scale before. The Worm was the top news story in international media for days. The events retold in Cliff Stoll's Cuckoo's Egg were only a few years earlier but had affected far fewer systems. However, that tale of computer espionage heightened authorities' concern, in the days following the Worm's deployment, about its origin and purpose. It seeded significant changes in law enforcement, defense funding and planning, and how we all looked at interconnectivity. In the following years, malware (and especially non-virus malware) became an increasing problem, from Code Red and Nimda to today's botnets and ransomware. All of that eventually led to a boom in add-on security measures, resulting in what is now a multi-billion dollar cybersecurity industry.
At the time of the Worm, the study of computing security (the term "cybersecurity" had not yet appeared) centered primarily on cryptography, formal verification of program correctness, and limiting covert channels. The Worm illustrated that a larger scope was needed, although it took additional events (such as the aforementioned worms and malware) to drive the message home. Until the late 1990s, many people still believed cybersecurity was simply a matter of attentive cyber hygiene and not an independent, valid field of study. (I frequently encountered this attitude in academic circles, and was told it was present in the discussion leading to my tenure. That may seem difficult to believe today, but it should not be surprising: Purdue has the oldest degree-granting CS department [60 years old this year], and it was initially viewed by some as simply glorified accounting! It is often the case that outsiders dismiss an emerging discipline as trivial or irrelevant.)
The Worm provided us with an object lesson about many issues that, unfortunately, have not been fully heeded to this day. That multi-billion dollar cybersecurity industry is still failing to protect far too many of our systems. Among those lessons:
.rshrc files) created a playground for lateral movement across enterprises. We knew then -- and had known for some time -- that good security practice involved fully mediated access (now often referred to as "Zero Trust"). However, convenience was viewed as more important than security...a problem that continues to vex us to this day. We continue to build systems that both enable effortless lateral movement and make it difficult or annoying for users to reauthenticate, thus leading them to bypass the checks.
That last point is important as we debate the dangers and adverse side-effects of machine learning/LLM/AI systems. Those are being refined and deployed by people claiming they are not responsible for the (mis)use of (or errors in) those systems and that their economic potential outweighs any social costs. We have failed to clearly understand and internalize that not everything that can be done should be done, especially on the Internet at large. This issue keeps coming up, and we continue to fail to address it properly.
As a field, cybersecurity is relatively young. We have a history that arguably starts in the 1960s with the Ware Report. We are still discovering what is involved in protecting systems, data privacy, and safety. Heck, we still lack a commonly accepted definition of what cybersecurity entails! (Cf. Chapter 1 of the Cybersecurity Myths book, referenced below.) The first cybersecurity degree program wasn't established until 2000 (at Purdue). We still lack useful metrics to know whether we are making significant progress and to titrate investment. And we are still struggling with tools and techniques to create and maintain secure systems. All this while the market (and thus need) is expanding globally.
In that context of growth and need, we should not dismiss the past as "Ho-hum, history." Members of the military study historic battles to avoid future mistakes: mentioning the Punic Wars or the Battle of Thermopylae to such a scholar will not result in dismissal with "Not relevant." If you are interested in cybersecurity, it would be advisable to study some history of the field and think about lessons learned -- and unlearned.
Philosophically, we are not fond of the terms 'artificial intelligence' and 'machine learning,' either. Scholars do not have a good definition of intelligence and do not understand consciousness and learning. The terms have caught on as a shorthand for 'Developing algorithms and systems enhanced by repeated exposure to inputs to operate in a manner suggesting directed selection.' We fully admit that some systems seem brighter than, say, certain current members of Congress, but we would not label either as intelligent. I recommend reading this and this for some other views on this topic. (And, of course, buy and read at least one copy of Cybermyths and Misconceptions.)
I have attended 14 of the last 22 RSA conferences. (I missed the last three because of COVID avoidance; many people I know who went became infected and contributed to making them superspreader events. I saw extremely few masks this year, so I will not be surprised to hear of another surge. I spent all my time on the floor and in crowds with a mask -- I hope that was sufficient.)
I have blogged here about previous iterations of the conference (2007, 2014, 2016, and most recently, 2019). Reading back over those accounts makes me realize that little has really changed. Some of the emphasis has shifted, but most of what is exhibited and presented is not novel, nor does it address the root causes of our problems.
Each year, I treasure meeting with old friends and making some worthwhile new acquaintances with people who actually have a clue (or two). Sadly, the number of people I stop to chat with who don't have the vaguest idea about the fundamentals of the field or its history continues to constitute the majority. How can the field really progress if the technical people don't really have a clue what is actually known about security (as opposed to what is known about the products in their market segment)?
I was relieved not to see hype about blockchain (ugh!) or threat intelligence. Those were fads a few years ago. Apparently, hype around quantum and LLMs has not yet begun to build in this community. Zero trust and SBOM were also understated themes, thankfully. I did see more hardware-based security, some on OT, and a little more on user privacy. All were still under-represented.
My comments on the 2019 RSAC could be used almost word-for-word here. Rather than do that, I strongly suggest you revisit those comments now.
Why did I go if I think it was so uninspiring? As usual, it was for the people. Also, this year, I was on a panel for our recent book, Cybersecurity Myths and Misconceptions. Obviously, I have a bias here, but I think the book addresses a lot of the problems I am noting with the conference. We had a good turnout at the panel session, which was gratifying, but almost no one showed up at the book signings. I hope that isn't a sign that the book is being ignored, but considering it isn't hyping disaster or a particular set of products, perhaps that is what is happening. Thankfully, some of the more senior and knowledgeable people in the field did come by for copies or to chat, so there is at least that. (I suggest that after you reread my 2019 comments, you get a copy of the book and think about addressing some of the real problems in the field.)
Will I go to the 2024 RSAC Conference? It depends on my health and whether I can find funds to cover the costs: it is expensive to attend, and academics don't have expense accounts. If I don't go, I will surely miss seeing some of the people who I've gotten to know and respect over the years. However, judging by how few made an effort to find me and how the industry seems to be going, I doubt I will be missed if I am not there. That by itself may be enough reason to plan an alternate vacation.
If you didn’t get a chance to attend S4x23 to hear the talks, or you simply haven’t heard enough from Spaf yet, here is a recording of the keynote interview with Spaf by Dale Peterson. The interview covered a lot of ground about the nature of defensive security, the new Cybermyths book (got yours yet?), OT security, the scope of security understanding, having too much information, and having a good security mindset.
This and other interviews and talks Spaf has given are on the Professor Spaf YouTube channel.
At the 25th anniversary CERIAS Symposium on March 29, we made a special awards presentation.
Unfortunately, I had lost my voice, so Joel Rasmus read my remarks (included in what follows). I want to stress that these comments were heartfelt from all of us, especially me.
25 years ago, I agreed to start something new—something unlike anything that had existed at Purdue before. I soon discovered that it was unlike any other academic center others had encountered: a multidisciplinary center built around the concept of increasing the security and safety of people by addressing problems from, and with, computing. I note that I wasn’t the only faculty member involved. Core faculty at the time were Sam Wagstaff, Mike Atallah, and Carla Brodley, then in our School of ECE. Sam and Mike have been steady contributors for more than 25 years (stretching back to the pre-CERIAS, COAST days); as an Emeritus Professor, Sam is still working with us.
I knew I needed help making the new entity succeed. My first step was hiring some great staff—Andra Nelson (now Martinez) and Steve Hare were the first two new hires; the late Marlene Walls was already working for me. Those three played a huge role in getting CERIAS running and helping with an initial strategic plan. We have recognized them in the past (and will feature them prominently in the history of CERIAS when I get around to writing it).
I quickly followed those hires by organizing an advisory board. Some of the members were personnel from the organizations that were committed to supporting us. Others were people in senior positions in various agencies and companies. And a few were friends who worked in related areas.
Those choices seem to have worked out pretty well. CERIAS grew from four involved faculty in April 1998 to (as of March 2023) 163. We went from four supporting companies and agencies to two dozen. We have thousands of alumni and worldwide recognition. There is considerable momentum for excellence and growth in the years to come.
CERIAS has benefited from the counsel, support, and leadership of scores of wonderful people from strategic partner organizations who served on the External Advisory Board over the years. However, some particularly stand out because they went above and beyond in their efforts to help CERIAS succeed. On this special occasion of our 25th anniversary, we recognize six exceptional advisors who helped CERIAS succeed and be what it is today.
(Unfortunately, due to various issues, none were present at the Symposium in person to receive the awards. This post is to share with everyone else how much we value their history with us.)
We are bestowing five silver Foundation Award Medals upon these individuals:
These five people provided assistance above and beyond what we expected, and we will be forever grateful.
We had one final, special award.
Timothy Grance has been a mainstay at NIST (the National Institute of Standards and Technology) for decades. You can find his name on many of the reports and standards NIST has issued, and on many other computing and cybersecurity activities. He’s not as well known as many of our advisors because he prefers to provide quiet, steady contributions. Most importantly to CERIAS, Tim has great vision and is one of the rare people who can find ways to help others work together to solve problems. He is inspirational, thoughtful, and cares deeply about the future. These qualities have undoubtedly been useful in his job at NIST, but he brought those same skills to work for CERIAS at Purdue, and even before that as an advisor to COAST.
For the last 25 years, Tim was (and continues to be) an honored member of the External Advisory Board. He has attended countless board meetings and events over the years — all at his personal expense. He made introductions for us across a wide variety of institutions—academic, governmental, and commercial—and hosted some of the EAB meetings. He has always provided sage advice, great direction, and quiet support for all we have done. Despite being somewhat limited by a significant stroke a few years ago, he fought back courageously and returned to CERIAS for our Symposium and Board meeting. We reserve a chair for him even when he cannot travel to be with us.
Tim’s commitment to the field, and especially to CERIAS, makes him a national treasure. We are also proud to consider him a CERIAS treasure, and thus award the Gold Foundation Award Medal to Timothy Grance.
We conclude with sincere thanks, not only to these six wonderful people, but to all those who, over the years, have provided support, advice, time, equipment, funding, problem sets, and simply good cheer. That CERIAS has made it 25 years successfully and continues to grow and innovate is a testament to the importance of the problems and the willingness of such a large community to help address them. Time has only grown the problem set, but everyone associated with CERIAS is ready and willing to take them on. We all look forward to continuing our engagement with the community in doing so!