The Center for Education and Research in Information Assurance and Security (CERIAS)

Reflecting on the Internet Worm at 35

Thirty-five years ago today (November 2nd), the Internet Worm program was set loose to propagate on the Internet. Mentioning that now to the computing public (and to cybersecurity professionals, specifically) often generates an "Oh, really?" response, akin to noting that November 2nd is also the anniversary of the inaugural broadcast of the first BBC TV channel (1936) and of the launch of Sputnik 2 with Laika aboard (1957). That is, to many, it is ho-hum, ancient history.

Perhaps that is to be expected after 35 years -- approximately the length of a human generation. (As an aside, I have been teaching at Purdue for 36 years. I have already taught students whose parents took one of my classes; in five or so years, I may see students whose grandparents did!) In 1988, likely fewer than 100,000 machines were connected to the Internet, so only a few thousand people were involved in systems administration and security. For us, the events were more profound, but we are outnumbered by today's user population; many of us have retired from the field...and more than a few have passed on. Thus, events of decades ago have become ancient history for current users.

Nonetheless, the event and its aftermath were profound for those who lived through it. No major security incident had ever occurred on such a scale before. The Worm was the top news story in international media for days. The events retold in Cliff Stoll's The Cuckoo's Egg had occurred only a few years earlier but affected far fewer systems; still, that tale of computer espionage heightened authorities' concern about the Worm's origin and purpose in the days following its release. The incident seeded significant changes in law enforcement, defense funding and planning, and how we all looked at interconnectivity. In the following years, malware (and especially non-virus malware) became an increasing problem, from Code Red and Nimda to today's botnets and ransomware. All of that eventually led to a boom in add-on security measures, resulting in what is now a multi-billion dollar cybersecurity industry.

At the time of the Worm, the study of computing security (the term "cybersecurity" had not yet appeared) centered primarily on cryptography, formal verification of program correctness, and limiting covert channels. The Worm illustrated that a broader scope was needed, although it took additional events (such as the aforementioned worms and malware) to drive the message home. Until the late 1990s, many people still believed cybersecurity was simply a matter of attentive cyber hygiene and not an independent, valid field of study. (I frequently encountered this attitude in academic circles and was told it surfaced in the discussion leading to my tenure. That may seem difficult to believe today, but it should not be surprising: Purdue has the oldest degree-granting CS department [60 years old this year], and it was initially viewed by some as merely glorified accounting! Outsiders often dismiss an emerging discipline as trivial or irrelevant.)

The Worm provided us with an object lesson about many issues that, unfortunately, have not been fully heeded to this day. That multi-billion dollar cybersecurity industry still fails to protect far too many of our systems. Among those lessons:

  • Interconnected systems with long-lived access (e.g., .rhosts files) created a playground for lateral movement across enterprises. We knew then that good security practice involved fully mediated access (now often referred to as "Zero Trust") and had known it for some time. However, convenience was viewed as more important than security...a problem that continues to vex us to this day. We continue to build systems that both enable effortless lateral movement and make it difficult or annoying for users to reauthenticate, thus leading them to bypass the checks.
  • Systems without separation of privilege facilitated the spread of malware. Current attackers who manage to penetrate key services or privileged accounts can gain broad access to entire networks, including the ability to shut off monitoring and updates. We have proven methods of limiting access (SELinux is one example), but they are too infrequently used. (A minimal privilege-dropping sketch appears after this list.)
  • Sharing information across organizations can result in a more robust, more timely response. Today, we still have organizations that refuse to disclose whether they have been compromised, thus delaying our societal response; information obtained by government agencies has too often been classified, or at least closely held. The information that is shared is frequently incomplete or untimely.
  • The use of type-unsafe languages with minimal security features can lead to flaws that may be exploited. One only needs to survey recent CVE entries and attack reports to see buffer overflows, type mismatches, and other well-known software flaws leading to compromise. Many organizations are still producing or reusing software written in C or C++, languages that are especially prone to such errors (see the overflow sketch after this list). Sadly, higher education is complicit by teaching those languages as primary ones, mainly because its graduates may not be employable without them.
  • Heterogeneity of systems provides some bulwark against common attacks. Since 1988, the number of standard operating systems in use has decreased, as has the number of underlying machine architectures. There are clearly economic arguments for fewer platforms, but the homogeneity facilitates common attacks. Consideration of when to reuse and when to build anew is sadly infrequent.
  • The Worm incident generated conflicting signals about the propriety of breaking into other people's systems and writing malware. Some people who knew the Worm's author rose to his defense, claiming he was demonstrating security problems and not doing anything wrong. Malware authors and system attackers made the same claim in the decades that followed, with mixed responses from the community. It still colors the thinking of many in the field, excusing some very dubious behavior as somehow justified by its results. Although there is nuance in some discussions, the grey areas around pen testing, companies selling spyware, and "ethical" hacking still enable plausible explanations for bad behavior.
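
As a concrete illustration of the separation-of-privilege lesson above, here is a minimal sketch in C of the long-standing POSIX pattern of privilege dropping: acquire the privileged resource first, then permanently give up root before touching any untrusted input. This is a generic illustration, not code from any particular system; the account name "svc-demo" is a hypothetical placeholder, and the program must be started as root for the bind and the privilege drop to succeed.

    #include <stdio.h>
    #include <pwd.h>
    #include <grp.h>
    #include <unistd.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <netinet/in.h>

    int main(void) {
        /* Bind a privileged port (requires root). */
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(80);
        if (fd < 0 || bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("bind");
            return 1;
        }

        /* Look up the unprivileged account; "svc-demo" is hypothetical. */
        struct passwd *pw = getpwnam("svc-demo");
        if (pw == NULL) {
            fprintf(stderr, "no such user\n");
            return 1;
        }

        /* Drop privileges; order matters: supplementary groups, then GID,
           then UID. After a successful setuid(), root cannot be regained. */
        if (setgroups(0, NULL) < 0 || setgid(pw->pw_gid) < 0 ||
            setuid(pw->pw_uid) < 0) {
            perror("drop privileges");
            return 1;
        }

        /* From here on, a compromise of the request-handling code yields
           only the unprivileged account, not root. */
        printf("listening as uid %d\n", (int)getuid());
        return 0;
    }

The same pattern underlies privilege-separated daemons such as OpenSSH, where only a small root-owned process remains after startup.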
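
The type-safety lesson is just as easy to see in a few lines of C. The sketch below is a hypothetical illustration of the class of flaw involved -- the Worm's fingerd exploit abused an unchecked gets() call of essentially this shape -- and is not code from the Worm or from fingerd:

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: strcpy() performs no bounds check, so input longer than
       15 bytes overruns buf and can corrupt the stack, including the
       saved return address -- the essence of a stack-smashing attack. */
    void unsafe_copy(const char *input) {
        char buf[16];
        strcpy(buf, input);
        printf("%s\n", buf);
    }

    /* Safer: snprintf() is bounded and always NUL-terminates. */
    void safer_copy(const char *input) {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", input);
        printf("%s\n", buf);
    }

    int main(void) {
        const char *attack = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA"; /* 32 bytes */
        safer_copy(attack);   /* truncates safely */
        unsafe_copy(attack);  /* undefined behavior: stack corruption */
        return 0;
    }

Bounds-checked interfaces, compiler hardening, and memory-safe languages all exist precisely to take this class of error off the table; the point is that the flaw was well understood in 1988 and still appears in current CVE entries.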

That last point is important as we debate the dangers and adverse side effects of machine learning/LLM/AI systems. Those are being refined and deployed by people claiming they are not responsible for the (mis)use of (or errors in) those systems and that their economic potential outweighs any social costs. We have failed to understand and internalize that not everything that can be done should be done, especially on the Internet at large. The issue keeps recurring, and we continue to fail to address it properly.

As a field, cybersecurity is relatively young. We have a history that arguably starts in the 1960s with the Ware Report. We are still discovering what is involved in protecting systems, data privacy, and safety. Heck, we still lack a commonly accepted definition of what cybersecurity entails! (Cf. Chapter 1 of the Cybersecurity Myths book, referenced below.) The first cybersecurity degree program wasn't established until 2000 (at Purdue). We still lack useful metrics to know whether we are making significant progress and to titrate investment. And we are still struggling with tools and techniques to create and maintain secure systems. All this while the market (and thus the need) is expanding globally.

In that context of growth and need, we should not dismiss the past as "ho-hum, history." Members of the military study historic battles to avoid repeating mistakes: mention the Punic Wars or the Battle of Thermopylae to such a scholar, and you will not be dismissed with "Not relevant." If you are interested in cybersecurity, it would be advisable to study some history of the field and think about the lessons learned -- and unlearned.


Further Reading

The Ware Report
This can be seen as one of the first descriptions of cybersecurity challenges, needs, and approaches.
The protection of information in computer systems
A 1975 paper by J.H. Saltzer and M.D. Schroeder. It describes basic design principles, inspired in large part by Multics, that include complete mediation (now somewhat captured by "Zero Trust") and least privilege. These principles -- especially economy of mechanism -- are more often violated by software than designed in.
(Versions of this paper may be found outside the paywall via web search engines.)
Historical papers archive
A collection of historical papers presenting the early foundation of cybersecurity. This includes the Ware Report, and its follow-on, the Anderson Report. Some other, hard-to-find items are here.
The Communications of the ACM Worm Issue
An issue of CACM was devoted to papers about the Worm.
The Internet Worm: An Analysis
My full report analyzing what the Worm program did and how it was structured.
The Internet Worm Incident
A report describing the timeline of the Worm release, spread, discovery, and response.
Happy birthday, dear viruses
This is a short article in Science I coauthored with Richard Ford for the 25th anniversary of the Worm, about malware generally.
Cybersecurity Myths and Misconceptions
A new book about things the public and even cybersecurity experts mistakenly believe about cybersecurity. Chapter 1 addresses, in depth, how we do not have an accepted definition of cybersecurity or metrics to measure it. Other items alluded to in this blog post are also addressed in the book.
Cyber security challenges and windmills
One of my blog posts, from 2009, about how we continue to generate studies of what would improve cybersecurity and then completely fail to heed them. The situation has not improved in the years since then.

Comments

Posted by Dale
on Wednesday, November 15, 2023 at 11:48 AM

The Morris Worm was an interesting problem in its day. The company I worked for at that time had no worries ... They were running BTOS/CTOS. However, Y2K killed those OSes off.

You never really know what tech is going to last. Like Unidata/uniVerse/Pick etc ...

My only advice to those that care to listen. If it doesn’t scale, it doesn’t matter.

Think of the security challenges when wasm becomes the operating system of choice for IoT.

Just an old gray geek - YMMV.

Posted by Eli Flanagan
on Wednesday, November 15, 2023 at 02:05 PM

Thank you for creating this short and insightful post reflecting on the history of the Morris Worm and, more broadly, on cybersecurity as a discipline. I find such reflections on technology quite absent from where I stand in industry.

> Many organizations are still producing or reusing software written in C or C++ that are especially prone to such errors. Sadly, higher education is complicit by teaching those languages as primary, mainly because their graduates may not be employable without them

I have zilch experience evaluating undergraduate and graduate curricula, though I take heart that languages like Rust have garnered industry and enthusiast adoption. I've personally found Rust useful for seeing how language design contributes to a better mental model of building more secure systems.
