The Center for Education and Research in Information Assurance and Security (CERIAS)

Reports and Papers Archive



Parallel Collision Search with Application to Hash Functions and Discrete Logarithms

Paul C. van Oorschot, Michael J. Wiener

Current techniques for collision search with feasible memory requirements involve pseudo-random walks through some space, where one must wait for the result of the current step before the next step can begin. These techniques are serial in nature, and direct parallelization is inefficient. We present a simple new method of parallelizing collision search that greatly extends the reach of practical attacks. The new method is illustrated with applications to hash functions and discrete logarithms in cyclic groups. In the case of hash functions, we begin with two messages; the first is a message that we want our target to digitally sign, and the second is a message that the target is willing to sign. Using collision search adapted for finding hash collisions, one can find slightly altered versions of these messages such that the two new messages give the same hash result. As a particular example, a $10 million custom machine for applying parallel collision search to the MD5 hash function could complete an attack with an expected run time of 24 days. This machine would be specific to MD5, but could be used for any pair of messages. For discrete logarithms in cyclic groups, ideas from Pollard's rho and lambda methods for index computation are combined to allow efficient parallel implementation using the new method. As a concrete example, we consider an elliptic curve cryptosystem over GF(2^155) with the order of the curve having a largest prime factor of approximate size 10^36. A $10 million machine custom built for this finite field could compute a discrete logarithm with an expected run time of 36 days.
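
A common way to parallelize such pseudo-random walks is to have each worker run a trail until it reaches a "distinguished" point and report that point to a shared table; two trails that end at the same distinguished point can then be re-walked to locate a collision. The sketch below illustrates that idea, single-threaded and on a deliberately truncated hash. The 32-bit truncation, trail parameters, and helper names are assumptions made for this example, not details taken from the paper.

    # Illustrative sketch of collision search via distinguished points.
    # In a real parallel deployment, many processors would run trails and
    # report distinguished points to a central table; here it is sequential.
    import hashlib
    import random

    BITS = 32                      # truncated output size (toy value)
    MASK = (1 << BITS) - 1
    DIST_BITS = 8                  # "distinguished" = low 8 bits are zero

    def f(x: int) -> int:
        """Pseudo-random walk step: truncated SHA-256 of the current point."""
        digest = hashlib.sha256(x.to_bytes(8, "big")).digest()
        return int.from_bytes(digest[:4], "big") & MASK

    def is_distinguished(x: int) -> bool:
        return x & ((1 << DIST_BITS) - 1) == 0

    def run_trail(seed: int, limit: int = 1 << 20):
        """Walk from a seed until a distinguished point is reached."""
        x, steps = seed, 0
        while not is_distinguished(x) and steps < limit:
            x = f(x)
            steps += 1
        return x, steps

    def locate_collision(a, steps_a, b, steps_b):
        """Re-walk two trails that end at the same distinguished point and
        return two distinct inputs that map to the same output."""
        while steps_a > steps_b:           # advance the longer trail first
            a, steps_a = f(a), steps_a - 1
        while steps_b > steps_a:
            b, steps_b = f(b), steps_b - 1
        while f(a) != f(b):                # step together until the trails meet
            a, b = f(a), f(b)
        return a, b

    def collision_search():
        table = {}                         # distinguished point -> (seed, length)
        while True:
            seed = random.getrandbits(BITS)
            end, steps = run_trail(seed)
            if end in table and table[end][0] != seed:
                prev_seed, prev_steps = table[end]
                a, b = locate_collision(prev_seed, prev_steps, seed, steps)
                if a != b and f(a) == f(b):
                    return a, b
            table[end] = (seed, steps)

    if __name__ == "__main__":
        a, b = collision_search()
        print(f"collision: f({a:#x}) == f({b:#x}) == {f(a):#x}")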

Added 2002-07-26

Fuzz Revisited: A Re-examination of the Reliability of UNIX Utilities and Services

Barton P. Miller, David Koski, Cjin Pheow Lee, Vivekananda Maganty, Ravi Murthy, Ajitkumar Natarajan, Jeff Steidl

We have tested the reliability of a large collection of basic UNIX utility programs, X-Window applications and servers, and network services. We used a simple testing method of subjecting these programs to a random input stream. Our testing methods and tools are largely automatic and simple to use. We tested programs on nine versions of the UNIX operating system, including seven commercial systems and the freely available GNU utilities and Linux. We report which programs failed on which systems, and identify and categorize the causes of these failures. The results of our testing are that we can crash (with core dump) or hang (infinite loop) over 40% (in the worst case) of the basic programs and over 25% of the X-Window applications. We were not able to crash any of the network services that we tested nor any of the X-Window servers. This study parallels our 1990 study (which tested only the basic UNIX utilities); all systems that we compared between 1990 and 1995 noticeably improved in reliability, but still had significant rates of failure. The reliability of the basic utilities from GNU and Linux was noticeably better than that of the commercial systems. We also tested how utility programs checked the return codes from the memory allocation library routines by simulating the unavailability of virtual memory. We could crash almost half of the programs that we tested in this way.
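
The core of the method described above is simply feeding random bytes to a program and watching for crashes or hangs. The following is a minimal sketch of that idea; the utilities listed, input sizes, and timeout are placeholder assumptions for the example, not the study's actual harness.

    # Minimal sketch of fuzz testing command-line utilities with random input.
    import random
    import subprocess

    UTILITIES = ["cat", "sort", "uniq"]   # placeholder targets; any filter works
    MAX_BYTES = 100_000
    TIMEOUT_SECONDS = 5                    # treat anything slower as a hang

    def random_input(size: int) -> bytes:
        """Random byte stream, including NULs and control characters."""
        return bytes(random.getrandbits(8) for _ in range(size))

    def fuzz_once(program: str) -> str:
        data = random_input(random.randint(1, MAX_BYTES))
        try:
            proc = subprocess.run(
                [program], input=data,
                stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
                timeout=TIMEOUT_SECONDS,
            )
        except subprocess.TimeoutExpired:
            return "hang"
        except FileNotFoundError:
            return "missing"
        # A negative return code means the process died from a signal (e.g. SIGSEGV).
        return "crash" if proc.returncode < 0 else "ok"

    if __name__ == "__main__":
        for utility in UTILITIES:
            results = [fuzz_once(utility) for _ in range(10)]
            print(utility, {r: results.count(r) for r in set(results)})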

Added 2002-07-26

A Proposal for a Postgraduate Curriculum in Information Security, Dependability and Safety

Sokratis K. Katsikas, Dimitris A. Gritzalis

A proposal for a postgraduate (MSc-type) programme in Information Security, Dependability and Safety is described in detail in this report; however, implementation issues have not been included. The programme description (syllabus) proposed is full in the academic sense, i.e. it includes the programme's overall structure, as well as course listings, credits per course, academic prerequisites, degree prerequisites, indicative textbooks per course, a list of existing similar courses, course timings, etc.

Added 2002-07-26

Network Law Update: February 1995

William J. Cook

The headline in the New York Times was clear, “Pirated Copies of Latest Software From IBM, Others Posted on the Internet” (NYT 10/31/94). The ramifications were simple: the market for a new computer program can be quickly destroyed if it is posted on the Internet. Most new programs will enjoy at least a six-month marketing “shelf life”. But a million-dollar computer program created on Monday, stolen and uploaded to the Internet on Tuesday, can be worthless by Friday. The victim author-programmer may not even know his new, “heater” program now resides on a publicly accessible, anonymous file server in France. Nevertheless, his (and your) projected six-month marketing window has now shrunk to four days.

Added 2002-07-26

Computer Viruses: A Global Perspective

Steve R. White, Jeffrey O. Kephart, David M. Chess

Technical accounts of computer viruses usually focus on the microscopic details of individual viruses: their structure, their function, the type of host programs they infect, etc. The media tend to focus on the social implications of isolated scares. Such views of the virus problem are useful, but limited in scope. One of the missions of IBM's High Integrity Computing Laboratory is to understand the virus problem from a global perspective, and to apply that knowledge to the development of anti-virus technology and measures. We have employed two complementary approaches: observational and theoretical virus epidemiology. Observation of a large sample population for six years has given us a good understanding of many aspects of virus prevalence and virus trends, while our theoretical work has bolstered this understanding by suggesting some of the mechanisms that govern the behavior we have observed.
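
One standard model from epidemiology of the kind such theoretical work draws on is the susceptible-infected-susceptible (SIS) model, whose key prediction is a spread threshold. The sketch below is only a numerical illustration of that generic model, not the laboratory's actual analysis; the birth rate, cure rate, and step size are illustrative values.

    # Euler integration of the SIS model: di/dt = beta*i*(1-i) - delta*i.
    # Infection persists only when beta/delta > 1 (the epidemic threshold);
    # otherwise it dies out, matching the intuition that a virus spreads only
    # if each infected machine infects more than one other before cleanup.
    def simulate_sis(birth_rate: float, cure_rate: float,
                     i0: float = 0.001, dt: float = 0.01, steps: int = 20000) -> float:
        """Return the final infected fraction after integrating the SIS model."""
        i = i0
        for _ in range(steps):
            i += dt * (birth_rate * i * (1.0 - i) - cure_rate * i)
            i = min(max(i, 0.0), 1.0)   # keep the fraction in [0, 1]
        return i

    if __name__ == "__main__":
        print("below threshold:", round(simulate_sis(birth_rate=0.5, cure_rate=1.0), 4))
        print("above threshold:", round(simulate_sis(birth_rate=2.0, cure_rate=1.0), 4))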

Added 2002-07-26

Asynchronous Optimistic Rollback Recovery Using Secure Distributed Time

Sean W. Smith, David B. Johnson, J. D. Tygar

In an asynchronous distributed computation, processes may fail and restart from saved state. A protocol for “optimistic rollback recovery” must recover the system when other processes may depend on lost states at failed processes. Previous work has used forms of partial order clocks to track potential causality. Our research addresses two crucial shortcomings: the rollback problem also involves tracking a second level of partial order time (potential knowledge of failures and rollbacks), and protocols based on partial order clocks are open to inherent security and privacy risks. We have developed a “distributed time” framework that provides the tools for multiple levels of time abstraction, and for identifying and solving the corresponding security and privacy risks. This paper applies our framework to the rollback problem. We derive a new optimistic rollback recovery protocol that provides “completely asynchronous” recovery (thus directly supporting concurrent recovery and tolerating network partitions) and that enables processes to take full advantage of their maximum potential knowledge of orphans (thus reducing the worst-case bound on asynchronous recovery after a single failure from exponential to at most one rollback per process). By explicitly tracking and utilizing both levels of partial order time, our protocol substantially improves on previous work in optimistic recovery. Our work also provides a foundation for incorporating security and privacy in optimistic rollback recovery.
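
As a generic reminder of what a partial order clock does, the sketch below implements ordinary vector clocks and the "happened-before" comparison they induce. This is not the paper's distributed time framework or its second level of time; it only illustrates the kind of causality tracking the abstract refers to, and the process names are made up for the example.

    # Minimal vector clocks: one counter per process, componentwise merge.
    from typing import Dict

    Clock = Dict[str, int]

    def tick(clock: Clock, process: str) -> Clock:
        """Local event: advance this process's own component."""
        updated = dict(clock)
        updated[process] = updated.get(process, 0) + 1
        return updated

    def merge(receiver: Clock, message: Clock, process: str) -> Clock:
        """Message receipt: take the componentwise maximum, then tick locally."""
        merged = {p: max(receiver.get(p, 0), message.get(p, 0))
                  for p in set(receiver) | set(message)}
        return tick(merged, process)

    def happened_before(a: Clock, b: Clock) -> bool:
        """True iff event a potentially causally precedes event b."""
        keys = set(a) | set(b)
        return (all(a.get(k, 0) <= b.get(k, 0) for k in keys)
                and any(a.get(k, 0) < b.get(k, 0) for k in keys))

    if __name__ == "__main__":
        p_send = tick({}, "p")                      # send event at process p
        q_recv = merge(tick({}, "q"), p_send, "q")  # q receives p's message
        print(happened_before(p_send, q_recv))      # True
        print(happened_before(q_recv, p_send))      # False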

Added 2002-07-26

An Evening with Berferd In Which a Cracker is Lured, Endured, and Studied

Bill Cheswick

On 7 January 1991 a cracker, believing he had discovered the famous sendmail DEBUG hole in our Internet gateway machine, attempted to obtain a copy of our password file. I sent him one. For several months we led this cracker on a merry chase in order to trace his location and learn his techniques. This paper is a chronicle of the cracker's “successes” and disappointments, the bait and traps used to lure and detect him, and the chroot “jail” we built to watch his activities. We concluded that our cracker had a lot of time and persistence, and a good list of security holes to use once he obtained a login.

Added 2002-07-26

There Be Dragons

Steven M. Bellovin

Our security gateway to the Internet, research.att.com, provides only a limited set of services. Most of the standard servers have been replaced by a variety of trap programs that look for attacks. Using these, we have detected a wide variety of pokes, ranging from simple attempts to log in as “guest” to forged NFS packets. We believe that many other sites are being probed but are unaware of it: the standard network daemons do not provide administrators with the controls and filters or with the logging necessary to detect attacks.
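
For readers unfamiliar with the trap-program idea, the sketch below is an illustrative stand-in: a listener on a normally unused port that accepts connections, logs who connected and what they sent, and offers nothing. The port number and log format are placeholder choices, not details of the gateway described above.

    # Illustrative "trap" service: log connection attempts to an unused port.
    import datetime
    import socket

    TRAP_PORT = 2323   # placeholder port one might watch instead of a real service

    def run_trap(port: int = TRAP_PORT) -> None:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
            server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            server.bind(("0.0.0.0", port))
            server.listen(5)
            print(f"trap listening on port {port}")
            while True:
                conn, (peer_host, peer_port) = server.accept()
                with conn:
                    conn.settimeout(10)
                    try:
                        first_bytes = conn.recv(256)   # capture the probe's opening bytes
                    except socket.timeout:
                        first_bytes = b""
                    stamp = datetime.datetime.now().isoformat(timespec="seconds")
                    print(f"{stamp} probe from {peer_host}:{peer_port} sent {first_bytes!r}")

    if __name__ == "__main__":
        run_trap()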

Added 2002-07-26

Software Metrics and Plagiarism Detection

Geoff Whale

The reliability of plagiarism detection systems, which try to identify similar programs in large populations, is critically dependent on the choice of program representation. Software metrics conventionally used as representations are described, and the limitations of metrics adapted from software complexity measures are outlined. An application-specific metric is proposed, one that represents the structure of a program as a variable-length profile. Its constituent terms, each recording the control structures in a program fragment, are ordered for efficient comparison. The superior performance of the plagiarism detection system based on this profile is reported, and deriving complexity measures from the profile is discussed.
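
As a loose analogue of representing a program by a profile of its control structures, the sketch below builds such a profile for Python source and compares two profiles. The original metric is defined for a different setting and ordering, so the node types and similarity measure here are assumptions made purely for illustration.

    # Profile a program by its control-structure counts and compare profiles.
    import ast
    from collections import Counter

    CONTROL_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.With, ast.FunctionDef)

    def control_profile(source: str) -> Counter:
        """Count control-structure node types appearing in the source."""
        tree = ast.parse(source)
        return Counter(type(node).__name__ for node in ast.walk(tree)
                       if isinstance(node, CONTROL_NODES))

    def similarity(a: Counter, b: Counter) -> float:
        """Shared control structure, 0 = disjoint profiles, 1 = identical."""
        shared = sum((a & b).values())       # componentwise minimum of counts
        total = max(sum(a.values()), sum(b.values()), 1)
        return shared / total

    if __name__ == "__main__":
        original = "def f(xs):\n    for x in xs:\n        if x > 0:\n            print(x)\n"
        renamed  = "def g(items):\n    for i in items:\n        if i > 0:\n            print(i)\n"
        # Identifier renaming leaves the control-structure profile unchanged.
        print(similarity(control_profile(original), control_profile(renamed)))  # 1.0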

Added 2002-07-26

A Programming Style Taxonomy

Paul W. Oman, Curtis R. Cook

Programming style guidelines, style analyzers, and code formatters have been developed without a solid empirical or theoretical basis. In this paper we provide: (1) a justification for developing a programming style taxonomy, (2) an operational style taxonomy, (3) example applications of the taxonomy illustrating the diverse and sometimes contradictory nature of programming style guidelines, and (4) a discussion of how the taxonomy can be used to further teaching and research in programming style. The taxonomy provides a context for understanding and identifying specific style factors and the empirical studies necessary to determine the effects of style on program comprehension. The results of this paper have a direct impact on programming instruction, programming standards, automated style analyzers, and code formatting tools like pretty-printers and syntax editors.

Added 2002-07-26

A Style Analysis of C Programs

R. E. Berry, B. A. E. Meekings

A large quantity of well-respected software is tested against a series of metrics designed to measure program lucidity, with intriguing results. Although slanted toward software written in the C language, the measures are adaptable for analyzing most high-level languages.

Added 2002-07-26

A Taxonomy For Programming Style

Paul W. Oman, Curtis R. Cook

Programming style guidelines, style analyzers, and code formatters have been developed without a solid empirical or theoretical basis. In this paper we provide: (1) a justification for developing a programming style taxonomy, (2) an operational style taxonomy, (3) example applications of the taxonomy illustrating the diverse and sometimes contradictory nature of programming style guidelines, and (4) a discussion on how the taxonomy can be used to further teaching and research in programming style. The taxonomy provides a context for understanding and identifying specific style factors and empirical studies necessary to determine the effects of style on program comprehension. The results of this paper have a direct impact on programming instruction, programming standards, automated style analyzers, and code formatting tools like pretty-printers and syntax-directed editors.

Added 2002-07-26

Typographic Style is More than Cosmetic

Paul W. Oman, Curtis R. Cook

There is disagreement about the role and importance of typographic style (source code formatting and commenting) in program comprehension. Results from experiments and opinions in programming style books are mixed. This article presents principles of typographic style that are consistent and compatible with the results of program comprehension studies. Four experiments demonstrate that the typographic style principles embodied in the book format significantly aid program comprehension and reduce maintenance effort.

Added 2002-07-26

Style: An Automated Program Style Analyzer for Pascal

Al Lake, Curtis Cook

Programming style plays an important role in program understanding and maintenance. Studies [Par83] have shown that as much as one-half of a maintenance programmer's time is spent in activities related to understanding the program. Program understanding is also important for testing and debugging. Programming style enhances the readability of a program and hence improves its understandability.

Added 2002-07-26

An Empirical Study of COBOL Programs Via a Style Analyzer: The Benefits of Good Programming Style

Alan C. Benander, Barbara A. Benander

Despite its prominence as the most widely used programming language in industry, there are only a small number of publications on software metrics applied to COBOL. COBSTYLE, a one-pass, line-by-line style analyzer written in PL/I, is used to analyze 638 COBOL programs. COBSTYLE differs from other style analyzers in that it assesses penalty points for abuses in style and considers, to a larger degree, overall program structure. Mean style scores for 23 style characteristics are obtained. The data produced by COBSTYLE, together with programmer efficiency data, are statistically analyzed, yielding results which empirically demonstrate the benefits of good programming style. COBSTYLE scores are shown to have statistically significant correlations with the following: (a) overall performance (as measured by students' final course grade), at the 0.005 significance level; (b) program correctness, at the 0.01 significance level; and (c) total debugging time, at the 0.05 significance level. An important aspect of this study is the “after-the-fact” nature of the methodology; i.e., none of the participants in this experiment were aware that their programs would be submitted to a style analyzer for analysis.
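
To make the penalty-point idea concrete, the sketch below scores a source file line by line, adding penalty points for a few style abuses. The rules and weights are invented for illustration and target generic source text; they are not COBSTYLE's actual checks or scoring.

    # Generic sketch of a line-by-line, penalty-point style check.
    import sys

    MAX_LINE_LENGTH = 72          # illustrative limit
    MAX_INDENT_LEVELS = 4         # illustrative nesting limit (4 spaces per level assumed)

    def penalty_for_line(line: str) -> int:
        penalty = 0
        stripped = line.rstrip("\n")
        if len(stripped) > MAX_LINE_LENGTH:
            penalty += 2                           # overly long line
        indent = len(stripped) - len(stripped.lstrip(" "))
        if indent // 4 > MAX_INDENT_LEVELS:
            penalty += 3                           # deeply nested code
        if stripped != stripped.rstrip():
            penalty += 1                           # trailing whitespace
        return penalty

    def style_score(path: str) -> int:
        """Total penalty points for a source file; lower is better."""
        with open(path, encoding="utf-8", errors="replace") as handle:
            return sum(penalty_for_line(line) for line in handle)

    if __name__ == "__main__":
        for filename in sys.argv[1:]:
            print(f"{filename}: {style_score(filename)} penalty points")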

Added 2002-07-26