The Center for Education and Research in Information Assurance and Security (CERIAS)


Reports and Papers Archive



Privacy, Surveillance and the Real ID Act

CERIAS TR 2009-19
William F. Eyre
Download: PDF

American society in the present day is grappling with issues of privacy and surveillance. These issues, the technologies involved, and implications for the organization and function of American society are examined in this dissertation.

Public Law 109-13 contains the Real ID Act, whose implementation has far-reaching ramifications for Americans' privacy. The Real ID Act, an exemplar of recent laws regarding privacy and surveillance, serves as a basis for discussing the development of a surveillance society and its potential harm to American citizens.

The dissertation begins by framing the evolution of the concept of privacy in American society, exploring anti-terror legislation as the latest assumption of extraordinary powers by the state in times of war and national emergency, and comparing it with previous abridgements of enumerated Constitutional rights in such times.

It next discusses the implications of Real ID as an insecure collection of databases, and then examines its effect, as a de facto national identification card, on American citizens' privacy. States have resisted implementation of the act on the grounds that it constitutes an unfunded mandate and damages privacy.

The new surveillance system erodes personal privacy, creating threats to privacy and autonomy from criminals, from the government, or sometimes (through insider abuse of data) from both. The dissertation details how access to Real ID information can be used against people in ways both legal and illegal, with comparisons to Great Britain; it also questions whether the government is even capable of handling increased information resources, or whether such resources only provide more opportunities for improper access and misuse of personal data.

For most people, the developing surveillance state may only pose potential danger until someone is identified as a target, but its potential chilling effect threatens participatory democracy and the expression of legitimate political dissent.

The goal of this dissertation is to increase awareness of the incremental erosion of privacy rights which, once surrendered, become increasingly difficult to regain. It also aims to question some of the security assumptions that justify this erosion.

Added 2009-03-13

Information Systems Security: Requirements and Practices

National Institute of Standards and Technology
Added 2009-03-05

Fixed vs. Variable-Length Patterns for Detecting Suspicious Process Behavior

Andreas Wespi, Hervé Debar, Marc Dacier, Mehdi Nassehi
Added 2009-03-05

Myths and Realities of Cyberterrorism

Peter Flemming and Michael Stohl
Added 2009-03-05

Rule Based Detection System

Johnny Wong, Les Miller, Vasant Honavar, Guy Helmer, Amit Lamba
Added 2009-03-05

Injector: Mining Background Knowledge for Data Anonymization

CERIAS TR 2008-29
Tiancheng Li, Ninghui Li
Download: PDF

Existing work on privacy-preserving data publishing cannot satisfactorily prevent an adversary with background knowledge from learning important sensitive information. The main challenge lies in modeling the adversary's background knowledge. We propose a novel approach to deal with such attacks. In this approach, one first mines knowledge from the data to be released and then uses the mining results as the background knowledge when anonymizing the data. The rationale of our approach is that if certain facts or background knowledge exist, they should manifest themselves in the data, and we should be able to find them using data mining techniques. One intriguing aspect of our approach is that it improves both privacy and utility at the same time: it protects against background knowledge attacks while better preserving the features in the data. We then present the Injector framework for data anonymization. Injector mines negative association rules from the data to be released and uses them in the anonymization process. We also develop an efficient anonymization algorithm that incorporates the mined background knowledge when computing the injected tables. Experimental results show that Injector reduces privacy risks against background knowledge attacks while improving data utility.
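The mining step the abstract describes — extracting negative association rules (value pairs that co-occur far less often than independence would predict) from the table to be released — can be sketched roughly as below. This is an illustrative simplification, not the paper's actual algorithm: the function name, thresholds, and toy table are all assumptions made for the example.

```python
# Hedged sketch: mine simple pairwise negative associations from a
# table prior to anonymization. A pair (a, b) is flagged when both
# values are common individually, yet their observed co-occurrence
# is far below the independence estimate P(a) * P(b).
from itertools import combinations

def mine_negative_pairs(records, min_support=0.1, max_lift=0.2):
    """Return value pairs whose co-occurrence is far below what
    independent occurrence would predict (illustrative thresholds)."""
    n = len(records)
    single = {}   # value -> record count
    pair = {}     # (a, b) with a < b -> co-occurrence count
    for rec in records:
        vals = sorted(set(rec))
        for v in vals:
            single[v] = single.get(v, 0) + 1
        for a, b in combinations(vals, 2):
            pair[(a, b)] = pair.get((a, b), 0) + 1
    negatives = []
    for a, b in combinations(sorted(single), 2):
        pa, pb = single[a] / n, single[b] / n
        if pa < min_support or pb < min_support:
            continue  # skip values too rare to support a rule
        observed = pair.get((a, b), 0) / n
        expected = pa * pb  # co-occurrence rate if independent
        if observed < max_lift * expected:
            negatives.append((a, b))  # a strongly disfavors b
    return negatives

# Toy (gender, diagnosis) table, invented for the example.
records = [
    ("male", "flu"), ("male", "flu"), ("male", "cold"),
    ("female", "cold"), ("female", "cold"), ("female", "flu"),
    ("male", "flu"), ("female", "cold"), ("male", "cold"),
    ("female", "ovarian-cancer"),
]
print(mine_negative_pairs(records))
```

On this toy table the pair ("male", "ovarian-cancer") surfaces as a negative association, which is the kind of mined fact an anonymizer could then treat as adversarial background knowledge when forming the injected tables.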

Added 2009-03-04

How Much is Enough? A Risk-Management Approach to Computer Security

Stanford University-Center for International Security and Cooperation
Added 2009-02-26


Briefing Booklet

Information Security Oversight Office
Added 2009-02-26