Poster Competition Winners

Congratulations to the winners of the 7th Annual Information Security Symposium Poster Competition! We are proud to be associated with the following students whose posters best reflect the intellect and talent CERIAS fosters within its research and education programs.

The judges, all members of CERIAS' External Advisory Board, lauded the high level of quality in the posters. It took multiple rounds of evaluation to come up with the following top three.


Assurable Software and Architectures

Cryptology and Rights Management

Enclave and Network Security

Identification, Authentication and Privacy

Incident Detection, Response, and Investigation

Risk Management, Policies and Laws

Security Awareness, Education and Training

Trusted Social and Human Interactions


A Dynamic Publish-subscribe Overlay Network

Enclave and Network Security
Yunhua Koglin, Elisa Bertino
Previous research on publish-subscribe overlay services mainly focuses on static networks where connectivity is reliable. However, such an assumption does not hold for large-scale publish-subscribe overlay services, in which brokers often join or leave the system or the connectivity is unreliable. An approach to this problem requires a system able to tolerate frequent topological reconfigurations. Besides enhancing availability, such an approach must meet other security requirements, namely confidentiality and integrity. In this work, we propose such an approach. In particular, we propose a secure publish-subscribe overlay network that tolerates frequent topological reconfigurations. Instead of routing events on a tree-based overlay network, which involves complex computation when used in large-scale, dynamic environments, our approach uses an undirected graph-based overlay network, and we propose a scheme for propagating the routing information in such a network. This routing scheme is desirable in dynamic environments because it requires only local information. To enhance the security and scalability of our system, we propose a two-way certificate verification scheme that uses certificate-based encryption (CBE). Another important advantage of our approach is that it enhances the flexibility of the publication process: a publisher does not need to rely on a fixed broker and can dynamically change brokers. Besides enforcing security, our approach provides better accounting, another important requirement for such systems.
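To make the routing idea concrete, here is a minimal Python sketch of subscription flooding and event forwarding on an undirected broker graph, using only each broker's local state. The class, method, and topic names are hypothetical; the poster's actual scheme, its CBE-based certificate checks, and its accounting service are not modeled.

```python
from collections import defaultdict
from itertools import count

class Broker:
    """One node in an undirected broker overlay (hypothetical sketch)."""
    _ids = count()                          # event-id source for the demo

    def __init__(self, name):
        self.name = name
        self.neighbors = []                 # direct links only: local information
        self.interest = defaultdict(set)    # topic -> {neighbor, or None = local client}
        self.seen = set()                   # event ids already handled (local state)

    def link(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)

    def subscribe(self, topic, via=None):
        # Remember who asked, then tell every other neighbor; a duplicate
        # subscription stops the flood, so cycles in the graph terminate.
        if via in self.interest[topic]:
            return
        self.interest[topic].add(via)
        for n in self.neighbors:
            if n is not via:
                n.subscribe(topic, via=self)

    def publish(self, topic, payload, event_id=None):
        eid = event_id if event_id is not None else next(Broker._ids)
        if eid in self.seen:                # per-broker dedup handles cycles
            return
        self.seen.add(eid)
        for hop in self.interest.get(topic, ()):
            if hop is None:
                print(f"{self.name}: deliver {payload!r} to local subscriber")
            else:
                hop.publish(topic, payload, eid)

a, b, c = Broker("A"), Broker("B"), Broker("C")
a.link(b); b.link(c); c.link(a)             # cyclic topology
a.subscribe("alerts")                       # local client at A
b.publish("alerts", "disk failure")         # delivered exactly once at A
```

Because a broker stops re-propagating a subscription it has already recorded and deduplicates event identifiers it has already forwarded, both floods terminate even when the undirected graph contains cycles.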

A Policy Engineering Framework for Federated Access Management

Assurable Software and Architectures
Rafae Bhatti, Elisa Bertino, Arif Ghafoor
Federated systems require access management policies that not only protect user privacy and resource security but also allow scalable and seamless interoperation. Current solutions to distributed access control generally fail to address both dimensions of the problem simultaneously. This research has been aimed at developing a policy-engineering framework, called xFederate, for the specification and enforcement of access management policies in federated systems. The framework comprises an access control language specification that extends the well-accepted Role Based Access Control (RBAC) standard. The language extends RBAC to incorporate various features essential for federated access management. Additionally, it incorporates well-known principles of software engineering into the process of policy design to allow the development of modular and flexible policy definitions. The work also includes the design of an administrative model targeted at access control policy administration in a decentralized environment. xFederate has been applied to support the design and administration of access management policies in many practical settings, such as Web services, multi-domain enterprises, federated libraries, and health care services.

AC-Framework for Privacy-Preserving Collaboration

Trusted Social and Human Interactions
Wei Jiang, Chris Clifton
The secure multi-party computation (SMC) model provides means for balancing the use and confidentiality of distributed data. Increasing security concerns have led to a surge in work on practical secure multi-party computation protocols. However, most are proven secure only under the semi-honest model, and security under this adversary model is insufficient for most applications. In this poster, we present a novel framework, accountable computing (the AC-framework), that is sufficient and practical for many applications without the complexity and cost of an SMC protocol under the malicious model.

ADEPTS: Automated Adaptive Intrusion Response

Assurable Software and Architectures
Yu-Sung Wu, Bingrui Foo, Matthew Glause, Saurabh Bagchi, Eugene Spafford

Large-scale distributed systems typically have interactions among different services that create an avenue for the propagation of a failure from one service to another. The failures being considered may be the result of natural faults or malicious activity, collectively called disruptions. To make these systems tolerant to failures, it is necessary to contain the spread of a disruption automatically once it is detected. The objective is to allow certain parts of the system to continue to provide partial functionality in the face of failures. Real-world situations impose several constraints on the design of such a disruption-tolerant system, of which we consider the following: the alarms may have type I or type II errors; it may not be possible to change a service itself even though its interactions may be changed; attacks may use steps that are not anticipated a priori; and there may be bursts of concurrent alarms.

We present the design and implementation of a system named ADEPTS as the realization of such a disruption tolerant system. ADEPTS uses a directed graph representation to model the spread of the failure through the system, presents algorithms for determining appropriate responses and monitoring their effectiveness, and quantifies the effect of disruptions through a high level survivability metric. ADEPTS is demonstrated on a real e-commerce testbed with actual attack patterns injected into it.
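As a rough illustration of the graph-based containment idea, the sketch below computes where responses could be deployed by plain reachability over a made-up service-interaction graph. ADEPTS' actual graphs, response-selection algorithms, and survivability metric are considerably richer.

```python
from collections import deque

# Hypothetical service-interaction graph: an edge u -> v means a disruption
# in u can propagate to v (illustration only).
edges = {
    "web": ["app"],
    "app": ["db", "payments"],
    "db": [],
    "payments": ["db"],
}

def containment_frontier(graph, detected):
    """Return the set of services a detected disruption could reach,
    i.e. candidate places to deploy responses (blocking, isolation)."""
    reach, frontier = set(), deque([detected])
    while frontier:
        u = frontier.popleft()
        for v in graph.get(u, ()):
            if v not in reach:
                reach.add(v)
                frontier.append(v)
    return reach

print(containment_frontier(edges, "app"))   # {'db', 'payments'}
```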

An Efficient Time Bound Hierarchical Key Management Scheme for Secure Broadcasting of XML Documents

Cryptology and Rights Management
Elisa Bertino, Ning Shang, Samuel S. Wagstaff, Jr.
A time-bound key management scheme for secure broadcasting of XML documents was proposed by E. Bertino et al. in 2002, in which a method due to Tzeng was suggested. However, this method was found to be insecure in 2004. We propose a new key assignment scheme for access control that is both efficient and secure.
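The new scheme itself is not reproduced here, but the general time-bound idea can be sketched with a classical two-hash-chain construction (illustrative only, with made-up parameters): the key for period t is derived from a forward chain and a backward chain, so a user cleared for an interval [t1, t2] can derive exactly the keys inside that interval.

```python
import hashlib

def h(x, n=1):
    """Apply SHA-256 n times (n = 0 returns x unchanged)."""
    for _ in range(n):
        x = hashlib.sha256(x).digest()
    return x

T = 16                                  # number of time periods (assumption)
a, b = b"forward-secret", b"backward-secret"

def period_key(t):
    # Server side: K_t = H( h^t(a) || h^(T-t)(b) )
    return hashlib.sha256(h(a, t) + h(b, T - t)).hexdigest()

def user_material(t1, t2):
    # A user cleared for [t1, t2] receives h^t1(a) and h^(T-t2)(b);
    # hashing only moves forward, so keys outside the interval stay out of reach.
    return h(a, t1), h(b, T - t2)

def user_key(material, t1, t2, t):
    fa, bb = material
    assert t1 <= t <= t2
    return hashlib.sha256(h(fa, t - t1) + h(bb, t2 - t)).hexdigest()

m = user_material(3, 9)
assert user_key(m, 3, 9, 5) == period_key(5)   # in-interval keys match
```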

Anti-forensics: The Coming Wave in Digital Forensics

Incident Detection, Response, and Investigation
Marc Rogers
The current field of digital forensic science finds itself in an "arms race" with the underground/criminal element. As we strive to educate more people on the processes and theories of digital forensics, the "other side" is using this information as intelligence to disrupt the digital forensic investigative process. This poster introduces the concept of "anti-forensics" and briefly examines this evolving phenomenon.

Assessment of Safety Processes in the Community Hospital Setting

Security Awareness, Education and Training
James G. Anderson, PhD, Ranga Ramanujam, PhD, Kathleen Abrahamson, RN, MA, Trisha Palmer, Ganesh Kak
Patient safety is a critical issue in health care. The Institute of Medicine (2000) estimates that up to 98,000 deaths occur annually as a result of medical error. Medical errors are often preventable occurrences, with the most serious errors being the most preventable (Bates et al., 1995). Health care has traditionally taken an approach to error which assigns blame to involved individuals, as opposed to recognizing potentially dysfunctional systems of operation (Liang, 2002). The current project seeks to identify and improve processes for tracking errors and near misses in the Community Hospital setting. Through interviews with patient care personnel, a survey of nursing staff, and a comprehensive evaluation of past incidents, we hope to illuminate areas where safety and efficiency can be improved. The project aims to institute systems changes that increase knowledge and information flow regarding safety incidents, thus improving both the quality of institutional safety information and perceptions and attitudes regarding patient safety among hospital staff.

Automation and Realistic Topology Generation for Routing Experiments

Enclave and Network Security
Sonia Fahmy, Ness Shroff, David Bettis, Roman Chertov, Abdallah Khreish, Pankaj Kumar
Large-scale routing experiments on an emulation testbed require topology generation, extensive router configuration, and automated node control. Hence, it is important to have an infrastructure for fast experiment creation and automation when studying the indirect side effects of DDoS attacks on routing.

Biometric Credentialing for Natural Disasters

Identification, Authentication and Privacy
S. J. Elliott, Ph.D., M. Niang
The catastrophic chain of events during the humanitarian response to Hurricane Katrina brought to light the need for a mobile credentialing system. This poster provides a methodology for using biometric technology, specifically iris recognition, to credential individuals in natural disasters or terrorist attacks.

Biometrics and E-Authentication

Identification, Authentication and Privacy
Matthew R. Young, Shimon K. Modi, Stephen J. Elliott, Ph.D.
This paper outlines work being done in the Biometric Standards, Performance, and Assurance Laboratory in the field of biometrics and e-authentication. It examines the biometric community's work on remote authentication using biometrics. Issues discussed include biometric system architectures, the biometric system threat model, biometric system security issues, and biometric identifier revocation.

Conformance Testing of Access Control Systems that Employ Temporal RBAC Policies

Risk Management, Policies and Laws
Ammar Masood, Arif Ghafoor, Aditya Mathur
In order to control the time-sensitive activities present in applications such as workflow management systems and real-time databases, access control systems must be augmented with temporal constraints. One example of such a constraint is restricting a user's ability to activate a role to a pre-determined duration. The ability of the underlying access control mechanism to accurately enforce these constraints depends on the implementation's conformance with the specification and the absence of any violations in the implementation. It is therefore essential to assure that the underlying implementation realizes the given access control policy completely and has no additional unspecified functionality. We have proposed a model-based approach for testing access control systems without temporal constraints, which has proven quite effective in detecting program faults; we therefore plan to extend that approach to systems with temporal constraints. Specifically, our focus is to devise a model-based technique for conformance testing of access control systems that employ temporal Role Based Access Control (RBAC) policies.
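As a minimal illustration of one such temporal constraint (hypothetical role names and a duration-only check; real temporal RBAC models support far richer constraints), this is the kind of behavior a conformance test suite would exercise:

```python
from datetime import datetime, timedelta

# A role activation is valid only for a pre-determined duration after it is
# enabled (the single constraint type sketched here; illustration only).
ROLE_DURATION = {"doctor_on_call": timedelta(hours=8)}
activations = {}                                  # (user, role) -> activation time

def activate(user, role, now):
    activations[(user, role)] = now

def check_access(user, role, now):
    started = activations.get((user, role))
    if started is None:
        return False
    return now - started <= ROLE_DURATION[role]   # the duration constraint

# A model-based test would drive sequences like this and compare the
# implementation's answers against the policy model's expected answers.
t0 = datetime(2006, 3, 1, 8, 0)
activate("alice", "doctor_on_call", t0)
assert check_access("alice", "doctor_on_call", t0 + timedelta(hours=4))
assert not check_access("alice", "doctor_on_call", t0 + timedelta(hours=9))
```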

Course Mentor: Instructor Resource Materials for Security Courses

Security Awareness, Education and Training
Melissa Dark, Connie Justice, Linda Morales
Course Mentor is a methodology for developing instructional materials and associated teaching resources.

Critical Anthropometric & Ergonomic Elements for Reliable Hand Placement in Hand Geometry Based Authentication

Identification, Authentication and Privacy
Eric P. Kukula, Stephen J. Elliott, Ph.D.
The goal of this research is to provide a new, ergonomically designed biometric device, including an examination of the critical anthropometric and ergonomic elements that improve hand placement, and thus performance, for a hand geometry device. This area of research, called Human-Biometric Sensor Interaction (HBSI), is a new topic of interest in the biometric community. To date, however, a literature search has yielded little work on human factors with regard to biometric device design. Thus, developing a system that integrates ergonomics into the research and development of a biometric system would provide the community with an example for adapting other biometric devices by including ergonomics in their development. This work utilizes mixed methods, incorporating qualitative and quantitative research to meet this need. The development of the device will combine critical anthropometric elements with surveys, interviews, and focus groups drawn from four groups: technical and ergonomic experts; hand geometry users; the elderly; and those suffering from musculoskeletal disorders and other disabilities. The output of the qualitative analysis will be used to create a prototype device, which will be tested in a comparative study against existing systems to determine if improvements are realized.

Digital Forensics Learning Objects

Security Awareness, Education and Training
Nathan Bingham, Melissa Dark, Sam Liles, Rick Mislan, Marc Rogers, Matt Rose, Tim Wedge
This poster presents the goals, process, and deliverables of an educational project designed to educate college students and law enforcement in basic digital forensics principles.

Distributed Detection, Identification, and Tracking in Sensor-cyber Networks

Enclave and Network Security
Yu Dong
Sensor-cyber networks represent a natural integration of sensor networks, which monitor physical space, and cyber networks, which provide access to information and computational resources in virtual space. In national defense, there is a need for a near real-time incident management system for the detection, identification, and tracking (DIT) of plumes under different attack scenarios. In a number of DIT tasks, it has become crucial to utilize the multiple sensing modalities and information processing capabilities provided by sensor-cyber networks. The sensor networks provide information about the physical location, proximity, and movements of targets, and the cyber networks provide data and computational resources to align and fuse the sensor information for analysis and decision making. This class of problems represents a significant challenge, requiring approaches that extend beyond the conventional disciplinary boundaries between the two types of networks. There is thus an immediate need to develop comprehensive foundational methods and powerful testbeds to systematically solve DIT problems in sensor-cyber networks. As part of a multi-university partnership funded by Oak Ridge National Laboratory, and under the aegis of the national SensorNet initiative, the Laboratory for Advanced Networking Systems at Purdue is investigating system support for scalable, real-time, and robust performance in DIT sensor-cyber networks.

Dynamic Cryptographic Hash Functions

Cryptology and Rights Management
William R. Speirs II
This poster introduces a new type of cryptographic hash function, the dynamic cryptographic hash function, where the size of the digest can be selected.
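The poster's construction is not shown here, but the notion of a caller-selected digest size can be illustrated with an extendable-output function such as SHAKE (a later SHA-3 standard, used purely as an analogy for a hash whose output length is chosen at call time):

```python
import hashlib

# Illustration only: one hash function, three caller-chosen digest sizes.
msg = b"the same input"
for nbytes in (16, 32, 64):
    print(nbytes, hashlib.shake_256(msg).hexdigest(nbytes))
```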

Emulation vs. Simulation: A Case Study with DoS Attacks

Enclave and Network Security
Sonia Fahmy, Ness B. Shroff, Roman Chertov
Much networking research is conducted on simulators such as ns-2 and SSFNet. Simulators, however, cannot execute real applications and only approximate various appliances. Emulation provides a way to use real appliances and applications, but is constrained by the number of nodes, the types of appliances, and the difficulty of configuration. Therefore, it is imperative to accurately compare the two approaches so that the strengths of both can be harnessed.

Enabling Confidentiality of Data Delivery in an Overlay Broadcasting System

Enclave and Network Security
Ruben Torres, Xin Sun, Aaron Walters, Cristina Nita-Rotaru, Sanjay Rao
We present our experience enabling confidentiality of video delivery in an operational overlay multicast system by incorporating key management algorithms. While much past research on key management has focused on IP Multicast, we focus on new opportunities and issues that arise in the context of overlays. We have implemented key management algorithms in an operational overlay multicast system, and conducted a detailed performance evaluation using the Planetlab testbed. Our study is conducted using real traces of join/leave dynamics obtained from operational deployment of an overlay broadcasting system. Our results indicate that: (i) for moderate sized groups, it is feasible to combine existing key management algorithms and overlay dissemination structures to achieve confidentiality while maintaining good performance and incurring low average overheads; (ii) leveraging TCP in each hop of overlay dissemination structure can significantly simplify reliable key dissemination, and the performance can be enhanced if convergence properties of overlays are considered; and (iii) peak overheads with the system can be reduced and scalability enhanced by combining knowledge of the duration the node stayed in the group with key management algorithms, and with a modest weakening of security properties.
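For context on the rekeying costs such algorithms manage (a hedged back-of-envelope, not the authors' measurements): in a logical key hierarchy (LKH)-style scheme, a classic approach from the IP Multicast literature the abstract contrasts with overlays, one departure forces a rekey only of the keys on the leaving member's root path, giving logarithmic rather than linear cost.

```python
import math

def lkh_rekey_messages(n, arity=2):
    """Rough rekey message count after one departure from an n-member
    group under a key tree of the given arity: each key on the root path
    is re-distributed, encrypted once per sibling subtree."""
    depth = math.ceil(math.log(max(n, 2), arity))
    return depth * arity

for n in (100, 1000, 10000):
    print(n, "members -> ~", lkh_rekey_messages(n), "rekey messages (vs", n, "naive)")
```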

Evaluating the Organizational Process of Securing Information Assets from the Threat of Cyberattacks or Cyberterrorist Events: An Exploratory Study

Security Awareness, Education and Training
Evalyn Henderson
A phenomenon resulting from the Information Age trend of networking is that the world is becoming increasingly interlinked. Because of all this networking, organizations are becoming increasingly vulnerable targets for potential cyberterrorist attacks. An organization could pass the effects of cyberattacks or cyberterrorist events to any and all members connected to its network. Thus, there is a need to study the process that organizations face in securing their information assets in an environment where an increasing number of sophisticated and coordinated cyberattacks are probable. Current research tends to study cyberterrorism from a technical or behavioral perspective, with a preponderance of research leaning heavily toward the technical perspective. Within the existing literature, the development of constructs related to managerial issues in an environment where organizational information assets are threatened by cyberattacks or cyberterrorist events has not been extensively reported. Thus, the overall goal of this investigation is to enhance the understanding of the process of safeguarding critical organizational information assets against cyberattacks and cyberterrorism. The research design involved (1) developing an initial model (i.e., an initial mapping of the informal process of securing organizational information assets), (2) developing constructs, (3) testing and validating the operationalization of constructs for the study, and (4) developing and testing a structural model for evaluating whether or not the data substantiate the hypothesized process. Students were used as a pilot study group. Information security practitioners were used as the field study group. Testing and verifying the operationalization of the a priori hypothesized constructs required exploratory factor analysis. Structural equation modeling will be used to empirically test the theoretical model developed. Each of the nine constructs (i.e., asset evaluation, risk assessment, firm profitability, productivity, security budget, security initiative programs/countermeasures, role of government, awareness, and (impact upon) firm profitability) exhibited convergent validity as well as discriminant validity. This investigation may eventually provide a first step in empirically testing proposed changes in a process and in proposing/building theories for executing said changes, once the structural equation modeling phase has been applied to a diverse set of data. Currently, this investigation provides a first step, in the form of construct development, in extending security practitioners' understanding of cyberterrorism, their perceptions regarding the impact of cyberattacks or cyberterrorism, and their perceptions regarding safeguards implemented within organizations.

Examining the Usability of Web Privacy Policies

Risk Management, Policies and Laws
Robert W. Proctor, M. Athar Ali, Kim-Phuong L. Vu
Research was conducted to understand what types of information are included in privacy policies and to obtain metrics of users' comprehension of privacy policies and their attitudes toward the policies and their host sites. Privacy policies were evaluated for their usability using several methods. It was found that privacy policies need to address more protection goals. Privacy policies should also define the different categories of information, such as personally identifiable information (PII) and non-PII. It was found that privacy policies use such terms abundantly, but most users are unaware of the categorization. Privacy policies should define these categories and what information is considered part of each category.

Exploiting Security Punctuations to Enforce Security and Preserve Privacy in Data Stream Management

Assurable Software and Architectures
Rimma V. Nehme, Elke A. Rundensteiner, Elisa Bertino
So far, privacy and security in the context of streaming systems have largely been overlooked. We now tackle this important problem. Our work focuses on context-aware security and user-centric privacy preservation in data streaming systems by exploiting security constraints that are dynamically embedded into data streams. These constraints are called security punctuations. Specifically, we describe how fine-grained access control and delegation can be achieved in a DSMS using security punctuations, and how privacy can be preserved using our proposed solution. We propose novel query operators, termed Security Shield (SS) operators. An SS determines whether a data tuple is allowed to be forwarded further in the query plan based on the security privileges of the queries and the access control policy of the tuple. We provide an algorithm for shared query plan generation where queries may have different security restrictions but may share common query subplans. We implement the security punctuation framework within a real DSMS. Our experimental results show that our proposed solution incurs low overhead.
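A minimal sketch of the SS idea (hypothetical roles and stream encoding; the actual operators, delegation support, and shared-plan algorithm are more involved): the policy travels in-stream as punctuations, and the operator forwards each tuple only to sufficiently privileged queries.

```python
class SecurityShield:
    """Hypothetical Security Shield operator over an interleaved stream of
    ("punctuation", roles) policy items and ("tuple", data) items."""

    def __init__(self, query_privileges):
        self.query_privileges = query_privileges   # query id -> set of roles
        self.current_policy = set()                # roles allowed by latest punctuation

    def process(self, item):
        kind, payload = item
        if kind == "punctuation":                  # policy is embedded in the stream
            self.current_policy = set(payload)
            return []
        # Forward the tuple only to queries whose privileges satisfy the policy.
        return [qid for qid, roles in self.query_privileges.items()
                if roles & self.current_policy]

ss = SecurityShield({"q1": {"nurse"}, "q2": {"physician"}})
stream = [("punctuation", {"physician"}), ("tuple", {"patient": "p7", "hr": 88})]
for item in stream:
    print(ss.process(item))    # [] for the punctuation, ['q2'] for the tuple
```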

File Hound: A Forensics Tool for First Responders

Incident Detection, Response, and Investigation
Wm. Blair Gillam, Marcus K. Rogers
Since the National Institute of Justice (NIJ) released their Electronic Crime Needs Assessment for State and Local Law Enforcement study results in 2001, several critical strides have been made in improving the tools and training that state and local law enforcement organizations have access to. One area that has not received much attention is the computer crime first responder. File Hound, a "field analysis" software program for law enforcement first responders developed at Purdue University, is currently used internationally by over 40 law enforcement agencies. It has been successfully used in several cases ranging from child pornography to fraud.

FREEAK - Forensic Rapid Evidence Extraction Analysis Kit

Incident Detection, Response, and Investigation
Rick Mislan, Kyle Lutes, Amber Schroader, Karl Dunnagan
Given the remarkable number of advancements in the technology of personal communications over the last several decades, analysis of information derived from communication instruments, particularly cellular telephones, has become an integral component of crime scene investigations. With over 180 million cellular phone users in the United States alone, there are currently five hundred (500) different cellular phones being offered for service from over thirty (30) different manufacturers, processing data through at least four (4) carriers. Acquisition and analysis of this personal communication device information must be accomplished quickly (efficiently and effectively), since time is always of the essence at a crime scene. Given such variance in the communication and computing platforms of these personal devices, there are at least twelve (12) different forensic toolkits available for the extraction of digital cellular phone evidence. Some products focus specifically on one manufacturer, some focus on specific features, and others focus on specific components such as GSM SIM cards. The first responder is currently limited in acquiring and analyzing time-sensitive personal communication device information upon arriving at a crime scene, being forced to secure the device and send it back to a forensic lab, where the evidence loses most of its time value. As a consequence, there is a critical need to expand the technological capabilities of existing forensic toolkits so that they will be able to quickly extract relevant evidentiary information from any personal communications device, independent of manufacturer, data processing, or type of instrument.

Our long-term research goal is to understand how information from technological innovations in personal communications can be optimally utilized to advance society and to interest students in future professional careers in the technology of communication. It is our objective to develop the mobile technology that would allow for the immediate acquisition and analysis of evidentiary information from personal communication devices, such as cellular telephones. Our rationale for this project is that the development of this technology would be expected to markedly improve the time value of evidence extracted from such communication devices, and would provide investigators with timely digital evidence that may well lead to the quick identification of other criminal activity, potentially allow early identification of law-breakers, or actually save the lives of potential victims.

My qualifications to pursue this project include having served for the past four years as an investigator, educator, and trainer in small-scale digital device forensics, twelve years as a United States military electronic warfare officer, and as a reviewing editor for three recent (2004-2005) N.I.S.T. documents on PDA and cellular phone forensics. For the past twenty years, Kyle Lutes of Purdue University has developed mobile application software for such industries as banking, telecommunications, publishing, hospitals, medical schools, retail, and pharmaceuticals. Karl Dunnagan, a cellular phone investigator from Mobile Forensics, Inc., has fourteen years of experience as a deputy with the Los Angeles Sheriff's Department and the Southern California High Tech Crimes Task Force. Amber Schroader, the CEO of Paraben Forensics, has overseen the development of the most extensive software applications for handheld device forensics used by the government, military, and private sectors.
Purdue University's College of Technology provides the applied research facilities required to develop and prototype the proposed project. We will develop this device by pursuing the following research objectives.

Objective #1: Establish a hierarchical knowledge base of all information regarding the various personal communication devices, their technical specifications and images, and the forensically sound techniques for acquiring and analyzing these devices. Our hypothesis is that the combination of the current information that exists in various forms and from various sources, correlated with specific applicable forensic techniques, could provide an immediate intelligent system for forensically sound acquisitions and analyses of personal communication devices.

Objective #2: Develop the guidance system application for the information from Objective #1 to facilitate the first responders' immediate acquisition and analysis of personal communication devices. Based on the information gleaned from Objective #1, our hypothesis is that an intelligent system can be built based on a hierarchy of gained knowledge and an analogous predictive system that provides guidance based on previous forensic experiences or invariant representations of such forensic investigations. The development of this application is the proof of concept, showcasing the utilization of the provided knowledge to quickly acquire and analyze evidentiary information found on a personal communications device, such as a cellular phone, at a crime scene.

This project is particularly innovative because its development would be the first technological intervention to provide first responders with an easily usable tool for fast in-field forensic triage of personal communications devices such as cellular phones. After completion of this project, it is our expectation that we will have created a truly functional handheld forensics tool that combines the utility of previous forensic instrumentation, state-of-the-art technical information and imagery, and an intelligent guidance system providing for the immediate acquisition and analysis of evidentiary information from personal communication devices.

Hidden Disk Areas

Enclave and Network Security
Mayank R. Gupta, Michael Hoeschele, Marcus K. Rogers
This research focuses on certain manufacturer hidden areas of a hard disk, specifically Host Protected Areas (HPA) and Device Configuration Overlays (DCO). These areas can be problematic for computer forensic investigators, since many of the common industry tools cannot detect the presence of the HPA and DCO. A review of the ATA specifications and recent white papers indicate that these areas can be accessed, modified, and written to by end users using specific open source and freely available tools; allowing data to be stored and/or hidden in these areas. This greatly increases the risk that image acquisitions may not be a true copy of the physical drive in question. This also could result in the obfuscation of data, leading to incomplete or erroneous investigative conclusions.

Information Leaks and Privacy in Web Services Computing

Trusted Social and Human Interactions
Ashish Kundu
We show that information leaks are inherent in object models based on subtyping and inclusion polymorphism. Web services interact with other systems across organizational boundaries using such an object model. In the context of web services, information leaks pose serious security and privacy concerns. A safe web service is one that neither is a source of any information leak nor exploits one. We define the properties of such a safety model and propose mechanisms to enforce the safety requirements. Leaks inherent in the programming paradigm, however, cannot always be completely masked while keeping the desired interoperability and flexibility of services intact, especially in compositional scenarios. We therefore propose processes of service certification and versioning, aided by data flow analysis, as measures against information leaks, along with a cost estimation model for the leaks that remain.
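A toy example of the kind of leak meant here (not the poster's formalism): a service interface promises a base type, inclusion polymorphism lets a subtype flow through it, and generic serialization of the dynamic object exposes the subtype's extra state across the organizational boundary.

```python
import json

class Person:
    def __init__(self, name):
        self.name = name

class Patient(Person):                       # subtype adds sensitive state
    def __init__(self, name, diagnosis):
        super().__init__(name)
        self.diagnosis = diagnosis

def lookup(name) -> Person:                  # static interface promises Person
    return Patient(name, "hypertension")     # inclusion polymorphism allows this

# A generic serializer walks the dynamic object, not the declared type...
payload = json.dumps(vars(lookup("Ann")))
print(payload)                               # {"name": "Ann", "diagnosis": ...} leaked
```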

Is Data Mining Dangerous?

Identification, Authentication and Privacy
Maurizio Atzori, Francesco Bonchi, Fosca Giannotti, Dino Pedreschi
At first sight, it may seem that data mining results do not violate the anonymity of the individuals recorded in the source database. In fact, data mining models and patterns, in order to ensure a required statistical significance, represent a large number of individuals and thus conceal individual identities. We show that this belief is ill-founded: data mining can be dangerous for the anonymity of individuals. By shifting and extending the k-anonymity concept from databases to pattern discovery, we present a formal way to define anonymity breaches raised by data mining results. We also develop algorithms to discover such breaches and remove them from the final results without compromising the quality of the output.
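A toy instance of such a breach (with made-up supports): two patterns that each look statistically anonymous can differ by fewer than k individuals, pinpointing a small group.

```python
# Hypothetical mined supports: each pattern alone covers many people,
# but their difference isolates a group smaller than k.
k = 10
support = {
    ("age>60",): 81,
    ("age>60", "repaid=yes"): 78,
}

narrow = support[("age>60",)] - support[("age>60", "repaid=yes")]
if 0 < narrow < k:
    print(f"breach: 'age>60 and repaid=no' matches only {narrow} people (< k={k})")
```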

k-Anonymity Privacy Protection: Questions without Answers

Identification, Authentication and Privacy
M. Ercan Nergiz, Chris Clifton
k-Anonymity is a method for providing privacy protection by ensuring that data cannot be traced to an individual. In a k-anonymous dataset, any identifying information occurs in at least k tuples. Recently, to achieve optimal and practical k-anonymity, many different kinds of algorithms have been proposed, with various assumptions and restrictions and with different metrics to measure quality. This paper presents a family of clustering-based algorithms that are more flexible and that attempt to improve precision by ignoring the restrictions of user-defined Domain Generalization Hierarchies. The main finding is that metrics may behave differently across different algorithms and may not correlate with some applications' accuracy on the output data.

K-anonymity using Clustering

Identification, Authentication and Privacy
Elisa Bertino, Ninghui Li, Ji-Won Byun, Ashish Kamra
k-anonymization techniques are a key component of any comprehensive solution to data privacy and have been the focus of intense research in the last few years. An important requirement for such techniques is to ensure anonymization of data while at the same time minimizing the information loss resulting from data modifications such as generalization and suppression. Current solutions, however, suffer from one or more of the following limitations: reliance on pre-defined generalization hierarchies; generation of anonymized data with high information loss and with high classification errors; and the inference channel arising from lack of diversity in the sensitive information. In this work we propose an approach that addresses these limitations. Our approach uses the idea of clustering to minimize information loss and thus ensure good data quality. The key observation here is that data records that are naturally close with respect to each other should be part of the same equivalence class. Current clustering techniques, however, are not directly applicable in this context because they do not consider the requirement that each cluster should contain at least k records. We thus formulate a specific clustering problem, referred to as k-member clustering problem. We prove that this problem is NP-hard and present a greedy algorithm, the complexity of which is in O(k*n). As part of our approach we develop a suitable metric to estimate the information loss introduced by generalizations, which works for both numeric and categorical data. We also present extensions to our proposed algorithm that minimize classification errors in the anonymized data and eliminate the inference channel arising from lack of diversity in the sensitive attributes. We experimentally compare our algorithm with two recently proposed algorithms. The experiments show that our algorithm outperforms the other two algorithms with respect to information loss, classification errors, and diversity.
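A hedged sketch of the greedy flavor described above (the seed selection, distance function, and data here are simplified placeholders; the poster's algorithm and its information-loss metric are richer): grow each cluster by absorbing the nearest unassigned record until it holds k records.

```python
def dist(a, b):
    """Toy distance over numeric records (placeholder metric)."""
    return sum(abs(x - y) for x, y in zip(a, b))

def k_member_clusters(records, k):
    """Greedy pass: every cluster ends up with at least k records,
    so each equivalence class satisfies the k-anonymity size bound."""
    pending = list(records)
    clusters = []
    while len(pending) >= k:
        cluster = [pending.pop(0)]                 # simplified seed choice
        while len(cluster) < k:
            nearest = min(pending,
                          key=lambda r: min(dist(r, c) for c in cluster))
            pending.remove(nearest)
            cluster.append(nearest)
        clusters.append(cluster)
    if clusters:
        clusters[-1].extend(pending)               # leftovers join the last cluster
    return clusters

data = [(25, 1), (27, 1), (52, 3), (55, 3), (24, 2), (53, 2)]
print(k_member_clusters(data, 3))                  # two clusters of 3 similar records
```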

Lightweight Intrusion Detection for Sensornets

Identification, Authentication and Privacy
Vijay Bhuse, Ajay Gupta, Leszek Lilien
Sensornets are envisioned for use in a wide variety of applications interfacing the physical world to cyberspace. Sensor nodes are resource-constrained in terms of battery power, radio range, processor speed, and memory. Battery power is not only limited, but replacing batteries may not be possible in many situations. Hence, increasing network lifetime through thrifty energy use is the design goal for many sensornet applications. The sensors are mostly unguarded and not protected nearly as well as nodes in other types of networks. Furthermore, the wireless medium is inherently broadcast-based and hence insecure. Cryptographic solutions that provide secure channels of communication are too expensive computationally for sensor nodes. Even though private key cryptography can be used, an adversary can obtain the secret key by physically capturing just a single sensor node that knows the key. Hence, intrusion detection techniques that identify and isolate attackers should be developed. To minimize energy consumption, these techniques must be lightweight. We focus on the design of lightweight detection techniques for common kinds of attacks, including packet dropping, masquerades, and unacceptable information sources. Our future work will involve the design of mechanisms for detecting Sybil attacks, blackholes, and code tampering.
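One classic lightweight check in this space, shown as a sketch (a watchdog-style drop counter; not necessarily the authors' mechanism): a node counts how often a next hop fails to retransmit what it was handed, and flags neighbors whose miss ratio crosses a threshold. Only two counters per neighbor are kept, matching the energy and memory constraints above.

```python
from collections import Counter

sent, overheard = Counter(), Counter()
THRESHOLD = 0.3                            # tolerated drop ratio (assumption)

def record(next_hop, was_overheard):
    """Called after forwarding a packet and listening for its retransmission."""
    sent[next_hop] += 1
    overheard[next_hop] += was_overheard

def suspected_droppers():
    # Flag neighbors with enough samples and too many silent forwards.
    return [n for n in sent
            if sent[n] >= 10 and 1 - overheard[n] / sent[n] > THRESHOLD]

for i in range(20):
    record("node7", was_overheard=(i % 3 != 0))   # node7 drops ~1/3 of packets
print(suspected_droppers())                       # ['node7']
```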

Ontological Semantic (OS) Support for Digital Identity Management (DIM): Expanding the Ever-Expanding Domain

Risk Management, Policies and Laws
John M. Spartz, Evguenia Malaia
Digital identity management (DIM) has emerged as a critical foundation for supporting successful interaction in today's globally interconnected society. It is crucial not only for conducting business and government, but also for a large and growing body of electronic or online social interactions. In its broadest sense, identity management encompasses definitions and life-cycle management for digital identities and profiles, as well as the environments for exchanging and validating such information, including anonymous and pseudonymous representations. Although the basic tools underlying identity management have existed for a long time, we still lack comprehensive, dependable, and flexible solutions for supporting multiple and partial identities. Moreover, support for anonymity, a key requirement for digital identity systems, should not undermine the dependability of the system or the accountability of the interacting parties. DIM systems should thus enforce good audit practices and support forensic analysis consistent with the criticality of the underlying system. Because digital identities have such varied uses and meanings, and because of the far-reaching implications of DIM policies, which extend to free speech, privacy, and online accountability, it is essential to develop a universal vocabulary for a digital identity framework and policy language. A firm understanding of and facility with DIM vocabulary, which includes an ever-expanding and prevalent list of acronyms and DIM products, is compulsory for any and all work in identity management. We propose the framework of ontological semantic processing and text meaning representation through ontology as a means through which one can come to possess a working knowledge of DIM and its associated vocabulary.

Ontology-Based Inference Methods in Handling Privacy Policies

Risk Management, Policies and Laws
Victor Raskin, Olga Krachina
A privacy policy is a document describing the usage of personal information (PI) collected from the user. In essence, it is a text document written in natural language, and natural language introduces a measure of ambiguity. The subject of this research is to explore a method of reducing ambiguity by providing possible valid inference statements, which allows the user to make an informed decision. The framework chosen is Ontological Semantics, due to its structure and organization. The inference method presented is based on matching text-meaning-representation (TMR) modules and on the utilization of resources such as the Facts Database (FDB) and domain-specific lexicons.

Opportunistic Networks and Their Privacy and Security Challenges

Identification, Authentication and Privacy
Leszek Lilien, Zille Huma Kamal, Vijay Bhuse, Ajay Gupta
We present and investigate a novel paradigm and a new technology of opportunistic networks, or oppnets. An oppnet grows from its seed, the original set of nodes deployed together at the time of the initial oppnet deployment. The seed grows into a larger network by extending invitations, or issuing orders, to join the oppnet to other devices, node clusters, or foreign networks that it is able to contact. In this way an oppnet gains new communication, computation, sensing, and other resources. A new node that becomes a full-fledged member, or helper, may be allowed to invite external nodes. All helpers collaborate on realizing the goals of their oppnet. They can be employed to execute different kinds of tasks, even though in general they were not designed to become elements of the oppnet that invited them. We address the critical privacy and security issues as well as other research challenges in oppnets. In particular, we believe that the way privacy is addressed in pervasive computing systems can make or break them. Oppnets, as a subcategory of such systems, are no exception. Oppnets can improve existing applications in numerous areas and create new application niches as yet hard to imagine. Thanks to their inherent adaptability and capacity for leveraging resources, they have great potential for improving the effectiveness and efficiency of emergency response and disaster recovery.

Perceived Strength of Signatures for the Prevention of Identity Theft

Identification, Authentication and Privacy
Adam R. Hunt, Stephen J. Elliott, Ph.D.
Dynamic signature verification is a subset of the larger science of biometrics, which includes fingerprint recognition, hand geometry, and voice recognition. Like voice recognition, signature verification is primarily behavioral in nature, but it has some unique traits that make it harder to test and evaluate. These challenges include the facts that a signature is learnt, that it contains variant measures, that it can be changed by the owner of the signature, and that a signer might have several versions of the signature, depending on the intent of the signer.

Policy-Driven Management and Control of Data Integrity

Assurable Software and Architectures
Elisa Bertino, Yonglak Sohn, Ji-Won Byun
Integrity has long been considered a fundamental requirement for secure computer systems, and today's demand for data integrity is stronger than ever, as many organizations are increasing their reliance on data and information systems. A number of recently enacted data privacy regulations also require high integrity in personal data. In this paper, we discuss various issues of data integrity control and management, with a primary focus on access control. We first reexamine some previously proposed integrity models and redefine a set of integrity requirements. We then present an architecture for comprehensive integrity control systems, based on data validation and metadata management. We also provide an integrity control policy language that we believe is flexible and intuitive.

Printer Characterization and Signature Embedding for Security and Forensic Applications

Identification, Authentication and Privacy
Pei-Ju Chiang, Aravind K. Mikkilineni, Sungjoo Suh, Jan P. Allebach, George T.-C. Chiu, Edward J. Delp
In today's digital world, securing different forms of content is very important for protecting copyright and verifying authenticity. One example is the watermarking of digital audio and images. We believe that a marking scheme analogous to digital watermarking, but for documents, is very important. Printed material is a direct accessory to many criminal and terrorist acts. Examples include forgery or alteration of documents used for purposes of identity, security, or recording transactions. In addition, printed material may be used in the course of conducting illicit or terrorist activities. Examples include instruction manuals, team rosters, meeting notes, and correspondence. In both cases, the ability to identify the device or type of device used to print the material in question would provide a valuable aid for law enforcement and intelligence agencies. We also believe that average users need to be able to print secure documents, for example boarding passes and bank transactions. There currently exist techniques to secure documents such as bank notes using paper watermarks, security fibers, holograms, or special inks. The problem is that the use of these security techniques can be cost prohibitive. Most of these techniques either require special equipment to embed the security features or are simply too expensive for an average consumer. Additionally, there are a number of applications in which it is desirable to be able to identify the technology, manufacturer, model, or even the specific unit that was used to print a given document. In this poster we present results from our previous and current research allowing us to forensically characterize both inkjet and electrophotographic (laser) printers and to embed information into printed documents.

Privacy Preserving Biometric Authentication

Identification, Authentication and Privacy
E. Bertino, S. J. Elliott, A. Bhargav-Spantzel, A. C. Squicciarini, S. K. Modi
The problem of identity theft, that is, the act of impersonating others' identities by presenting stolen identifiers or proofs of identity, has been receiving increasing attention because of its high financial and social costs. Recent federated digital identity management systems, while improving the management of identity information and user convenience, do not provide specific solutions to address identity theft. One approach to reducing the threat of identity theft is the widespread adoption of systems of biometric authentication.