The Center for Education and Research in Information Assurance and Security (CERIAS)

Symposium 2007 Posters

Assurable Software and Architectures

Cryptology and Rights Management

Enclave and Network Security

Identification, Authentication and Privacy

Incident Detection, Response, and Investigation

Security Awareness, Education and Training

Trusted Social and Human Interactions

A Dynamic Cryptographic Hash Function Construction

Cryptology and Rights Management
William Speirs
We present a construction that builds a dynamic hash function from a traditional compression function. The hash function produced by this construction can be proved cryptographically secure with respect to the properties required of a dynamic cryptographic hash function.

A Sensor-cyber Network Testbed for Plume Detection, Identification, and Tracking

Incident Detection, Response, and Investigation
Jren-Chit Chin, I-Hong Hou, Jennifer C. Hou, Chris Ma, Nageswara S. Rao, Mohit Saxen, Mallikarjun Shankar, Yong Yang, David K. Y. Yau
The goal of this work is to design, realize, evaluate, and deploy a detection, identification, and tracking sensor-cyber network (DITSCN) for chemical and radiological plumes. The current focus is on building a system of radiation sensors interconnected by wireless links for detecting the presence of radioactive materials, identifying the radiation source, and tracking its propagation over time.

A System for the Specification and Enforcement of Quality-based Authentication Policies

Identification, Authentication and Privacy
A. Squicciarini, A. Czeskis, E. Bertino, A. Bhargav-Spantzel, M. Almomen
Many application environments require different authentication strengths depending on the resources that subjects need to access. To date, no high-level policy language exists for stating such authentication requirements. We developed such a language and a reference architecture supporting the management and enforcement of authentication policies expressed in it. The proposed system directly supports multi-factor authentication and the high-level specification of authentication factors, in terms of conditions on the features of the various authentication mechanisms and modules. In addition, the language supports a rich set of constraints; using these constraints, one can specify, for example, that a subject must be authenticated by two credentials issued by different authorities. Our work reports a logical definition of the language and its corresponding XML encoding. We also report an implementation of the proposed authentication system in the context of the FreeBSD Unix operating system. Critical issues in the implementation are discussed and performance results are reported; these results show that the implementation is very efficient.
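
As a purely hypothetical illustration of the kind of constraint the language can express (the poster's actual policy language and XML encoding are not reproduced here), the two-issuer requirement mentioned above could be checked roughly as follows; all names in the sketch are made up.

```python
# Hypothetical sketch of evaluating a multi-factor constraint that requires at
# least two credentials issued by different authorities (illustration only;
# not the poster's policy language or reference architecture).

from dataclasses import dataclass

@dataclass
class Credential:
    factor: str   # e.g. "password", "smartcard", "fingerprint"
    issuer: str   # authority that issued the credential

def meets_two_issuer_constraint(credentials):
    """True when the subject presents two or more credentials from distinct issuers."""
    return len({c.issuer for c in credentials}) >= 2

presented = [Credential("password", "CorpIdP"), Credential("smartcard", "GovCA")]
print(meets_two_issuer_constraint(presented))                            # True
print(meets_two_issuer_constraint([Credential("password", "CorpIdP")]))  # False
```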

A Taxonomy of Generalization Schemes for Data Anonymization

Identification, Authentication and Privacy
Tiancheng Li, Ninghui Li
In recent years, a major thread of research on k-anonymity has focused on developing more flexible generalization schemes that produce higher-quality datasets. We introduce three new generalization schemes that improve on existing schemes, as well as algorithms enumerating valid generalizations in these schemes. We also introduce a taxonomy for generalization schemes and a new cost metric for measuring information loss. We present a bottom-up search strategy for finding optimal anonymizations. This strategy works particularly well when the value of k is small. We show the feasibility of our approach through experiments on real census data.
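
For readers unfamiliar with the underlying property, the following minimal sketch (not the poster's generalization schemes or search algorithm) checks whether a generalized table satisfies k-anonymity over a chosen set of quasi-identifier attributes.

```python
# Minimal illustration of the k-anonymity property itself (not the poster's
# schemes): every combination of quasi-identifier values must appear in at
# least k records of the released table.

from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

table = [
    {"zip": "479**", "age": "20-29", "disease": "flu"},
    {"zip": "479**", "age": "20-29", "disease": "cold"},
    {"zip": "479**", "age": "30-39", "disease": "flu"},
]
print(is_k_anonymous(table, ["zip", "age"], k=2))  # False: the 30-39 group has one record
```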

An Architectural Approach to Preventing Code Injection Attacks

Assurable Software and Architectures
Ryan Riley, Xuxian Jiang, Dongyan Xu
Code injection attacks, despite being well researched, continue to be a problem today. Modern architectural solutions such as the NX bit and PaX have been useful in limiting the attacks; however, they enforce program layout restrictions and can often still be circumvented by a determined attacker. We propose a change to the memory architecture of modern processors that addresses the code injection problem at its root by virtually splitting memory into code memory and data memory, such that the processor is never able to fetch injected code for execution. This virtual split-memory system can be implemented as a software-only patch to an operating system and can be used to supplement existing schemes for improved protection. Our experimental results show the system is effective in preventing a wide range of code injection attacks while incurring acceptable overhead.
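
A toy model of the split-memory idea (our illustration, not the authors' operating-system implementation): writes go through the data view, instruction fetches read only the code view, so bytes injected through data writes can never be fetched for execution.

```python
# Toy model of a virtually split code/data memory (illustration only).
# Program stores update the data view; instruction fetches consult only the
# code view fixed at load time, so injected bytes are never executed.

class SplitMemory:
    def __init__(self, program):
        self.code = dict(program)   # address -> instruction, fixed at load time
        self.data = {}              # address -> values written at run time

    def write(self, addr, value):   # every store is routed to the data view
        self.data[addr] = value

    def fetch(self, addr):          # instruction fetch reads only the code view
        return self.code.get(addr, "TRAP: no instruction at this address")

mem = SplitMemory({0: "mov r1, r2", 1: "ret"})
mem.write(2, "attacker shellcode")   # "injected" through a data write
print(mem.fetch(1))                  # ret
print(mem.fetch(2))                  # TRAP: injected bytes are not fetchable
```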

BANBAD: Bayesian-Network-Based Anomaly Detection for MANETs

Enclave and Network Security
Chaoli Cai, Ajay Gupta, Leszek Lilien
Numerous approaches have been proposed for wireless network intrusion detection, especially for anomaly detection in mobile ad-hoc networks (MANETs). However, little work has actually implemented such schemes based on statistical methods. We propose an efficient anomaly detection algorithm using a statistical method based on Bayesian networks (BNs), which can effectively identify abnormal behaviors of MANETs. Major shortcomings of previous work include high false alarm rates and low detection rates when mobility is fairly low, dependence of detection and false alarm rates on node velocity, and the cost of creating and updating dynamic profiles. Our approach overcomes these drawbacks by using BNs. Two application models, the chain model and the DAG model, are used for the training and testing processes of the belief propagation algorithm. An up-to-date dynamic profile can then be easily maintained after the training process. Our simulation results show that the proposed algorithm exhibits good performance in terms of a low false alarm rate and a high detection rate. In particular, our solution achieves similarly high detection rates (≥ 90%) while decreasing the false alarm rate to at most 6%.
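
As a rough illustration of profile-based anomaly scoring on a chain model (a deliberate simplification; the poster's approach uses Bayesian networks with belief propagation), one can learn transition probabilities from normal behavior and flag low-likelihood sequences:

```python
# Rough illustration of chain-model anomaly scoring (simplified stand-in for
# the BN/belief-propagation approach described in the poster): learn transition
# probabilities from "normal" feature sequences, then flag test sequences whose
# average log-likelihood is unusually low.

import math
from collections import defaultdict

def train_chain(sequences):
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def avg_log_likelihood(model, seq, floor=1e-6):
    logp = sum(math.log(model.get(a, {}).get(b, floor)) for a, b in zip(seq, seq[1:]))
    return logp / max(len(seq) - 1, 1)

normal = [["idle", "send", "recv", "idle"]] * 20
model = train_chain(normal)
print(avg_log_likelihood(model, ["idle", "send", "recv", "idle"]))  # ~0.0, normal
print(avg_log_likelihood(model, ["idle", "recv", "recv", "send"]))  # very negative, anomalous
```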

Bio-Key: Privacy Preserving Biometric Authentication

Identification, Authentication and Privacy
E. Bertino, S. Elliott, A. Bhargav-Spantzel, M. Young, S. Modi, A. Squicciarini
The goal is to provide a privacy preserving methodology for strong biometric authentication in federated identity management systems.

Privacy Preserving Multifactor Authentication [1]: multifactor authentication is essential for secure authentication mechanisms. The identity management framework is used to provide proofs of multiple strong identifiers for a given user.

Interoperability: Our scheme provides interoperable, usable, secure, and inexpensive biometric authentication in a federation.

User Control: The raw biometric never leaves the client machine, giving its owner complete control over it.

BioAPI Java project

Assurable Software and Architectures
Keith Watson
The Center for Education and Research in Information Assurance and Security and the Biometric Standards, Performance, and Assurance Laboratory at Purdue University are working in conjunction with the BioAPI Consortium to develop a reference implementation of the BioAPI 2.0 specification in the Java(tm) programming language.

Biometrics and Privacy

Identification, Authentication and Privacy
William Eyre, Sean Sobieraj, Dr. Steven Elliott
Use of biometrics in schools for identification of students younger than 16 in the United Kingdom has sparked privacy concerns. There are privacy advocates active in attempting to derail the use of biometric authentication in schools for a variety of reasons. These reasons are explored and solutions are discussed. The paper also examines how data is linked or dereferenced in terms of biometrics and personally identifiable data.

Biometrics Over a Wide Area Network (WAN)

Identification, Authentication and Privacy
Nathan W. Dunning, Matthew R. Young, Eric P. Kukula, Stephen J. Elliott, Ph.D.
Every semester, undergraduate students who enroll in either an independent study or the biometrics course engage in applied research. One such example of applied research is to engage the local community in providing biometric solutions to specific problems. This paper evaluated the acceptability of implementing hand readers across a wide area network to provide access to members of the Tippecanoe Sheriff

Bit-Level Analysis of Cryptographic Functions

Cryptology and Rights Management
William Speirs, II and Ian Molloy
Our work focuses on analyzing cryptographic functions at the bit level. We can process any cryptographic function with a fixed set of inputs and outputs, producing a Boolean circuit. The resulting circuit can be compared to other circuits or turned into a system of equations that is then solved to break the function.

Completely-Secure Sharing of Trees and Hierarchical Content

Identification, Authentication and Privacy
Ashish Kundu, Elisa Bertino
Trees are among the most widely used information structures; notable examples include XML data objects and complex database objects. Such tree structures are used to specify sensitive information in areas such as healthcare and biology, defense, and satellite data (space-based computing). Sharing sensitive information between a producer and one or more consumers entails strong confidentiality and integrity semantics: (1) information, including structural information, must be shared in a controlled manner, with no leaks; and (2) integrity verification should be possible at the node level, so that precise verification is supported. Precise verification of integrity facilitates efficient data recovery, failure-oblivious computing in the presence of faulty (compromised) data, and more precise determination of abnormal process termination. The most fundamental technique used for this purpose is the Merkle hash. However, the Merkle hash technique does not preserve confidentiality of data because it uses cascaded hashing, which is non-associative in nature: it releases unauthorized structural information to the consumer during integrity verification. The challenge lies in supporting both confidentiality and precise integrity verification in a holistic, simple, and inexpensive manner.

We propose a security framework that exploits the structural properties of information trees so that information leakage can be prevented and integrity violations can be precisely detected at the level of a node. In our approach, the structural aspects of such information objects are characterized by the simple notions of post-order, pre-order, and in-order numbers and their randomized derivatives: encrypted post-order, pre-order, and in-order numbers. Applying these notions to hierarchically organized data yields a unique signature for each information unit in a tree. Our technique overcomes the drawback of the Merkle hash technique, and the signatures support consumer-side precise verification of integrity. The runtime complexity of the proposed technique is O(n), where n is the number of nodes in the tree; the technique also avoids the cascaded hashing used by the Merkle hash and is thus more efficient. We have shown (proofs are beyond the scope of the poster) that the proposed notions are powerful enough to meet the security requirements essential for completely secure sharing of tree-based data and trustable computing, which is of great importance especially to applications in defense, healthcare, and space-based computing.
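
A much-simplified sketch of the structural ingredient (per-node keyed tags over a node's content and randomized traversal numbers) is shown below; the actual construction, the in-order component, and the proofs in the poster are not reproduced, and the salt-based randomization here is only a stand-in for the encrypted order numbers.

```python
# Simplified sketch: assign pre-/post-order numbers to each node of a tree and
# derive a keyed per-node tag from the node content plus randomized numbers,
# so a single node can be verified without revealing the rest of the structure.
# Illustration only; not the poster's exact scheme. Node labels assumed unique.

import hashlib
import hmac
import secrets

def number_tree(root):
    """Return {label: (pre_order, post_order)} for a (label, children) tree."""
    out, counters = {}, {"pre": 0, "post": 0}
    def visit(node):
        label, children = node
        my_pre = counters["pre"]; counters["pre"] += 1
        for child in children:
            visit(child)
        out[label] = (my_pre, counters["post"]); counters["post"] += 1
    visit(root)
    return out

def sign_nodes(root, key, salt):
    tags = {}
    for label, (pre_n, post_n) in number_tree(root).items():
        msg = f"{label}|{pre_n + salt}|{post_n + salt}".encode()
        tags[label] = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return tags

key, salt = secrets.token_bytes(32), secrets.randbelow(1 << 32)
tree = ("patient", [("diagnosis", []), ("billing", [("insurer", [])])])
print(sign_nodes(tree, key, salt))   # one independently verifiable tag per node
```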

Data Reduction Techniques for Event Sequences

Incident Detection, Response, and Investigation
Mikhail Atallah, Emil Stefanov, Wojciech Szpankowski
Whether they are audit trails of events in a computer system, of traffic in a network, of actions by individuals, or records of financial transactions monitored for internal compliance by a financial corporation (or monitored externally by the SEC or FBI), records of events tend to be massive. Buried in this haystack of events can lie valuable information whose extraction would be easier if the event record could be reduced. This paper is a step in that direction: it gives an algorithm that takes as input a sequence of events generated by k separate Markov reference models and separates it into k sequences, each of which corresponds to the sub-sequence generated by one of the k reference models. We also argue that such separation is not possible when the reference models are reversible (a class that includes Bernoulli models). The input to the algorithm does not include the state space or transition matrix of any of the k reference models, nor does it include the parameters that were used to mix their respective outputs and produce the merged sequence. We also report experimental results demonstrating that our algorithm is both fast and accurate. Somewhat surprisingly, our techniques work remarkably well for non-Markovian sources.

Database Anomalous Usage Detection

Incident Detection, Response, and Investigation
Ashish Kamra and Dr. Elisa Bertino
Data security has been an important research area in the recent past. Along with other data protection mechanisms, intrusion detection systems for databases have also started to garner attention. In this work, we propose such a system for relational databases. Our system detects anomalous usage behavior of database users and applications by building "normal" models from SQL queries and the context information surrounding them. An anomaly is then defined as an access request that deviates from the normal model. We also propose an intrusion response mechanism for dealing with these anomalies, consisting of a policy language and a set of pre-defined response actions. Finally, we propose a feedback mechanism so that the intrusion response engine learns from its past responses.
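
A deliberately simplified sketch of profile-based detection on SQL queries (our illustration, not the poster's models or features): learn the combinations of query features seen during normal operation and flag requests whose feature combination was never observed.

```python
# Illustrative sketch only (not the poster's system): build a "normal" profile
# from crude features of observed SQL queries (command, table, column count)
# and flag requests whose feature combination was never seen during training.

import re
from collections import Counter

def features(sql):
    cmd = sql.strip().split()[0].upper()
    table = (re.search(r"\b(?:FROM|INTO|UPDATE)\s+(\w+)", sql, re.I) or [None, "?"])[1]
    ncols = sql.count(",") + 1 if cmd == "SELECT" else 0   # crude column count
    return (cmd, table.lower(), ncols)

def train(queries):
    return Counter(features(q) for q in queries)

def is_anomalous(profile, sql):
    return profile[features(sql)] == 0

profile = train(["SELECT name FROM employees", "SELECT name, dept FROM employees"])
print(is_anomalous(profile, "SELECT name FROM employees"))  # False: matches the profile
print(is_anomalous(profile, "DELETE FROM employees"))       # True: never seen in training
```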

Dynamic Signature Verification and Forgery

Identification, Authentication and Privacy
C.R. Blomeke, J.R. Padfield, C. J. Bane, S.J. Elliott, Ph.D
This study outlined nine levels of forgery training in an attempt to evaluate whether the graphical representation, as well as speed, pressure, and pen distance traveled, became closer to the measures of the genuine signature over the levels. With increasing levels, the forgery population received additional information about the genuine signature, from seeing the genuine signature on paper to viewing a video of the authentic signature being generated. The analysis indicated that the forgery population was able to mimic 14 variables of the genuine signature at level 3, the most at any level. The variables of speed and pen distance were identified as useful variables for detecting a tracing of a genuine signature.

Dynamic Virtual Credit Card Numbers

Cryptology and Rights Management
Ian Molloy, Jiangtao Li, and Ninghui Li
Theft of stored credit card information is an increasing threat to e-commerce. We propose a dynamic virtual credit card number scheme that reduces the damage caused by stolen credit card numbers. A user can use an existing credit card account to generate multiple virtual credit card numbers that are either usable for a single transaction or tied to a particular merchant. We call the scheme dynamic because the virtual credit card numbers can be generated without online contact with the credit card issuers. These numbers can be processed without changing any of the infrastructure currently in place; the only changes are at the end points, namely the card users and the card issuers. We analyze the security requirements for dynamic virtual credit card numbers, discuss the design space, propose a scheme using HMAC, and prove its security under the assumption that the underlying function is a PRF.
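
A minimal sketch of the general flavor of an HMAC-based derivation (not the poster's exact scheme, encoding, or validation procedure): cardholder and issuer share a key, so either side can derive or verify a per-merchant number offline.

```python
# Minimal sketch of deriving a "virtual" card number from a shared secret with
# HMAC (illustration only; not the poster's exact scheme or encoding). Because
# the issuer holds the same key, it can recompute and verify the number without
# any online contact at generation time.

import hashlib
import hmac

def virtual_card_number(secret_key: bytes, account: str, merchant: str, counter: int) -> str:
    msg = f"{account}|{merchant}|{counter}".encode()
    digest = hmac.new(secret_key, msg, hashlib.sha256).digest()
    # Map the MAC to a 16-digit decimal string so it "looks like" a card number.
    return str(int.from_bytes(digest[:8], "big") % 10**16).zfill(16)

key = b"shared-between-cardholder-and-issuer"
print(virtual_card_number(key, "4111111111111111", "example-merchant.com", counter=1))
```

A deployable scheme would also have to respect issuer prefixes and the Luhn check digit, which this toy derivation ignores.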

Enabling Confidentiality for Group Communication on Wireless Mesh Networks

Enclave and Network Security
Jing Dong and Cristina Nita-Rotaru
Wireless mesh networks (WMNs) have emerged as a promising technology for providing low-cost community wireless services. Despite recent advancement in securing wireless networks, the problem of secure group communication on wireless networks has received relatively little attention. Characteristics specific to WMNs, such as limited communication range and high link error rate, raise unique challenges in designing such protocols.

In this presentation we focus on providing data confidentiality for group communication on WMNs. We introduce a secure overlay-based protocol framework for group data confidentiality designed to address the unique characteristics of the WMN environment. We also present experimental results that evaluate the performance of the proposed protocols and compare them with a protocol adapted directly from wired networks.

Explicit formulas for real hyperelliptic curves of genus 2 in affine representation

Cryptology and Rights Management
S. Erickson, M.J. Jacobson, Jr., N. Shang, S. Shen, A. Stein
We present, for the first time, efficient explicit formulas for arithmetic in the degree 0 divisor class group of a real hyperelliptic curve. Specifically, we consider real hyperelliptic curves of genus 2 given in affine coordinates over finite fields. These formulas are much faster than the optimized generic algorithms for real hyperelliptic curves, and the cryptographic protocols in the real setting perform almost as well as those in the imaginary case. We explain the ideas behind the improvements and their correctness, together with a comprehensive analysis of the number of field operations. We also directly compare cryptographic protocols using explicit formulas for real hyperelliptic curves with the corresponding protocols in the imaginary model.
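
For context only (standard background, not a result of the poster): over a finite field of odd characteristic, the real model of a genus-2 hyperelliptic curve can be written in affine form with a degree-6 right-hand side, in contrast to the degree-5 polynomial of the imaginary model.

```latex
% Real (affine) model of a genus-2 hyperelliptic curve over \mathbb{F}_q, q odd
% (standard background, stated only to fix ideas):
C:\; y^{2} = f(x), \qquad f \in \mathbb{F}_q[x], \quad \deg f = 2g + 2 = 6, \quad f \text{ squarefree},
% compared with \deg f = 2g + 1 = 5 in the imaginary model.
```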

Extension of NLP Techniques for Privacy Management

Identification, Authentication and Privacy
Olga Krachina, Victor Raskin
Certain information management tasks in the privacy domain are better suited to Natural Language Processing (NLP) systems than to a formal privacy language. This is because formal languages are domain specific and by design target a limited level of expressiveness compared to a comprehensive NLP system. This poster further motivates the use of an NLP system in this domain through the example of a specific framework.

Forensics of Things

Identification, Authentication and Privacy
Nitin Khanna, Anthony F. Martone, Aravind K. Mikkilineni, Jan P. Allebach, George T.-C. Chiu, Edward J. Delp
Forensic characterization of devices is important in many situations, such as establishing trust and verifying the authenticity of data and the device that created it. Current forensic identification techniques for digital cameras, scanners, and printers are highly reliable because each of these devices cannot escape inherent electro-mechanical properties which add

FREEAK - Forensic Rapid Evidence Extraction Analysis Kit

Incident Detection, Response, and Investigation
Shira Dankner, Matt Garrett, Rick Mislan, Kyle Lutes, Marc Rogers
FREEAK - Forensic Rapid Evidence Extraction Analysis Kit

Generalized Spatio-Temporal Role Based Access Control Model

Assurable Software and Architectures
Arjmand Samuel, Arif Ghafoor and Elisa Bertino
We are witnessing exponential growth in the use of mobile computing devices such as laptops, PDAs, and mobile phones to access critical data while on the move. The need to safeguard against unauthorized access to data in a mobile world is a pressing requirement. Access to critical data depends on users' identity as well as environmental parameters such as time and location. While temporal access control models are well suited for enforcing access control decisions on fixed users, they lose their effectiveness when users employing mobile computing devices are not fixed in space and move from a secure locale to an insecure one, or vice versa. Location as a context parameter for access control has been addressed by a number of researchers, but a definition of rich spatial constraints that effectively capture the semantics and relationships of physical and virtual (e.g., membership in an IP group) locales is still missing. Our contribution in this poster is the Generalized Spatio-Temporal Role Based Access Control (GST-RBAC) model, a formal framework for composing complex spatial constraints that exploit topological relationships between physical and virtual locales. Spatial constraints are defined for spatial role enabling, spatial user-role assignment, spatial role-permission assignment, and spatial activation of roles. The notion of spatial separation of duty is also developed, whereby a user is not permitted to activate two roles simultaneously if the roles are being activated from specific locales. Another feature of the proposed GST-RBAC model is the spatial role hierarchy, which allows inheritance of permissions between roles, contingent upon the roles being activated from predefined locales.
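
Purely as an illustration of the constraint style described above (not the GST-RBAC formalism itself; all role names and locales below are invented), a role-activation check combining allowed locales, a time window, and spatial separation of duty might look like this:

```python
# Illustrative sketch only (not the GST-RBAC model): activate a role only if
# the request falls inside the role's allowed locales and hours, and refuse
# activation when a conflicting role is already active from a restricted locale.

ROLE_POLICY = {
    "nurse":      {"locales": {"ward-A", "ward-B"},  "hours": range(6, 22)},
    "pharmacist": {"locales": {"ward-A", "pharmacy"}, "hours": range(8, 20)},
}
# Spatial separation of duty: the two roles may not both be active from ward-A.
SPATIAL_SOD = {frozenset({"nurse", "pharmacist"}): "ward-A"}

def can_activate(role, locale, hour, active):
    """active: set of (role, locale) pairs the user has already activated."""
    policy = ROLE_POLICY[role]
    if locale not in policy["locales"] or hour not in policy["hours"]:
        return False                       # spatial/temporal role-enabling constraint
    for pair, restricted in SPATIAL_SOD.items():
        if role in pair and locale == restricted:
            others = pair - {role}
            if any((o, restricted) in active for o in others):
                return False               # spatial separation of duty violated
    return True

print(can_activate("nurse", "ward-A", 10, active=set()))                       # True
print(can_activate("nurse", "ward-A", 10, active={("pharmacist", "ward-A")}))  # False
```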

Have you updated your wireless card drivers lately?

Assurable Software and Architectures
Ryan Riley
Current research on 802.11 mainly focuses on flaws in the protocol. Handling problems in the protocol, however, is not sufficient for providing a secure 802.11 environment for end users. In this work I explore flaws in the 802.11 implementations contained within operating systems, firmware, and drivers. I give a survey of current work in the area and present a client based sniffer and attack tool that I built to test some of the latest exploits.

High-Fidelity DoS Simulation and Emulation Experiments

Enclave and Network Security
Roman Chertov, Sonia Fahmy, and Ness B. Shroff
Simulation, emulation, and wide-area testbeds exhibit different strengths and weaknesses with respect to fidelity, scalability, and manageability. Fidelity is a key concern since simulation or emulation inaccuracies can lead to a dramatic and qualitative impact on the results. A high-bandwidth Denial of Service (DoS) attack can produce very different impacts on the different platforms, even if the experimental scenario is supposedly identical. This is because many popular simulation and emulation environments fail to account for realistic commercial router behaviors, and incorrect results have been reported based on experiments conducted in these environments.

In this work we describe the architecture of a black-box router profiling (BBP) tool which integrates the popular ns-2 simulator with the Click modular router and a modified network driver. We use this profiler to collect measurements on Cisco routers. Through a careful sensitivity analysis, we demonstrate that routers and other forwarding devices cannot be modeled as simple output port queues, even if correct rate limits are observed. In our future work we plan to use our data to create high-fidelity network simulation/emulation models that are not computationally prohibitive.

Improving the Privacy and Security of Online Survey Data Collection, Storage, and Processing

Identification, Authentication and Privacy
Mikhail Atallah, Juline Mills, Marina Blanton, Celestino Ruffini, Richard Wartell
The goal of this project is to develop techniques that protect the privacy of survey participants

Integrating the Common Weakness Enumeration into a Secure Programming Course

Security Awareness, Education and Training
Pascal Meunier
The exhaustiveness and organization of the CWE coverage are attractive both as an educational tool and for making sure that students are exposed to secure programming issues in a systematic way that is representative of the most frequent and important problems. This poster presents lessons learned while revising the secure programming slides with CWE content. In particular, some concepts absent from the CWE require creating a special "view" of the CWE; a view is a reorganization of chosen CWE nodes in a systematic manner. Creating views takes a lot of time, but it highlights missing CWE nodes and is useful for the CWE effort as well. The resulting strong linkage of concepts to a systematic empirical collection of issues is advantageous for teaching secure programming.

iPod Forensics

Security Awareness, Education and Training
Matthew Kiley, Tim Shinbara and Marcus K. Rogers
From students to business workers, the popularity and ubiquity of mobile devices are exploding. As these devices saturate modern culture, they continue to grow in functionality. Such devices can now play music, store photos, contacts, or files, and even play full-length movies. Apple's iPod has incorporated all of this into a single device. With increased popularity, however, criminals have found ways to exploit an otherwise benign device. The challenge before law enforcement now becomes identifying the evidence an iPod may contain and determining which forensic tools are able to acquire this evidence.

Mobile Phone Image Evidence

Incident Detection, Response, and Investigation
Sean Sobieraj, Richard Mislan
Research regarding Mobile Phone Image Evidence

MultiRelational k-Anonymity

Identification, Authentication and Privacy
M. Ercan Nergiz, Chris Clifton, A. Erhan Nergiz
k-Anonymity protects privacy by ensuring that data cannot be linked to a single individual. In a k-anonymous dataset, any identifying information occurs in at least k tuples. Much research has been done to modify a single table dataset to satisfy anonymity constraints. This work extends the definitions of k-anonymity to multiple relations and shows that previously proposed methodologies either fail to protect privacy, or overly reduce the utility of the data, in a multiple relation setting. A new clustering algorithm is proposed to achieve multirelational anonymity.

On the Accuracy of Decentralized Network Coordinate Systems in Adversarial Networks

Enclave and Network Security
David Zage and Cristina Nita-Rotaru
Network coordinate systems allow hosts on the Internet to determine the latency to arbitrary hosts without using explicit measurements. Such systems assign virtual coordinates to hosts, where the distance between the coordinates of two hosts accurately predicts the communication latency between them. Many of the proposed systems were designed with the assumption that all of the nodes in the network coordinate system are cooperative. However, this assumption may be violated by compromised nodes acting maliciously to degrade the accuracy of the coordinate system.

In this work, we demonstrate the vulnerability of distributed network coordinate systems to insider (or Byzantine) attacks. We propose low-communication techniques to make the coordinate assignment robust to malicious attackers. Our solutions decrease the number of incorrect coordinates through the use of spatial outlier detection. We demonstrate the attacks and mitigation techniques in the context of a well-known distributed virtual coordinate system using a simulated network of 1740 Internet hosts and the corresponding latencies.
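
A simplified sketch of spatial outlier filtering for coordinate updates (our illustration with a made-up threshold rule; the detection statistic used in the actual work may differ): an update is discarded when the reported coordinate lies far outside the spread of recently observed peer coordinates.

```python
# Simplified sketch of spatial outlier filtering for virtual-coordinate updates
# (illustration only; the statistic in the actual work may differ). An update
# is rejected if it lies much farther from the centroid of recent peer
# coordinates than those coordinates' average spread.

import math

def centroid(points):
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / len(points) for d in range(dims))

def is_spatial_outlier(candidate, recent, threshold):
    c = centroid(recent)
    spread = sum(math.dist(p, c) for p in recent) / len(recent)
    return math.dist(candidate, c) > threshold * max(spread, 1e-9)

recent = [(1.0, 2.0), (1.2, 1.9), (0.9, 2.1), (1.1, 2.0)]
print(is_spatial_outlier((1.0, 2.05), recent, threshold=3.0))  # False: consistent update
print(is_spatial_outlier((9.0, -7.0), recent, threshold=3.0))  # True: likely malicious
```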

Open and Shut? An Analysis of Vulnerability Discovery Models for Open Source and Proprietary Software

Incident Detection, Response, and Investigation
Kemal Altinkemer, Fariborz Farahmand, Jackie Rees, Chen Zhang
This research examines vulnerability discovery models of both open source and proprietary software development approaches for various operating systems and performs an empirical analysis to examine the differences in the discovery process between the development paradigms. While there isn

PHPSecInfo: Security Auditing for the PHP Environment

Assurable Software and Architectures
Edward Finkler
PHPSecInfo is an easy-to-install security auditing and reporting tool for the PHP environment.

Poly^2 Application Nodes

Assurable Software and Architectures
Keith Watson, Robert Winkworth
A significant challenge faced by many organizations today is the enforcement and evaluation of trust in complex networks built using a variety of commercial off-the-shelf and freeware components, and running on one or more general purpose operating systems. We will address this problem by simplifying, modularizing, and separating functionality to the extent that Poly^2 components have the minimum functionality needed for a particular network task, and interactions between components are confined to known, defendable contexts.

The Poly^2 research project advances the understanding of building secure and reliable system architectures to support critical services in hostile network environments. A secure and reliable system architecture must only provide the required services to authorized users in time to be effective. The Poly^2 architecture we propose will be based on sound, widely acknowledged security design principles. It will form the basis for providing present and future network services while, at the same time, being highly robust and resistant to attack. A prototype of the new architecture will be developed that will provide traditional network services (e.g. web, FTP, email, DNS, etc.) using commodity hardware and an open source operating system. All along, security analyses will be conducted to see if there are flaws in the design or implementation and whether those flaws are the result of conflicting requirements or design objectives. The resulting implementation will serve as a secure and highly available platform from which organizations can deploy their own critical network services.

Poly^2 research will provide the community with a better understanding of the application of sound security design principles, the decomposition of COTS software components to increase trust, separation of data based on design and policy, and the isolation of services to decrease commonality and contain failures and attacks. Further, it can provide a springboard for additional research in areas such as intrusion detection, security evaluation, and performance metrics.

Preventing DDoS Attacks with P2P Systems

Enclave and Network Security
Xin Sun, Ruben Torres, Sanjay Rao
In this work we try to establish security principles that should be considered when designing any P2P system. We first examine a range of today's most widely deployed P2P systems, including BitTorrent and eMule for file distribution and ESM for video broadcasting, and show vulnerabilities in those systems which can be exploited for large-scale, high-volume Distributed Denial-of-Service attacks on the Internet. We then identify several design principles to prevent such vulnerabilities. We show that these principles are extremely important for the design of all kinds of peer-to-peer systems and evaluate their impact on application performance.

Realizing Privacy-Preserving Features in Hippocratic Databases

Assurable Software and Architectures
Yasin Laura-Silva and Walid G. Aref
Preserving privacy has become a crucial requirement for operating a business that manages personal data. Hippocratic databases have been proposed to answer this requirement through a database design that includes responsibility for the privacy of data as a founding tenet. We identify, study, and implement several privacy-preserving features that extend previous work on limiting disclosure in Hippocratic databases. These features include support for multiple policy versions, retention time, generalization hierarchies, and multiple SQL operations. The proposed features bring Hippocratic databases one step closer to fitting real-world scenarios. We present the design and implementation guidelines for each of the proposed features. Our performance evaluation shows that the cost of these extensions is small and scales well to large databases.

ReAssure: Virtual Imaging Instrument for Logically Destructive Experiments

Assurable Software and Architectures
Michael Yang, Pascal Meunier, Jan Vitek
ReAssure is a facility available to Purdue campuses and friends of CERIAS, funded by an NSF MRI grant. Faculty, graduate students and their collaborators (e.g., at other universities) can use the ReAssure facility for classes (graduate or undergraduate), or research and experimentation. It has two functions: a repository of public images of virtual machines, and an experimental testbed setup for contained experiments. This poster presents the internal organization of ReAssure.

Runtime Intrusion Diagnosis Under Uncertainty

Enclave and Network Security
G. Modelo Howard, Y. Wu, B. Foo, M. Glause, S. Bagchi, G. Lebanon, E. Spafford
A central part of an intrusion response system is concerned with reasoning under uncertainty. Detectors may issue false alerts or may miss some alerts and our model for representing a multistage attack may be incomplete. These uncertainties need to be formally aggregated to yield a probabilistic estimate of the current attack sequence and to determine an appropriate response. In our work, we design approaches based on Bayesian learning to reason about the attack under the above-mentioned sources of uncertainty. We also provide a method to grow the knowledge base of our intrusion response system, ADEPTS, based on observed alerts as evidence.

Secure and Scalable Dissemination of XML Content with Frequent Incremental Updates

Assurable Software and Architectures
Elisa Bertino, Mohamed Nabeel, Ashish Kundu
The class of problems to which we provide security in a holistic manner has the interesting property of frequent incremental updates. Stock market dissemination of quotes and trades, involving thousands of instruments and more than ten thousand updates per second, is a good example of such problems. Our approach as a whole is, to the best of our knowledge, the first of its kind. The idea is to provide efficient and scalable dissemination while maintaining the key security requirements of confidentiality, integrity, availability, and completeness, without requiring the publishers to be trusted. For this problem domain, we can dramatically reduce bandwidth requirements by using delta messaging in a content-based pub/sub system. The key challenge is to assure security when only partial messages are exchanged. We meet the challenge by enforcing confidentiality and content integrity at the granularity of the XML node level. However, we still need to address structural integrity and completeness with respect to the complete XML document to which users have access. We introduce a novel approach to reason about the structural integrity of XML documents at two levels of granularity, based on the well-known XPath. Leveraging the frequency of messages, we propose a probabilistic approach based on oblivious transfer to verify completeness. Our complete solution takes every possible measure to minimize indirect information leakage by making the rest of the structure of XML documents to which clients don

Secure Collaboration in Mediator-Free Environments

Identification, Authentication and Privacy
Mohamed Shehab, Elisa Bertino, Arif Ghafoor
Multi-domain application environments where distributed domains interoperate with each other are a reality in Web-services-based infrastructures. Collaboration enables domains to effectively share resources; however, it also introduces several security and privacy challenges. In this talk, we will define secure collaboration and then present our framework for secure collaboration in mediator-free environments, which is based on discovering and accumulating access control paths. We will also present our multihop SOAP messaging protocol that enables domains to discover secure access paths to roles across domain boundaries. Using the SOAP-DSIG standard, we will present our path authentication mechanism. We will also present a service discovery protocol that enables domains to discover service descriptions stored in private UDDI registries. Furthermore, we will present our Role Routing Protocol (RRP), a proactive access path discovery protocol that enables domains to discover secure minimal-length access paths.

T-Closeness: Privacy Beyond k-Anonymity and l-Diversity

Identification, Authentication and Privacy
Ninghui Li, Tiancheng Li, Suresh Venkatasubramanian
The k-anonymity privacy requirement for publishing microdata requires that each equivalence class (i.e., a set of records that are indistinguishable from each other with respect to certain "identifying" attributes) contains at least k records. Recently, several authors have recognized that k-anonymity cannot prevent attribute disclosure. The notion of l-diversity has been proposed to address this; l-diversity requires that each equivalence class has at least l well-represented values for each sensitive attribute. In our work we show that l-diversity has a number of limitations. In particular, it is neither necessary nor sufficient to prevent attribute disclosure. We propose a novel privacy notion called t-closeness, which requires that the distribution of a sensitive attribute in any equivalence class is close to the distribution of the attribute in the overall table (i.e., the distance between the two distributions should be no more than a threshold t). We choose to use the Earth Mover Distance measure for our t-closeness requirement. We discuss the rationale for t-closeness and illustrate its advantages through examples and experiments.
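
For a sensitive attribute with ordered values, the Earth Mover's Distance reduces to a simple cumulative-difference formula; the sketch below is one standard formulation (our illustration; consult the paper for the exact metric and choice of ground distance).

```python
# Sketch of an ordered-value Earth Mover's Distance check for t-closeness
# (one standard formulation; illustration only). P is the sensitive-attribute
# distribution inside an equivalence class, Q the distribution over the whole
# table; t-closeness requires EMD(P, Q) <= t for every equivalence class.

def ordered_emd(p, q):
    assert len(p) == len(q)
    m, running, total = len(p), 0.0, 0.0
    for pi, qi in zip(p, q):
        running += pi - qi          # cumulative "mass" still to be moved
        total += abs(running)
    return total / (m - 1)

whole_table = [0.1, 0.4, 0.5]       # e.g. salary buckets: low / medium / high
one_class   = [0.5, 0.3, 0.2]
t = 0.2
emd = ordered_emd(one_class, whole_table)
print(emd, emd <= t)                # 0.35 False: this class violates 0.2-closeness
```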

The Effect of Image Background Color on Face Recognition Performance

Identification, Authentication and Privacy
Mahesh Babu, Dennis McAndrews, Stephen J. Elliott
Current research in face recognition has not addressed the effect that image background color has on the accuracy and precision with which a face recognition system (FRS) performs matches. The current data capture and storage standard for face recognition (INCITS 385) recommends a uniform 18% gray background to optimize FRS performance. On the other hand, current identification schemes that use facial information (such as passports and driver's licenses) either require different background colors or specify only background uniformity and not color. This study explored the rationale behind this disparity and whether it has an impact on FRS performance. Results from the study show that using a different background color for enrollment and verification did have an impact on the performance of the FRS. Future work in this area would involve replicating this study using FRS systems that employ different face detection and recognition techniques.

The Effects of Human Interaction on Biometric System Performance

Trusted Social and Human Interactions
Eric P. Kukula, Stephen J. Elliott
Biometrics is defined as the automated recognition of individuals based on their behavioral and biological characteristics. By definition, biometrics requires individuals to provide information, such as a fingerprint, hand shape, or iris, to a sensor. This interaction is critical, as how to use a sensor is often not intuitive to the inexperienced user. The Human-Biometric Sensor Interaction is a conceptual model centered on the relationship between the human providing the biometric trait and the biometric sensor, a relationship that becomes more critical to understand and research as biometric sensors become pervasive in society. The successful deployment of biometric systems, regardless of application, needs to take into consideration how individuals interact with the device. Failure to do so will degrade the optimal performance of the biometric sensor, causing problems such as failure to acquire, failure to enroll, and increases in the false reject rate. This research investigates whether the integration of anthropometric, ergonomic, and usability characteristics can improve the usefulness, effectiveness, efficiency, and satisfaction of users, ultimately leading to improved algorithm performance of the biometric system and thus optimizing the relationship between the human and the sensor. This study aims to address two questions: what does the biometric sensor need to provide to the human, and what information does the human need to provide to the sensor for successful interaction? For more information regarding this project, please visit http://www.biotown.purdue.edu/research/ergonomics.asp.

The New Casper: Query Processing for Location Services without Compromising Privacy

Identification, Authentication and Privacy
Mohamed F. Mokbel, Chi-Yin Chow, Walid G. Aref
In this poster, we tackle a major privacy concern in current location-based services where users have to continuously report their locations to the database server in order to obtain the service. For example, a user asking about the nearest gas station has to report her exact location. With untrusted servers, reporting the location information may lead to several privacy threats. We present Casper, a new framework in which mobile and stationary users can entertain location-based services without revealing their location information. Casper consists of two main components, the location anonymizer and the privacy-aware query processor. The location anonymizer blurs the users' exact location information into cloaked spatial regions based on user-specified privacy requirements. The privacy-aware query processor is embedded inside the location-based database server in order to deal with the cloaked spatial areas rather than the exact location information. Experimental results show that Casper achieves high-quality location-based services while providing anonymity for both data and queries.

The Search for Optimality in Online Intrusion Response for a Distributed E-Commerce System

Incident Detection, Response, and Investigation
Yu-Sung Wu, Gaspar Howard, Matthew Glause, Bingrui Foo, Saurabh Bagchi, Eugene Spafford
Providing automated responses to security incidents in a distributed computing environment has been an important area of research. This is due to the inherent complexity of such systems that makes it difficult to eliminate all vulnerabilities before deployment and costly to rely on humans for responding to incidents in real time. Here we formalize the process of providing automated responses in a distributed system and the criterion for asserting global optimality of the responses. We show that reaching the globally optimal solution is an NP-complete problem. Therefore we design a genetic algorithm framework for searching for good solutions. In the search for optimality, we exploit the similarities among attacks, and use the knowledge learnt from previous attacks to guide future search. The mechanism is demonstrated on a distributed e-commerce system called Pet Store with injection of real attacks and is shown to improve the survivability of the system over the previously reported ADEPTS system.

Towards Dynamically Handling Implicit Information Flow

Assurable Software and Architectures
Bin Xin, Xiangyu Zhang
Implicit Information Flow (IIF) is caused by code that is not executed, and how to handle it is a long-standing open problem. Existing dynamic techniques fail to capture IIF because dynamic analyses are typically designed to focus on information collected from EXECUTED statements. Static techniques are often too conservative, since they assume that all program paths and points-to relations are possible. We propose a novel dynamic analysis that handles IIF. The key observation is that IIF is caused by implicit dependence, which can be made EXPLICIT by switching the branch outcomes of the related conditionals. We have addressed two challenges: one is to align the executions before and after the switch so that implicit dependence can be precisely identified; the other is to reduce the number of implicit-dependence probes through sophisticated slicing-based techniques. These provide a solid base for us to completely solve the IIF problem.
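
To make the problem concrete, here is the textbook example of implicit flow (our example, not one from the poster): the secret value propagates to the output even though, on one of the two paths, no executed statement ever copies it.

```python
# Classic textbook example of implicit information flow (not from the poster):
# when secret is False, no executed statement copies `secret` into `leaked`,
# yet after the branch `leaked` always equals `secret`. Dynamic taint tracking
# that only follows executed assignments misses this dependence.

def launder(secret: bool) -> bool:
    leaked = False
    if secret:          # the branch decision depends on the secret
        leaked = True   # on the False path this line never executes...
    return leaked       # ...but `leaked` still reveals the secret exactly

print(launder(True), launder(False))   # True False -- the secret is fully recovered
```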

Usability of User Agents for Privacy-Preference Specification

Identification, Authentication and Privacy
Robert W. Proctor, M. Athar Ali, Kim-Phuong L. Vu, Dongbin Cho
The goal of this study was to determine users' privacy concerns, whether these privacy concerns can be checked against privacy policies by an existing Web-based privacy agent, and whether users are able to easily specify their privacy preferences using this agent.

Usable Mandatory Integrity Protection for Operating Systems

Enclave and Network Security
Ninghui Li, Ziqing Mao and Hong Chen
Existing mandatory access control systems for operating systems are difficult to use. We identify several principles for designing usable access control systems and introduce the Usable Mandatory Integrity Protection (UMIP) model that adds usable mandatory access control to operating systems. The UMIP model is designed to preserve system integrity in the face of network-based attacks. The usability goals for UMIP are two-fold. First, configuring a UMIP system should not be more difficult than installing and configuring an operating system. Second, existing applications and common usage practices can still be used under UMIP. UMIP has several novel features to achieve these goals. For example, it introduces several concepts for expressing partial trust in programs. Furthermore, it leverages information in the existing discretionary access control mechanism to derive file labels for mandatory integrity protection. We also discuss our implementation of the UMIP model for Linux using the Linux Security Modules framework, and show that it is simple to configure, has low overhead, and effectively defends against a number of network-based attacks.

Wireless Security Assessment Tool Development

Security Awareness, Education and Training
Nathaniel Husted, Katie Richardson, Stephen Schuler, David Dellacca, Connie Justice
The objective of this project is to create a sustainable and reusable model for assessing the risks of currently installed and future installations of wireless local area networks (LANs) in business environments. Due to the exponential growth of wireless technology and increasing legislation on related data security issues, the lack of a holistic risk assessment methodology for wireless network security testing leaves no consistent way to determine the effectiveness of a wireless security implementation. The research is being conducted through literature reviews, study of existing industry-specific methodologies, and, finally, application of the newly developed methodology. The project will culminate in a presentation, a written paper, and dissemination of the methodology, reporting on the success of the specified tools and giving participating businesses guidance on current risks and preventive measures to implement.
