The Center for Education and Research in Information Assurance and Security (CERIAS)

Reports and Papers Archive

Analysis of port scanning attacks

CERIAS TR 2009-33
Yu Zhang
Download: PDF
Added 2012-12-11

Architectural approaches for code injection defense at the user and kernel levels

CERIAS TR 2009-34
Riley, Ryan
Download: PDF
Added 2012-12-11

Efficient query processing for rich and diverse real-time data

CERIAS TR 2009-35
Nehme, Rimma
Download: PDF
Added 2012-12-11


Forensic characterization of image capture devices

CERIAS TR 2009-38
Nitin Khanna
Download: PDF
Added 2012-12-11

Analysis of access control policies in operating systems

CERIAS TR 2009-37
Hong Chen
Download: PDF
Added 2012-12-11

A multi-layer approach towards high-performance wireless mesh networks

CERIAS TR 2007-107
Das, Saumitra
Download: PDF
Added 2012-12-11

Adaptive Virtual Distributed Environments for Shared Cyberinfrastructures

CERIAS TR 2007-108
Ruth, Paul
Download: PDF
Added 2012-12-11

Mitigation of control and data traffic attacks in wireless ad-hoc and sensor networks

CERIAS TR 2007-109
Issa Khalil
Download: PDF
Added 2012-12-11

An examination of user behavior for user re-authentication

CERIAS TR 2007-110
Pusara, Maja
Download: PDF
Added 2012-12-11

Privacy-preserving Access Control

CERIAS TR 2012-13
Zahid Pervaiz, Walid G. Aref, Arif Ghafoor, and Nagabhushana Prabhu
Download: PDF

Access control mechanisms protect sensitive information from unauthorized users. However, when sensitive information is shared and a Privacy Protection Mechanism (PPM) is not in place, an authorized insider can still compromise the privacy of a person, leading to identity disclosure. A PPM can use suppression and generalization to anonymize and satisfy privacy requirements, e.g., k-anonymity and l-diversity, against identity and attribute disclosure. However, the protection of privacy is achieved at the cost of the precision of authorized information. In this paper, we propose a privacy-preserving access control framework. The access control policies define selection predicates available to roles, while the privacy requirement is to satisfy k-anonymity or l-diversity. An additional constraint that needs to be satisfied by the PPM is the imprecision bound for each selection predicate. Techniques for workload-aware anonymization of selection predicates have been discussed in the literature. However, to the best of our knowledge, the problem of satisfying accuracy constraints for multiple roles has not been studied before. In our formulation of the aforementioned problem, we propose heuristics for anonymization algorithms and show empirically that the proposed approach satisfies imprecision bounds for more permissions and has lower total imprecision than the current state of the art.

Added 2012-10-02
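
A minimal Python sketch of the notions behind CERIAS TR 2012-13 above: checking k-anonymity over generalized records and measuring the imprecision a generalization adds to one selection predicate. It is not taken from the paper; the records, quasi-identifier encoding, and thresholds are hypothetical.

    from collections import Counter

    # Each record's quasi-identifier is generalized to a cell (age_lo, age_hi, zip_prefix).
    records = [
        (20, 29, "479"), (20, 29, "479"), (20, 29, "479"),
        (30, 39, "479"), (30, 39, "479"), (30, 39, "479"),
    ]

    def is_k_anonymous(cells, k):
        """Every generalized cell must contain at least k records."""
        return all(count >= k for count in Counter(cells).values())

    def imprecision(cells, lo, hi):
        """Extra tuples a query over age range [lo, hi] may return because of
        generalization: records whose cell overlaps the range minus records
        whose cell lies entirely inside it."""
        overlap = sum(1 for a, b, _ in cells if not (b < lo or a > hi))
        certain = sum(1 for a, b, _ in cells if lo <= a and b <= hi)
        return overlap - certain

    print(is_k_anonymous(records, k=3))        # True: each cell holds 3 records
    print(imprecision(records, lo=25, hi=34))  # 6: every cell overlaps, none is contained

An imprecision bound for a role's selection predicate, as used in the paper, would be a cap on this last quantity that the anonymization algorithm must respect.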

Privacy Preserving Access Control on Third-Party Data Management Systems

CERIAS TR 2012-12
Mohamed Nabeel
Download: PDF

The tremendous growth in electronic media has made the publication of information in either open or closed environments easy and effective. However, most application domains (e.g., electronic health records (EHRs)) require that fine-grained selective access to information be enforced in order to comply with legal requirements, organizational policies, subscription conditions, and so forth. The problem becomes challenging with the increasing adoption of cloud computing technologies, where sensitive data reside outside of organizational boundaries. An important issue in utilizing third-party data management systems is how to selectively share data based on fine-grained attribute-based access control policies and/or expressive subscription queries while assuring the confidentiality of the data and the privacy of users from the third party.

In this thesis, we address the above issue under two of the most popular dissemination models: the pull-based service model and the subscription-based publish-subscribe model. Encryption is a commonly adopted approach to assure the confidentiality of data in such systems. However, the challenge is to support fine-grained policies and/or expressive content filtering using encryption while preserving the privacy of users. We propose several novel techniques, including an efficient and expressive group key management scheme, to overcome this challenge and construct privacy-preserving dissemination systems.

Added 2012-09-04
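
A rough illustration of the selective-sharing-by-encryption idea summarized in CERIAS TR 2012-12 above: each item is encrypted under a per-group key, and a user receives only the keys its attributes entitle it to. This is not the thesis's group key management scheme; it assumes the third-party Python package cryptography is installed, and the groups, attributes, and policy are hypothetical.

    from cryptography.fernet import Fernet

    # One symmetric key per access group; only group members receive it,
    # so the third-party server stores ciphertext it cannot read.
    group_keys = {"cardiology_staff": Fernet.generate_key(),
                  "billing_staff":    Fernet.generate_key()}

    def publish(record: bytes, group: str) -> bytes:
        """The owner encrypts under the group key before uploading to the third party."""
        return Fernet(group_keys[group]).encrypt(record)

    def keys_for(attributes: set) -> dict:
        """Toy attribute-based policy mapping a user's attributes to group keys."""
        granted = {}
        if {"doctor", "cardiology"} <= attributes:
            granted["cardiology_staff"] = group_keys["cardiology_staff"]
        if "billing" in attributes:
            granted["billing_staff"] = group_keys["billing_staff"]
        return granted

    ciphertext = publish(b"EHR: patient 17, cardiac MRI", "cardiology_staff")
    alice = keys_for({"doctor", "cardiology"})
    print(Fernet(alice["cardiology_staff"]).decrypt(ciphertext))  # Alice can read it
    print("billing_staff" in alice)                               # False: no billing key

The thesis goes further by supporting expressive policies and hiding from the third party which users hold which keys; the sketch shows only the baseline key-per-group structure.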

Practical Automatic Determination of Causal Relationships in Software Execution Traces

CERIAS TR 2011-24
Sundararaman Jeyaraman
Download: PDF

From the system investigator who needs to analyze an intrusion (“how did the intruder break in?”) to the forensic expert who needs to investigate digital crimes (“did the suspect commit the crime?”), security experts frequently have to answer questions about the cause-effect relationships between the various events that occur in a computer system. The implications of using causality determination techniques with low accuracy range from slowing down incident response to undermining the evidence unearthed by forensic experts.

This dissertation presents research done in two areas: (1) We present an empirical study evaluating the accuracy and performance overhead of existing causality determination techniques. Our study shows that existing causality determination techniques are either accurate or efficient, but seldom both. (2) We propose a novel approach to causality determination based on coarse-grained observation of control-flow of program execution. Our evaluation shows that our approach is both practical in terms of low runtime overhead and accurate in terms of low false positives and false negatives.

Added 2012-08-03
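
A toy Python sketch of causality determination over an execution trace, for readers of CERIAS TR 2011-24 above: events are linked when they share an actor or when one reads an object another wrote, and those links are walked backwards from a suspicious event. This is not the dissertation's control-flow-based technique, and the trace format and events are hypothetical.

    def causal_ancestors(trace, target_index):
        """Return indices of events the target event transitively depends on.
        Each event is (actor, action, obj); a read depends on earlier writes
        to the same object, and every event depends on earlier events by its actor."""
        deps = set()
        frontier = [target_index]
        while frontier:
            i = frontier.pop()
            actor, action, obj = trace[i]
            for j in range(i - 1, -1, -1):
                a2, act2, obj2 = trace[j]
                same_actor = (a2 == actor)
                wrote_input = (action == "read" and act2 == "write" and obj2 == obj)
                if (same_actor or wrote_input) and j not in deps:
                    deps.add(j)
                    frontier.append(j)
        return sorted(deps)

    trace = [
        ("sshd", "write", "/tmp/payload"),    # 0: attacker drops a file
        ("cron", "write", "/var/log/cron"),   # 1: unrelated activity
        ("bash", "read",  "/tmp/payload"),    # 2: the payload is read ...
        ("bash", "write", "/etc/passwd"),     # 3: ... and system state is modified
    ]
    print(causal_ancestors(trace, 3))  # [0, 2]: the unrelated cron event is excluded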

Privacy Preserving Delegated Access Control in Public Clouds

CERIAS TR 2012-11
Mohamed Nabeel, Elisa Bertino
Download: PDF

Current approaches to enforcing fine-grained access control on confidential data hosted in the cloud are based on fine-grained encryption of the data. Under such approaches, data owners are in charge of encrypting the data before uploading them to the cloud and re-encrypting the data whenever user credentials or authorization policies change. Data owners thus incur high communication and computation costs. A better approach should delegate the enforcement of fine-grained access control to the cloud, so as to minimize the overhead at the data owners while assuring data confidentiality from the cloud. We propose an approach, based on two layers of encryption, that addresses this requirement. Under our approach, the data owner performs a coarse-grained encryption, whereas the cloud performs a fine-grained encryption on top of the owner-encrypted data. A challenging issue is how to decompose access control policies (ACPs) so that the two-layer encryption can be performed. We show that this problem is NP-complete and propose novel optimization algorithms. We utilize an efficient group key management scheme that supports expressive ACPs. Our system assures the confidentiality of the data and preserves the privacy of users from the cloud while delegating most of the access control enforcement to the cloud.

Added 2012-07-17
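
A minimal sketch of the two-layer encryption idea described in CERIAS TR 2012-11 above: the owner applies one coarse-grained layer, the cloud adds a fine-grained layer per access control policy, and an authorized user peels off both. This is only an illustration, not the paper's construction (which derives keys through group key management rather than handing the owner key to users directly); it assumes the cryptography package, and the policy names and data are hypothetical.

    from cryptography.fernet import Fernet

    owner_key = Fernet.generate_key()                    # coarse-grained (owner) layer
    cloud_keys = {"role:nurse": Fernet.generate_key()}   # fine-grained (cloud) layer per ACP

    def owner_upload(data: bytes) -> bytes:
        """The owner encrypts once; policy-change re-encryption is left to the cloud."""
        return Fernet(owner_key).encrypt(data)

    def cloud_store(owner_ct: bytes, policy: str) -> bytes:
        """The cloud adds the policy-specific layer on top of the owner ciphertext."""
        return Fernet(cloud_keys[policy]).encrypt(owner_ct)

    def user_read(two_layer_ct: bytes, policy: str) -> bytes:
        """A user authorized for the policy holds both layer keys and removes them in order."""
        inner = Fernet(cloud_keys[policy]).decrypt(two_layer_ct)
        return Fernet(owner_key).decrypt(inner)

    ct = cloud_store(owner_upload(b"lab result: negative"), "role:nurse")
    print(user_read(ct, "role:nurse"))  # b'lab result: negative'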

Privacy Risk and Scalability of Differentially-Private Data Anonymization

CERIAS TR 2012-10
Mohamed R. Fouad
Download: PDF

Although data disclosure is advantageous for many obvious reasons, it may incur some risk resulting from potential security breaches. An example of such a privacy violation occurs when an adversary reconstructs the original data using additional information. Moreover, sharing private information such as address and telephone number in social networks is always subject to potential misuse. In this dissertation, we address both the scalability and the privacy risk of data anonymization. We develop a framework that assesses the relationship between the disclosed data and the resulting privacy risk, and we use it to determine the optimal set of transformations that need to be performed before data is disclosed. We propose a scalable algorithm that satisfies differential privacy when a specific random sampling is applied.

The main contribution of this dissertation is three-fold: (i) we show that determining the optimal transformations is an NP-hard problem and propose a few approximation heuristics, which we justify experimentally; (ii) we propose a personalized anonymization technique based on an aggregate (Lagrangian) formulation and prove that it can be solved in polynomial time; and (iii) we show that combining the proposed aggregate formulation with specific sampling gives an anonymization algorithm that satisfies differential privacy. Our results rely heavily on exploring the supermodularity properties of the risk function, which allow us to employ techniques from convex optimization. Finally, we use the proposed model to assess the risk of private information sharing in social networks.

Through experimental studies we compare our proposed algorithms with other anonymization schemes in terms of both time and privacy risk. We show that the proposed algorithm is scalable. Moreover, we compare the performance of the proposed approximate algorithms with the optimal algorithm and show that the sacrifice in risk is outweighed by the gain in efficiency.

Added 2012-07-04
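
As background for the differential privacy guarantee referred to in CERIAS TR 2012-10 above, the sketch below shows the textbook Laplace mechanism for a count query. The dissertation's own algorithm (risk-driven transformations combined with random sampling) is different; the data and the epsilon value here are hypothetical.

    import math
    import random

    def dp_count(records, predicate, epsilon: float) -> float:
        """Release the count of records matching the predicate with Laplace noise.
        A count query has sensitivity 1, so noise of scale 1/epsilon yields
        epsilon-differential privacy."""
        true_count = sum(1 for r in records if predicate(r))
        u = random.random() - 0.5  # uniform on [-0.5, 0.5)
        noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
        return true_count + noise

    ages = [23, 31, 37, 44, 52, 61]
    print(dp_count(ages, lambda a: a >= 40, epsilon=0.5))  # noisy count near the true value 3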