Reports and Papers Archive

A Provenance-Aware Multi-Dimensional Reputation System For Online Rating Systems

CERIAS TR 2017-2

Online rating systems are widely accepted means of quality assessment on the web, and users increasingly rely on them when deciding whether to purchase an item online. This makes such systems frequent targets of manipulation through unfair rating scores. Providing useful, realistic rating scores while detecting unfair behavior is therefore highly important. Existing solutions are mostly majority-based, sometimes augmented with temporal analysis and clustering techniques, yet they remain vulnerable to unfair ratings. They also ignore the distance between options, the provenance of information, and the different dimensions of cast rating scores when computing aggregate rating scores and the trustworthiness of users. In this paper, we propose a robust iterative algorithm that leverages information in user profiles and the provenance of information, and that takes into account the distance between options, to provide both more robust and more informative rating scores for items and trustworthiness scores for users. We also prove convergence of iterative ranking algorithms under very general assumptions, which are satisfied by the algorithm proposed in this paper. We have implemented and tested our rating method on both simulated data and four real-world datasets from various applications of reputation systems. The experimental results demonstrate that our model provides realistic rating scores even in the presence of a massive number of unfair ratings and outperforms well-known ranking algorithms.
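The iterative idea described above can be sketched as follows. This is a deliberately simplified illustration, not the paper's exact algorithm: item scores are trust-weighted averages of ratings, and a user's trust decreases with the distance between their ratings and the current consensus. All function and variable names are hypothetical.

```python
def iterate_reputation(ratings, n_iters=50):
    """Toy iterative reputation sketch.

    ratings: dict mapping (user, item) -> rating score in [0, 1].
    Returns (item scores, user trust values) after n_iters iterations.
    """
    users = {u for u, _ in ratings}
    items = {i for _, i in ratings}
    trust = {u: 1.0 for u in users}   # start by trusting everyone equally
    score = {i: 0.5 for i in items}
    for _ in range(n_iters):
        # Item score: trust-weighted average of its ratings.
        for i in items:
            num = sum(trust[u] * r for (u, j), r in ratings.items() if j == i)
            den = sum(trust[u] for (u, j) in ratings if j == i)
            score[i] = num / den
        # User trust: 1 minus mean absolute distance from the consensus,
        # so users who deviate heavily lose influence in the next round.
        for u in users:
            errs = [abs(r - score[j]) for (v, j), r in ratings.items() if v == u]
            trust[u] = max(1e-6, 1.0 - sum(errs) / len(errs))
    return score, trust
```

With two honest raters and one unfair rater on an item, the fixed point pulls the item's score toward the honest majority while the unfair rater's trust collapses.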

Automated Differential Testing for Energy-Efficient Control Software

CERIAS TR 2018-01

Cyber-physical systems (CPS) are integrated systems of computer-based algorithms and physical components interacting with environmental effects. In such systems, autonomous behavior and overall performance depend mainly on the control software, so it is crucial to test and analyze the control software of the CPS from various perspectives. One critical perspective is energy efficiency, because many cyber-physical systems (e.g., unmanned aerial vehicles, autonomous cars, health-care devices) operate on limited energy sources such as batteries. In this paper, we propose CPSDiff: an energy-aware differential testing framework that generates test inputs to expose the maximal difference in energy consumption between two control programs. Our test generation technique uses meta-heuristic search to find the input that maximizes the energy consumption difference, and its difference-revealing ability outperforms both random search and hill-climbing search. Our evaluation of two popular unmanned aerial vehicle control programs provides a detailed comparison of their energy consumption under identical conditions in a universal robotics simulator; CPSDiff found an input exposing a maximum battery consumption difference of around 47%.
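The search idea can be illustrated with a minimal hill-climbing baseline of the kind the abstract compares against. This is a toy sketch under stated assumptions: the two "energy models" below are hypothetical stand-in functions, not CPSDiff's simulator-backed measurements.

```python
import random

def energy_a(x):
    return sum(v * v for v in x)      # stand-in energy model for program A

def energy_b(x):
    return sum(abs(v) for v in x)     # stand-in energy model for program B

def hill_climb(dim=3, steps=200, seed=0):
    """Hill-climb over an input vector in [-1, 1]^dim to maximize the
    energy consumption difference |E_A(x) - E_B(x)|."""
    rng = random.Random(seed)
    x = [rng.uniform(-1, 1) for _ in range(dim)]
    best = abs(energy_a(x) - energy_b(x))
    for _ in range(steps):
        # Perturb the current input, clamped to the valid input bounds.
        cand = [max(-1.0, min(1.0, v + rng.gauss(0, 0.1))) for v in x]
        diff = abs(energy_a(cand) - energy_b(cand))
        if diff > best:               # keep the move only if it improves
            x, best = cand, diff
    return x, best
```

A meta-heuristic such as simulated annealing or a genetic algorithm replaces the greedy acceptance rule to escape the local optima that trap this baseline.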

Privacy-Preserving Analysis with Applications to Textual Data

CERIAS TR 2017-01

Textual data plays a very important role in decision making and scientific research, but it cannot be shared freely if it contains personally identifiable information. In this dissertation, we consider the problem of privacy-preserving text analysis under the strong privacy guarantee of differential privacy.

We first show how to build a two-party differentially private secure protocol for computing the similarity of text in the presence of malicious adversaries. We then relax the requirement to computational differential privacy to reduce computational cost, while still guaranteeing security against rational adversaries.

Next, we consider the problem of building a data-oblivious algorithm for minimum weighted matching in bipartite graphs, which has applications to computing secure semantic similarity of documents. We also propose a secure protocol for detecting articulation points in graphs. We then relax the strong data-obliviousness definition to give $\epsilon$-data-obliviousness based on the notion of indistinguishability, which helps us to develop efficient protocols for data-dependent algorithms like frequent itemset mining.
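The data-oblivious property mentioned above can be illustrated with a tiny example. This is my own illustration of the concept, not the dissertation's matching protocol: in a sorting network, the sequence of memory locations compared and written is fixed in advance, so an observer of the access pattern learns nothing about the data.

```python
def oblivious_swap(a, i, j):
    # Reads a[i] and a[j] and writes both back regardless of their order,
    # so the access pattern is identical for every input.
    lo, hi = min(a[i], a[j]), max(a[i], a[j])
    a[i], a[j] = lo, hi

def oblivious_sort4(a):
    """Sort a 4-element list with a fixed comparator sequence (Batcher's
    even-odd merge network for n=4). The (i, j) pairs touched never depend
    on the input values, unlike data-dependent algorithms such as quicksort."""
    for i, j in [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]:
        oblivious_swap(a, i, j)
    return a
```

Data-dependent algorithms like frequent itemset mining cannot be made oblivious this cheaply, which is the motivation for the relaxed $\epsilon$-data-obliviousness notion.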

Finally, we consider the problem of privacy-preserving classification of text. A main problem in developing private protocols for unstructured data is high dimensionality. This dissertation tackles high dimensionality by means of differentially private feature selection. We show that some well-known feature selection techniques perform poorly due to high sensitivity, and we propose techniques that perform well in a differentially private setting. The feature selection techniques are empirically evaluated using differentially private classifiers: naïve Bayes, support vector machines, and decision trees.
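One standard way to make feature selection differentially private, sketched below for illustration (my simplification, not necessarily the dissertation's exact method), is to add Laplace noise to each feature's document-frequency count. Adding or removing one document changes each count by at most 1, so the per-count sensitivity is 1, and selecting from k counts splits the budget as epsilon/k per count by sequential composition.

```python
import math
import random

def laplace(scale, rng):
    # Inverse-CDF sampling of Laplace(0, scale).
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_top_features(doc_freq, m, epsilon, seed=0):
    """Pick m features under epsilon-differential privacy.

    doc_freq: dict mapping feature -> document-frequency count.
    """
    rng = random.Random(seed)
    # Sensitivity 1 per count, budget epsilon/k per count -> scale = k/epsilon.
    scale = len(doc_freq) / epsilon
    noisy = {f: c + laplace(scale, rng) for f, c in doc_freq.items()}
    return sorted(noisy, key=noisy.get, reverse=True)[:m]
```

A high-sensitivity score (e.g., one that can change by much more than 1 per document) would force proportionally larger noise, which is why such techniques perform poorly in the private setting.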

Securing Cloud-Based Data Analytics: A Practical Approach

CERIAS TR 2016-9

The ubiquitous nature of computers is driving a massive increase in the amount of data generated by humans and machines. The shift to cloud technologies is a paradigm change that offers considerable financial and administrative gains in the effort to analyze these data. However, governmental and business institutions wanting to tap into these gains are concerned with security issues. The cloud presents new vulnerabilities and is dominated by new kinds of applications, which calls for new security solutions. In the direction of analyzing massive amounts of data, tools like MapReduce, Apache Storm, Dryad and higher-level scripting languages like Pig Latin and DryadLINQ have significantly improved corresponding tasks for software developers. The equally important aspect of securing computations performed by these tools and ensuring confidentiality of data has seen very little support emerge for programmers. In this dissertation, we present solutions to (a) secure computations being run in the cloud by leveraging BFT replication coupled with fault isolation and (b) secure data from being leaked by computing directly on encrypted data. For securing computations (a), we leverage a combination of variable-degree clustering, approximated and offline output comparison, smart deployment, and separation of duty to achieve a parameterized tradeoff between fault tolerance and overhead in practice. We demonstrate the low overhead achieved with our solution when securing data-flow computations expressed in Apache Pig and Hadoop. Our solution allows assured computation with less than 10 percent latency overhead as shown by our evaluation. For securing data (b), we present novel data flow analyses and program transformations for Pig Latin and Apache Storm, that automatically enable the execution of corresponding scripts on encrypted data.
We avoid fully homomorphic encryption because of its prohibitively high cost; instead, in some cases, we rely on a minimal set of operations performed by the client. We present the algorithms used for this translation, and empirically demonstrate the practical performance of our approach as well as improvements for programmers in terms of the effort required to preserve data confidentiality.
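One common building block behind such approaches, shown here purely for illustration (not the dissertation's actual program transformation; the key and data are hypothetical, and an HMAC-based pseudorandom function stands in for deterministic encryption), is that deterministic encryption maps equal plaintexts to equal ciphertexts, letting an untrusted engine group or join on ciphertexts alone.

```python
import hashlib
import hmac

KEY = b"client-secret-key"   # hypothetical key held only by the client

def det_encrypt(value):
    # Deterministic tokenization: an HMAC acting as a pseudorandom function,
    # so the server never sees the plaintext but equality is preserved.
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

def group_count(encrypted_rows):
    """Server-side GROUP BY ... COUNT computed over ciphertexts only."""
    counts = {}
    for token in encrypted_rows:
        counts[token] = counts.get(token, 0) + 1
    return counts

# The client encrypts; the untrusted server aggregates without the key.
rows = ["apple", "pear", "apple"]
tokens = [det_encrypt(r) for r in rows]
result = group_count(tokens)
```

Operations that equality-preserving encryption cannot support (e.g., arbitrary arithmetic) are the cases where a minimal set of client-side operations steps in.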

Hardware Accelerated Lattice Based Authentication

The Internet of Things (IoT) is growing at a rapid pace, and it is essential that communication between these resource-constrained IoT devices be secure and efficient. Traditional authentication schemes like RSA and ECDSA do not meet the high-throughput and time-critical needs of these devices. In this research, we propose a hardware-accelerated lattice-based authentication scheme tailored specifically to constrained IoT devices. We show that our scheme provides lower latency and higher throughput than traditional authentication schemes and its CPU-only counterparts.

Convicted by Memory: Automatically Recovering Spatial-Temporal Evidence from Memory Images

Memory forensics can reveal “up to the minute” evidence of a device’s usage, often without requiring a suspect’s password to unlock the device, and it is oblivious to any persistent storage encryption schemes, e.g., whole disk encryption. Prior to my work, researchers and investigators alike considered data-structure recovery the ultimate goal of memory image forensics. This, however, was far from sufficient, as investigators were still largely unable to understand the content of the recovered evidence, and hence efficiently locating and accurately analyzing such evidence locked in memory images remained an open research challenge. In this dissertation, I propose breaking from traditional data-recovery-oriented forensics, and instead I present a memory forensics framework which leverages program analysis to automatically recover spatial-temporal evidence from memory images by understanding the programs that generated it. This framework consists of four techniques, each of which builds upon the discoveries of the previous, that represent this new paradigm of program-analysis-driven memory forensics. First, I present DSCRETE, a technique which reuses a program’s own interpretation and rendering logic to recover and present in-memory data structure contents. Following that, VCR developed vendor-generic data structure identification for the recovery of in-memory photographic evidence produced by an Android device’s cameras. GUITAR then realized an app-independent technique which automatically reassembles and redraws an app’s GUI from the multitude of GUI data elements found in a smartphone’s memory image. Finally, different from any traditional memory forensics technique, RetroScope introduced the vision of spatial-temporal memory forensics by retargeting an Android app’s execution to recover sequences of previous GUI screens, in their original temporal order, from a memory image.
This framework, and the new program analysis techniques which enable it, have introduced encryption-oblivious forensics capabilities far exceeding traditional data-structure recovery.