Fingerprint verification is a commonly used modality in biometric identification. As biometric fingerprint verification continues to be incorporated into many facets of worldwide society, it is prudent that multiple factors in image acquisition be accepted as industry standards to ensure that the information security community can seamlessly integrate these technologies.
There is also a need to better understand how fingerprint recognition systems can match against multiple fingers without creating the potential for greater security holes.
The impetus for this paper is to better understand how combinations of multiple fingers affect match scores against a common threshold and to determine, if one exists, an optimal number and combination of fingers to match against that minimizes the likelihood of both false accepts and false rejects.
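As a toy illustration of the idea, the following Python sketch enumerates finger combinations, fuses per-finger match scores by averaging, and compares the fused score against a common threshold; the finger names, scores, fusion rule, and threshold are all hypothetical, not the paper's actual protocol:

```python
from itertools import combinations
from statistics import mean

def best_combination(finger_scores, k, threshold):
    """Enumerate every k-finger subset, fuse scores by averaging, and return
    the best subset, its fused score, and the accept/reject decision."""
    best = max(combinations(finger_scores, k),
               key=lambda combo: mean(finger_scores[f] for f in combo))
    fused = mean(finger_scores[f] for f in best)
    return best, fused, fused >= threshold

# Hypothetical per-finger match scores on a 0-100 scale.
scores = {"R-thumb": 72, "R-index": 85, "L-index": 64, "L-thumb": 58}
combo, fused, accepted = best_combination(scores, 2, threshold=75)
```

Sweeping `k` from 1 to the number of enrolled fingers in this way is one simple route to the "optimal number and combination" question posed above.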
As the convergence between our physical and digital worlds continues at a rapid pace, much of our information is becoming available online. In this paper we develop a novel taxonomy of methods and techniques that can be used to protect digital information. We discuss how information has been protected and show how we can structure our methods to achieve better results. We explore complex relationships among protection techniques ranging from denial and isolation, to degradation and obfuscation, through negative information and deception, ending with adversary attribution and counter-operations. We present an analysis of these relationships and discuss how they can be applied at different scales within organizations. We also identify some areas that are worth further investigation. We map these protection techniques against the cyber kill-chain model and discuss our findings.
Moreover, we identify the use of deceptive information as a protection method that can significantly enhance the security of systems. We posit that the well-known Kerckhoffs’s principle has been misinterpreted to drive the security community away from deception-based mechanisms. We examine the advantages these techniques can have when protecting our information, in addition to the traditional methods of hiding and hardening. We show that by intelligently introducing deceptive information into information systems, we not only lead attackers astray, but also give organizations the ability to detect leakage; create doubt and uncertainty in any leaked data; add risk on the adversaries’ side to using the leaked information; and significantly enhance our ability to attribute adversaries. We discuss how to overcome some of the challenges that hinder the adoption of deception-based techniques and present some recent work, our own contributions, and promising directions for future research.
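A minimal sketch of one deception-based mechanism in this family, detecting leakage via planted decoy data: the snippet below mints honeytoken credentials whose trailing tag is an HMAC the defender can later recognize in outbound traffic. The secret key, label format, and fixed nonce are illustrative assumptions, not details from the paper:

```python
import hashlib
import hmac

SECRET = b"org-signing-key"  # hypothetical organization-wide detection secret

def make_honeytoken(label, nonce="deadbeef"):
    """Mint a decoy credential whose trailing tag is an HMAC we can recognize.
    (The nonce is fixed here for reproducibility; use a random one in practice.)"""
    tag = hmac.new(SECRET, (label + nonce).encode(), hashlib.sha256).hexdigest()[:16]
    return f"{label}-{nonce}-{tag}"

def is_honeytoken(candidate):
    """Flag a string seen in outbound traffic if it is one of our decoys."""
    try:
        label, nonce, tag = candidate.rsplit("-", 2)
    except ValueError:
        return False
    expected = hmac.new(SECRET, (label + nonce).encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(tag, expected)
```

Because the decoy carries no real value, any sighting of it is evidence of leakage, which is exactly the detection benefit claimed above.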
Due to resource constraints, unattended operating environments, and communication phenomena, Wireless Sensor Networks (WSNs) are susceptible to operational failures and security attacks. However, WSNs must be able to continuously provide their services despite anomalies or attacks and to effectively recover from attacks. In this paper, we propose Kinesis, the first systematic security incident response and prevention system for WSNs. We take a declarative approach to support the specification of response policies, based on which Kinesis selects its response actions. The system is distributed in nature, dynamic in its actions depending on the context, quick and effective in response, and secure. We implement Kinesis in TinyOS. Testbed experiments and extensive TOSSIM simulations show that the system successfully counteracts anomalies/attacks and behaves consistently under various attack scenarios and rates.
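A minimal sketch of the declarative-policy idea, with invented event fields and action names (not Kinesis's actual policy language): each policy pairs a predicate over an observed event with a response action, and the first matching policy determines the response.

```python
# Illustrative policies: (predicate over the event, response action).
# The first matching policy wins; field names and actions are made up.
POLICIES = [
    (lambda e: e["type"] == "selective_forwarding" and e["severity"] >= 3,
     "isolate_node"),
    (lambda e: e["type"] == "selective_forwarding", "reroute_traffic"),
    (lambda e: e["type"] == "battery_depletion", "reduce_duty_cycle"),
]

def select_response(event, policies=POLICIES):
    """Return the action of the first policy whose predicate matches the event."""
    for predicate, action in policies:
        if predicate(event):
            return action
    return "log_only"  # default when no policy applies
```

The context-dependence described above corresponds to predicates reading event attributes (here, `severity`), so the same anomaly type can trigger different actions.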
Today’s large datasets are a major hindrance to digital investigations and have led to a substantial backlog of media that must be examined. While this media sits idle, the associated investigation also stalls, introducing investigative time lag. This study created a client/server application architecture that operated on an existing pool of internally networked Windows 7 machines. This distributed digital forensic approach helps to address the scalability concerns of other approaches while also being financially feasible. Text-search runtimes and match counts were evaluated using several scenarios, including a 100 GB image with prefabricated data. When compared to FTK 4.1, a 125-times speedup was observed in the best case and a three-times speedup in the worst case. These rapid search times largely obviate the need for lengthy indexing processes when analyzing digital evidence, allowing for faster digital investigations.
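A toy sketch of the coordinator/worker split behind such distributed text search: the coordinator fans evidence chunks out to workers, each worker scans its chunk, and the match counts are summed. The chunking, matching rule, and thread-based workers are illustrative assumptions; the study's actual system ran across networked machines.

```python
from concurrent.futures import ThreadPoolExecutor

def search_chunk(chunk, term):
    """Worker side: count case-insensitive occurrences of term in one chunk."""
    return chunk.lower().count(term.lower())

def distributed_search(chunks, term, workers=4):
    """Coordinator side: fan chunks out to workers and sum per-chunk counts."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(lambda c: search_chunk(c, term), chunks))

# Illustrative corpus standing in for partitioned disk-image data.
corpus = ["Evidence found on disk image.", "No match here.", "evidence of EVIDENCE"]
total = distributed_search(corpus, "evidence")  # 1 + 0 + 2 occurrences
```

Because each chunk is scanned independently, adding machines divides the wall-clock search time, which is the scalability property the study exploits instead of up-front indexing.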
The movement of people and goods around the world is certainly no easy task and is heavily reliant on the intricate maneuvering of systems, processes, people and, more than ever, technology. These systems for shipping goods and transporting people are critical to the world economy, particularly to the over 315 million people living in the United States. The importance of shipping and transportation infrastructure to the United States is reflected in its inclusion on the Critical Infrastructure and Key Resources (CIKR) list created and maintained by the United States federal government. According to 2009 data from the Bureau of Transportation Statistics, this multi-trillion dollar industry consists of maritime, aviation, and ground transportation systems including road and railways, all of which are at the core of transportation operations. The world’s growing reliance on advanced information technology has introduced opportunities for cyber attacks that exploit vulnerabilities in information technology enabled systems. This report analyzes documented cyber attacks on the shipping and transportation industry and discusses the potential impacts of that analysis on U.S. shipping and transportation infrastructure.
A hybrid signcryption approach can efficiently encapsulate new keys and securely transmit data for various applications such as Advanced Metering Infrastructures (AMIs) and Wireless Sensor Networks (WSNs). However, since most hybrid signcryption approaches rely on traditional PKI with certificates issued by a trusted Certificate Authority (CA), they require certificate management. Although Identity-based Public Key Cryptography (ID-PKC) was introduced to eliminate the dependency on explicit certificates, it suffers from the key escrow problem because the Key Generation Center (KGC) stores the private keys of all users. To resolve these drawbacks, Certificateless Public Key Cryptography (CL-PKC) was introduced; it splits the user’s private key into two parts: a partial private key generated by the KGC, and a secret value selected by the user. CL-PKC overcomes the key escrow problem because the KGC is unable to access the user’s secret value. Cryptographic operations based on CL-PKC, such as decryption or digital signing, can be performed only when a valid user holds both the partial private key and the secret value.
Recently, the concept of certificateless hybrid signcryption (CL-HSC) evolved by combining the ideas of tag-KEM-based signcryption and certificateless cryptography. However, existing CL-HSC schemes are not secure against existential forgery attacks and are constructed using bilinear pairing operations. In spite of recent advances in implementation techniques, the computational cost of a pairing operation remains considerably higher than that of standard operations such as ECC point multiplication. In this technical report, we propose an elliptic curve cryptography based certificateless hybrid signcryption (CL-HSC) scheme without pairing operations. We present the formal security model of our CL-HSC scheme and then provide security proofs against both adaptive chosen ciphertext attack and existential forgery in the appropriate security models for certificateless hybrid signcryption. Since our CL-HSC scheme does not depend on pairing operations, it reduces computational overhead; and because it adopts Elliptic Curve Cryptography (ECC), it benefits from keys defined on an additive group, where a 160-bit ECC key is as secure as a 1024-bit RSA key.
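The certificateless key-splitting idea described above can be sketched as follows. For brevity the sketch works in the multiplicative group Z_p* with a public base g instead of an elliptic-curve group, and the identity hash, parameters, and master secret are toy choices; a real CL-HSC instantiation would use ECC point multiplication and a proper hash-to-group function.

```python
import secrets

# Toy group parameters standing in for an elliptic-curve group.
p = 0xFFFFFFFFFFFFFFC5   # the largest 64-bit prime
g = 5                    # public base element
n = p - 1                # stand-in for the group order

def kgc_partial_key(identity, master_secret):
    """KGC side: derive the user's partial private key from its identity."""
    h = int.from_bytes(identity.encode(), "big") % n  # toy identity hash
    return (master_secret * h) % n

def user_full_keypair(d_partial):
    """User side: combine the KGC's partial key with a self-chosen secret x.
    The KGC never learns x, so it cannot reconstruct the full private key:
    this is the certificateless escape from the key-escrow problem."""
    x = secrets.randbelow(n - 1) + 1
    d_full = (d_partial + x) % n
    return x, d_full, pow(g, d_full, p)

msk = 123456789  # toy KGC master secret
d_partial = kgc_partial_key("alice@example.com", msk)
x, d_full, pub = user_full_keypair(d_partial)
```

Decryption or signing requires `d_full`, which needs both halves, matching the CL-PKC requirement that a valid user hold the partial private key and the secret value.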
This study is part of a smart grid initiative providing electric vehicle charging infrastructure: a refueling structure combining an energy-generating photovoltaic system with charge-point electric vehicle charging stations. The system will utilize advanced design and technology allowing electricity to flow from the site’s normal electric service to the charging stations, or for the solar system to provide available electricity to fuel the cars, reducing drivers’ fuel bills. In cases where the charging stations do not need the electricity produced by the solar system, the excess electricity will be given back for use in the host facility or net metered back to the utility and credited to the meter owner’s monthly electric bill. Our proposed system is therefore capable of providing frequency and regulation services, harnessing a renewable source of energy, and providing both AC and DC charging options to electric vehicle owners, in addition to being able to function as a micro-grid.

The above-mentioned tasks cannot be completed without an IT communication infrastructure in place. As all these components will be connected together by a mesh of networks, there will undoubtedly be concerns regarding the security of the overall structure, especially in terms of the information that will be passed through the networks for different purposes. The security and privacy of this information become all the more important as they concern the safety and lives of everyone associated with this structure.

Thus the objective of this study is to examine and understand the various processes and domains that our system will have and how those domains, in other words the infrastructure of our system, can be secured. In our study we build a logical security architecture based on the Sherwood Applied Business Security Architecture (SABSA) model and draw conclusions as to whether the said model can be applied to more scenarios like ours in the future.
The prevalent need for publicly available datasets, coupled with the spate of privacy-related incidents pertaining to the release of such data, has spurred the development of resilient and accurate methods of privacy-preserving data publishing. In this dissertation, we consider the problem of private data publishing while satisfying the robust notion of differential privacy. In particular, we consider the scenario in which a trusted curator gathers sensitive information from a large number of respondents, creates a dataset where each tuple corresponds to one entity, and publishes a privacy-preserving synopsis of the dataset. The diverse nature of the datasets of interest prevents the development of a single general method of data publishing that works in all situations. We therefore develop differentially private synopsis mechanisms for various types of data. We start with the simplest data publishing scenario: publishing a single-dimensional histogram. We explore hierarchical approaches to publishing histograms and propose various optimizations. Next we consider two-dimensional datasets and propose grid-based approaches for publishing geospatial datasets. For datasets with more than just a few dimensions, we propose a framework for publishing k-way marginals and contingency tables while guaranteeing accuracy and consistency. Furthermore, for high-dimensional datasets, we propose a framework for frequent itemset mining while guaranteeing differential privacy. Finally, we explore relaxations to differential privacy in light of an adversary’s uncertainty about the dataset.
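A minimal sketch of the simplest scenario above, releasing a single-dimensional histogram under epsilon-differential privacy by adding per-bin Laplace noise (the bin counts are made up, and this is the baseline mechanism, not the dissertation's hierarchical optimizations):

```python
import math
import random

def laplace_noise(scale):
    """Draw Laplace(0, scale) noise via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def dp_histogram(counts, epsilon):
    """Release a one-dimensional histogram under epsilon-differential privacy.
    One respondent changes one bin by 1, so the L1 sensitivity is 1 and
    independent Laplace(1/epsilon) noise per bin suffices."""
    scale = 1.0 / epsilon
    return [c + laplace_noise(scale) for c in counts]

random.seed(7)  # reproducible demo
noisy = dp_histogram([40, 25, 10, 5], epsilon=1.0)
```

The hierarchical approaches studied in the dissertation improve on this baseline for range queries by also publishing noisy counts of groups of bins.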
The oil & gas industry is a multibillion-dollar industry with a history of conflict. As modern technology has developed, both the corporate and technical aspects of the oil & gas industry have become heavily reliant on the cyber domain. The inherently insecure origins and evolution of computing have turned that dependence into a severe vulnerability. Recent events have brought this fact to light with a deluge of “cyber attacks” launched globally against the industry. These attacks raise the specter of cyber conflict and the question of culpability. This report seeks to analyze a selection of these events, looking for patterns that would indicate one or more advanced actors. By observing the motives, means, and opportunities presented to actors, and looking at a cross section of these attacks over time, conclusions will be drawn as to the past, present, and future of cyber conflict within the industry.
Access control mechanisms and Privacy Protection Mechanisms (PPMs) have been proposed for data streams. Access control for data streams grants roles access to tuples satisfying an authorized predicate sliding-window query. When sensitive stream data is shared without a PPM, privacy can be compromised. The PPM meets privacy requirements, e.g., k-anonymity or l-diversity, by generalization of stream data. The imprecision introduced by generalization can be reduced by delaying the publishing of stream data. However, delaying the sharing of stream tuples to achieve better accuracy can lead to false negatives if the tuples slide out of the window before a sliding-window query predicate is evaluated by the access control mechanism. To set a threshold on the loss of precision, the access control mechanism defines an imprecision bound for each query. The challenge is to optimize the time duration for which the stream data is held by the PPM so that the imprecision bounds of the maximum possible number of queries are met. For this problem, we present hardness results, provide an anonymization algorithm, and conduct an experimental evaluation of the proposed algorithm. Experiments demonstrate that the proposed heuristic provides better precision than the minimum or maximum delay heuristics.
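The delay/window tension can be sketched in a few lines, under the illustrative assumption that a tuple arriving at time a is visible to a sliding-window evaluation at time t exactly when a <= t < a + window; the helper computes the largest publishing delay that avoids false negatives for a given set of evaluation times (this is a toy model, not the dissertation's anonymization algorithm):

```python
def max_safe_delay(arrival, window, eval_times):
    """Largest publishing delay that keeps a tuple arriving at `arrival`
    visible to every sliding-window evaluation that should see it, i.e.
    evaluations at times t with arrival <= t < arrival + window."""
    relevant = [t for t in eval_times if arrival <= t < arrival + window]
    if not relevant:
        return window  # no evaluation cares; hold up to the window length
    # The earliest relevant evaluation caps the delay: publish any later
    # and that evaluation misses the tuple (a false negative).
    return min(relevant) - arrival

d = max_safe_delay(arrival=10, window=5, eval_times=[12, 14, 20])
```

Longer holds allow the PPM to form better (less generalized) equivalence classes, so the optimization above is exactly the trade-off between imprecision and false negatives that the dissertation formalizes.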
In April 2007, an attack on the domain name systems and various servers of Estonia occurred; it was incorrectly called the first cyber war at the time and has since been more correctly referred to as a cyber riot. It was perpetrated by ethnic Russians living in Estonia who were incensed by the movement of a bronze war memorial for Russian soldiers from the center of town to a graveyard (Evron, 2008, p. 122). The cyber riot was nearly simultaneous with actual real-world riots. This brought the idea of cyber warfare and conflict in cyberspace into the public view.

It may be better to replace the idea of asymmetric threats with the idea of strategic threats, as the threat may not be asymmetric (e.g., large-scale cyber warfare) but the adversary’s strategy is asymmetric in effort. If, according to Clausewitz, we are willing to accept that there are two primary things needed for war (politically opposed will-power and the ability to field a military), then a missing element of cyber warfare may be the ability to field a military (Szafranski, 1990, p. 39).

This research states, “Given the unstructured domain of cyber warfare knowledge, a specific model will allow experts to produce a concept map significantly more detailed than absent the model.” Experts were solicited in a variety of venues to map cyber warfare using a concept mapping process and provide a deeper understanding of the concept. Two technology-centric models were given to groups of experts to assist them in explaining elements of cyber conflict. One group was given just the cyber warfare question and no specific model as guidance. The groups were then compared to see if either of the models had better explanatory power per the experts’ responses.
The tremendous growth in electronic media has made publication of information in either open or closed environments easy and effective. However, most application domains (e.g., electronic health records (EHRs)) require that fine-grained selective access to information be enforced in order to comply with legal requirements, organizational policies, subscription conditions, and so forth. The problem becomes challenging with the increasing adoption of cloud computing technologies, where sensitive data reside outside of organizational boundaries. An important issue in utilizing third-party data management systems is how to selectively share data based on fine-grained attribute-based access control policies and/or expressive subscription queries while assuring the confidentiality of the data and the privacy of users from the third party.

In this thesis, we address the above issue under two of the most popular dissemination models: the pull-based service model and the subscription-based publish-subscribe model. Encryption is a commonly adopted approach to assure confidentiality of data in such systems. However, the challenge is to support fine-grained policies and/or expressive content filtering using encryption while preserving the privacy of users. We propose several novel techniques, including an efficient and expressive group key management scheme, to overcome this challenge and construct privacy-preserving dissemination systems.
Ad hoc wireless networks are vulnerable to a wide range of security attacks, due to the ease with which nodes can be compromised and the cooperative nature of these networks. A solution approach widely used for defending these networks is behavior-based detection, in which nodes overhear communications in their neighborhood, exploiting the open nature of the wireless medium, and determine whether the behaviors of their neighbors are legitimate. An important issue with behavior-based detection in multi-channel ad hoc wireless networks concerns the channels on which monitoring nodes should overhear their neighbors’ communications.

In this dissertation, we develop a framework for behavior-based detection in multi-channel ad hoc wireless networks. We are interested in how to optimally place monitoring nodes and select the channels to tune their radios to. We show that the problem is NP-hard, then develop approximation algorithms. We show that one of our algorithms attains the best approximation ratio achievable among all polynomial-time algorithms. Also, we develop distributed channel assignment algorithms for large-scale and dynamic networks; their distributed nature allows them to scale to large networks. Further, we allow for imperfect detection, where monitoring nodes may probabilistically fail to detect malicious behaviors. For this scenario, we consider providing multiple covers to each node, thereby still maintaining the detection accuracy above a certain level. We evaluate our algorithms for random and scale-free networks and consider optimizations for practical deployment scenarios, such as when the network configuration is changing fast versus a relatively static network.
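To convey the flavor of the approximation involved, the following sketch applies the classic greedy set-cover heuristic to hypothetical (monitor, channel) choices, repeatedly picking the pair that overhears the most still-uncovered links. The coverage data is invented, and this is the textbook O(log n)-approximation, not the dissertation's best-ratio algorithm:

```python
def greedy_monitor_assignment(coverage):
    """Greedy set cover: coverage maps (monitor, channel) -> set of links
    overheard when that monitor tunes to that channel; repeatedly choose
    the pair covering the most still-uncovered links."""
    uncovered = set().union(*coverage.values())
    chosen = []
    while uncovered:
        pair = max(coverage, key=lambda p: len(coverage[p] & uncovered))
        if not coverage[pair] & uncovered:
            break  # remaining links are unreachable by any monitor
        chosen.append(pair)
        uncovered -= coverage[pair]
    return chosen

# Hypothetical links each monitor overhears on a given channel.
coverage = {
    ("m1", "ch1"): {"a-b", "b-c"},
    ("m2", "ch1"): {"b-c", "c-d"},
    ("m2", "ch2"): {"c-d", "d-e"},
    ("m3", "ch3"): {"d-e"},
}
plan = greedy_monitor_assignment(coverage)
```

The multiple-cover variant for imperfect detection can be obtained by requiring each link to appear in several chosen pairs rather than just one.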
Query-based web search is becoming an integral part of many people’s daily activities. Most do not realize that their search history can be used to identify them (and their interests). In July 2006, AOL released an anonymized search query log of some 600K randomly selected users. While valuable as a research tool, the anonymization was insufficient: individuals could be identified from the contents of the queries alone. We propose a client-centered approach based on deniable search: actual user queries are replaced with a set of k queries that hide the actual query. We formalize the definition of Deniable Search and develop two complementary schemes that achieve deniable privacy for web search.

In the first approach, Plausibly Deniable Search, cover queries have characteristics similar to the user query but on unrelated topics. The system ensures that any of these k queries will produce the same set of k queries, giving k possible topics the user could have been searching for. We also investigate a system for generating queries automatically by creating clusters of a query log in the semantic space. In the second approach, we ensure that sequences of queries (and cover queries) retain deniability. Real user queries share terms and topics in a sequence as the users refine queries. We extract features from user query logs and use them to generate cover query sequences that have similar properties. We evaluate the methods using a large query log and DMOZ categorized webpages.
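A toy sketch of the set-symmetry property in the first approach: topics are deterministically partitioned into groups of k, so issuing any group representative reproduces the same cover set and the observer sees k equally plausible topics. The clusters and queries below are invented, and the real system builds clusters of a query log in semantic space rather than using a hand-written table:

```python
CLUSTERS = {  # hypothetical topic clusters; first entry is the representative
    "finance": ["mortgage rates", "index funds"],
    "health":  ["flu symptoms", "knee pain"],
    "tech":    ["python tutorial", "gpu prices"],
    "travel":  ["flights to rome", "visa rules"],
}

def topic_of(query):
    """Map a known query back to its topic cluster."""
    return next(t for t, qs in CLUSTERS.items() if query in qs)

def deniable_set(query, k=2):
    """Deterministically map the query's topic to a fixed group of k topics
    and return one query per topic (the user's topic contributes the real
    query). Any representative of the group yields the same set."""
    topics = sorted(CLUSTERS)
    i = topics.index(topic_of(query))
    group = topics[(i // k) * k : (i // k) * k + k]
    return sorted(query if t == topic_of(query) else CLUSTERS[t][0] for t in group)
```

Fixing the topic-to-group mapping is what makes the deniability symmetric: the server cannot distinguish which of the k topics originated the request.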
This dissertation investigates the problem of developing models and mechanisms for user authorization based on uncertain evidence and dynamic trust. The research directions are modeling dynamic trust, modeling uncertain evidence, and integrating them with role assignment for user authorization.

A computational trust model rooted in findings from social science is developed. Trusting belief, disposition to trust, and context are represented. Trusting belief in competence is distinguished from that in integrity. Trusting beliefs can be built based on direct experience, reputation, most common belief, and a priori belief. The building process is affected by a truster’s disposition to trust. This research concentrates on developing (1) competence reputation aggregation methods that eliminate the impact of subjectivity and (2) an integrity belief formation method that captures the behavior trend.

An evidence statement is designed to enable issuers or parties relying on evidence to express uncertain belief about the evidence. An opinion structure is used to measure the uncertainty. An evidence rewriting rule is developed based on the discounting operator proposed by Shafer. It is used to translate an issuer’s uncertain belief into the belief of the evidence relying party. Three inference methods are devised to address the opinion inference problem raised in user authorization.

A role assignment policy definition language is developed. It allows a policy maker to express the evidence and dynamic trust requirements for assigning a role. An evaluation semantics is defined to make role-assignment decisions based on the provided evidence set and policies, and a role assignment algorithm is developed on top of it. An authorization framework called TERA is developed. It integrates uncertain evidence, dynamic trust, and role assignment, and cooperates with RBAC-enhanced application servers for authorization in open environments. A prototype has been implemented.
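The discounting step can be sketched directly. Representing an opinion as a (belief, disbelief, uncertainty) triple, Shafer-style discounting scales the issuer's belief and disbelief masses by the relying party's trust in the issuer, with the removed mass flowing into uncertainty; the triple and trust value below are illustrative:

```python
def discount(opinion, trust):
    """Shafer-style discounting: scale an issuer's (belief, disbelief,
    uncertainty) masses by the relying party's trust in the issuer;
    the removed mass flows into uncertainty, keeping the triple summing to 1."""
    b, d, u = opinion
    return (trust * b, trust * d, 1 - trust * (b + d))

# An issuer 80% confident the evidence holds, relied on with trust 0.5.
b2, d2, u2 = discount((0.8, 0.1, 0.1), 0.5)
```

This is the translation the evidence rewriting rule performs: a half-trusted issuer's strong opinion becomes a weak opinion with high uncertainty at the relying party.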