CERIAS - Center for Education and Research in Information Assurance and Security


Panel 2: NSTIC, Trusted Identities and the Internet (Panel Summary)

Wednesday, April 3rd, 2013
Panel Members:
  • Cathy Tilton, VP Standards and Technology, Daon Solutions
  • Elisa Bertino, Professor, Computer Science and CERIAS Fellow, Purdue University
  • Stephen Elliot, Associate Professor, Technology Leadership & Innovation and CERIAS Fellow, Purdue University
  • Stuart S. Shapiro, Principal Information Privacy and Security Engineer, The MITRE Corporation
Moderator: Keith Watson, Information Assurance Research Engineer, CERIAS
Summary by Ruchith Fernando

Cathy Tilton was the first to present her views, and she opened with an introduction to NSTIC. She noted that the NSTIC strategy document, released in April 2011, is an outcome of the President’s cyber security review.

Daon’s objective is “Enhancing commercial participation cross sector in the identity ecosystem,” in collaboration with AARP, PayPal, the Purdue University IT Department, the American Association of Airport Executives, and a major bank.

The Daon pilot study consists of four components:
  • Technology component: This is based on a risk-based, multi-factor authentication solution that leverages mobile devices, called IdentityX. Based on the risk level of the transaction, the relying party dynamically invokes some combination of authentication methods.
  • Research component: Daon teamed with the Purdue Biometrics Lab to analyze data coming from operational pilots and evaluate the usability, accessibility, privacy, security, user acceptance, and performance of the solution in various environments.
  • Trust frameworks: A research effort attempting to identify what gaps exist in fitting the IdentityX solution into existing trust frameworks.
  • Operational pilot: There are five relying parties from different sectors, some with large and some with smaller subscriber bases, implementing different use cases.
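The risk-based step-up behavior described above can be sketched roughly as follows (a minimal illustration only; the policy table and method names are invented for this sketch, not IdentityX's actual API):

```python
# Hypothetical policy table mapping transaction risk to the combination
# of authentication methods the relying party would invoke.
RISK_POLICY = {
    "low":    ["pin"],                   # single factor suffices
    "medium": ["pin", "voice"],          # step up to two factors
    "high":   ["pin", "voice", "face"],  # strongest combination
}

def required_factors(risk_level):
    """Return the authentication methods to invoke for a transaction;
    unknown risk levels fail safe to the strongest combination."""
    return RISK_POLICY.get(risk_level, RISK_POLICY["high"])
```

The key design point is that the relying party, not the user, selects the factor combination per transaction, so low-risk interactions stay convenient while high-risk ones step up.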

Professor Stephen Elliot presented his work at the Purdue Biometrics Laboratory, which has focused on testing and evaluating various biometrics since 2001. The project takes a multi-faceted approach to testing, involving “in lab” testing, surveys, and “in the wild” testing.

In-lab testing places users in a controlled environment where they carry out controlled transactions. These tests are conducted on three different operating systems and evaluate interoperability by assessing whether users remember how to use the device and whether they can transfer that knowledge to another operating system. The sessions are recorded and run over four to six weeks.

“In the wild” testing attempts to mimic various real-life scenarios: test subjects are given a mobile device for a month. The focus groups include elderly, disabled, and able-bodied individuals, with about 10 to 15 participants in each group.

Professor Elisa Bertino was the next to present her views. She defined digital identity and introduced the concepts of strong and weak identifiers. Strong identifiers identify an individual uniquely; weak identifiers do not. Depending on the context, the same identifier may be strong or weak.

Security and interoperability are concerns: in most identity management systems, the user is redirected to an identity provider when authenticating with a relying party. This leads to privacy issues, since the identity provider learns information about the user’s transactions. Protocols developed in the VeryIDX project instead use an identity token issued to the user by the identity provider, which can then be used without further interaction with the identity provider. These protocols differ considerably from others, carrying different information and using different interaction models, so achieving interoperability with other protocols is a challenge.

Linkability: when a user carries out two transactions with two different relying parties, the relying parties may be able to use the information they collect to determine that they are interacting with the same user. The same applies when a user carries out two transactions with the same relying party.
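The linkability risk can be illustrated with a toy example (the records and attribute names below are invented): even when each relying party holds only weak identifiers, their combination can act as a quasi-identifier that links two transactions to one user.

```python
# Each relying party logs only weak identifiers with its transactions.
site_a = [{"zip": "47907", "birth": "1985-03-02", "sex": "F", "txn": "A-17"}]
site_b = [{"zip": "47907", "birth": "1985-03-02", "sex": "F", "txn": "B-03"}]

def linkable(rec_a, rec_b, keys=("zip", "birth", "sex")):
    """Two records are linkable if all their weak identifiers coincide."""
    return all(rec_a[k] == rec_b[k] for k in keys)

# Cross-join the two logs: any match links a user across relying parties.
links = [(a["txn"], b["txn"]) for a in site_a for b in site_b if linkable(a, b)]
```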

Stuart S. Shapiro expressed his views on two main issues.

NSTIC promotes selective attribute disclosure. Where an individual subscribes to an online newspaper, from the subscriber’s privacy perspective the service provider only needs to verify that he/she is a valid subscriber; no other identity information is required. But depending on the business model, the service provider may need certain demographic data about the individual to be able to target advertisements and charge for them. This is the issue of “functional minimums vs. business model minimums”: business models may require much more information than functional requirements cover, and NSTIC does not clearly address this issue.

Service providers are “not interested in individuals but are interested in categories”: this categorization may be either benign or harmful to individuals. Therefore, even with privacy-preserving techniques, if an individual can be categorized, critical identity information may still leak.

This concluded the panelists’ presentations, and the audience raised several questions:

Q: FIPS 201 seems to be very applicable to your study.

Cathy Tilton: With regard to FIPS 201, we are not doing anything with smart card credentials. We are, however, looking at the ICAM (Identity, Credential, and Access Management) certification related to trust framework providers and identity providers.

Q: I assume you are trying to certify the protection of data-bearing processes. Please explain how you do that on an inherently insecure mobile device without a trust anchor.

Cathy Tilton: We include a private key in the phone’s keychain. We consider the phone an untrusted device. We use the key to set up a mutually authenticated TLS session with the phone. Once this secure channel is established, it can be used to collect information on the phone and send it back to the server, where verification is performed.
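A rough sketch of this mutually authenticated TLS setup, using Python's standard `ssl` module (the file names are placeholders; a real deployment would provision the device key into the phone's keychain, not a file):

```python
import ssl

def server_context(ca_file=None):
    """Server side: require and verify a client certificate, so the
    channel is mutually authenticated rather than the usual one-way TLS."""
    ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.verify_mode = ssl.CERT_REQUIRED            # reject clients without a cert
    if ca_file:
        ctx.load_verify_locations(cafile=ca_file)  # CA that issued device certs
    # ctx.load_cert_chain("server.pem")            # server's own cert (placeholder path)
    return ctx

def client_context(certfile=None, keyfile=None):
    """Client (phone) side: present the device's certificate and private
    key -- the key provisioned into the phone's keychain -- to the server."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    if certfile:
        ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return ctx
```

The point of `CERT_REQUIRED` on the server side is that the handshake itself proves the client holds the provisioned private key, even though the device is otherwise untrusted.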

Q: What about jailbroken devices?

Cathy Tilton: The device is considered untrusted. When secure elements become commercially available on mobile devices, we will make use of them. Our approach has been a BYOD (bring your own device) model for usability and familiarity, so we have to work with the capabilities and security features of those devices.

Q: How does liability fit into NSTIC?

Cathy Tilton: In our situation we are really more of a credential provider, so it is a shared responsibility. Most of our relying parties already have all the identity information about their subscriber base. They do not share that information with us, and they do the identity proofing. What they are looking for from us is a strong credential. What becomes critical is the binding of the credential to the identity.

Prof. Bertino: The problem of liability is a much-debated question. In software engineering, who is liable for software mistakes? Identity management is very much the same: it is software, which needs to be secured to work properly.

Q: How does NSTIC accommodate potential issues such as forcing users under duress to authenticate sensitive transactions?

Cathy Tilton: In our solution we have provisions for a “duress PIN”, which the relying party handles according to its policy.

Stuart S. Shapiro: In certain contexts, duress is the status quo right now. In some cases, users lie about requested information to obtain the service and avoid providing real identity information. If certified attributes are required, there will be no option to lie; in such cases duress can increase rather than decrease.

Q: How do we achieve a tradeoff, given that we have to reveal certain information about ourselves while even generalized categories already reveal so much about us?

Prof. Bertino: Services should give customers the choice of revealing information and provide alternatives, such as paying for those services. Systems should be flexible enough to support such options. Sometimes categorization can be benign, but, for example, if a user falls into the wrong passenger profile he/she might have trouble getting through airport security. Even a very private user who simply possesses a certain feature in a population might be automatically classified. I think this is a separate problem, and I don’t think NSTIC has to solve it.

Cathy Tilton: NSTIC supports both pseudonymous and anonymous credentials, since there are many transactions that do not require any more information.

Q: How is the biometric data stored on the server? Is there anything equivalent to secure password hashes for biometric data?

Cathy Tilton: All biometric data is protected using the mechanisms normally used to protect data at rest, such as encryption and audited access behind a firewall. If the cryptographic mechanisms fail, certain biometrics can leak information, but biometrics are not stored alongside identity information. When acting as a credential provider, we have no identifying information associated with the biometrics.
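The separation Tilton describes, with biometric templates held apart from identifying information, can be sketched as follows (a simplified illustration; the function, store names, and data are invented):

```python
import secrets

def enroll(credential_db, relying_party_db, identity_record, encrypted_template):
    """Enrollment sketch: the credential provider stores only the already
    encrypted biometric template under a random credential ID, while the
    relying party keeps the identity record. The credential provider's
    store alone cannot link a biometric template to a person."""
    cred_id = secrets.token_hex(16)              # opaque handle, no identity content
    credential_db[cred_id] = encrypted_template  # credential provider side
    relying_party_db[cred_id] = identity_record  # relying party side
    return cred_id
```

Because the shared key is a random token, compromising the credential provider's store yields encrypted templates with no associated identities.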

Q: What methods are available to verify the liveness of a subject?

Cathy Tilton: Our liveness support includes using photographs of a face from different angles and a challenge-response mechanism with random, longer phrases for voice authentication. We are working on using video as well.

Q: What are interesting open research problems?

Prof. Elliot: There are many open research problems in areas such as mobile usability testing.

Prof. Bertino: There’s a lot of work to be done on anonymity techniques for digital identity and on linkability analysis.

Stuart S. Shapiro: More sophisticated privacy risk modeling techniques are needed, as are techniques for integrating privacy in an engineering sense.
