Howard Schmidt, Special Assistant to the President and Senior Director for Cyber Security, Office of the U.S. President
Morning Keynote Address, April 4, 2012.
Summary by Keith Watson
In the introduction, Professor Spafford mentioned many of the roles that Howard Schmidt has had over his many years in the field. He specifically highlighted Mr. Schmidt’s service to the nation.
He also indicated that things in information security are not necessarily better since Howard last attended the CERIAS Symposium in 2004, but that was not Howard’s fault.
Howard Schmidt began his keynote address by thanking the staff and faculty associated with CERIAS for their efforts. Mr. Schmidt disagreed with Spafford’s opening comment that things are not better since his last visit. “The system works,” he said. It is fraught with issues that we have to manage. Mr. Schmidt noted that there are many things we can do online today that we could not do twenty years ago. We have bigger threats and more vulnerabilities due to increased accessibility, but the system works. We have to make it work better.
In 2008 when then Senator Obama visited Purdue, he talked about emerging technologies and cybersecurity. He stated, “Every American depends — directly or indirectly — on our system of information networks. They are increasingly the backbone of our economy and our infrastructure; our national security and our personal well-being.” We take technology infrastructure for granted, and we must ensure that it continues to be available.
One of the issues discussed in government today is reducing the likelihood that new generations of victims are created. We need cybercrime prevention; then law enforcement agencies have a better opportunity to scale up and deal with the issue. Currently, law enforcement can only focus on the most egregious crimes. The FBI is moving cyber crime up on its priority list and is looking at cyber crime internationally.
An estimated $8 trillion was exchanged over wired and wireless networks last year. Online shopping increased even in a down economy.
The President has promised to make cyber infrastructure a strategic national asset. He has called on all of us to look ahead and design and build a stronger infrastructure.
Howard related a story about writing code for a TI-99/4A to aim his antenna for Earth-Moon-Earth (EME) communications in his ham radio hobby. He sat down with expert developers to talk about buffer overrun issues. The question the developers had was, “Why would anyone do that?” Because they can.
The President created the Office of the Cybersecurity Coordinator in a unique way. The Office is part of both the National Security Council and the National Economic Council. Mr. Schmidt has two roles: addressing security issues and ensuring that the system remains open. If specific expertise is needed from other government agencies, those experts can be brought in to assist. Setting strategy and policy is a major effort of the Office. It is also responsible for execution.
The FBI Director has identified the primary and high-level actors in the cyber world:
Foreign intelligence services. They are no longer breaking into buildings and doing surveillance. We have to protect our cyber infrastructure from them.
Terrorist groups. They are interested in critical infrastructure and how to attack it.
Organized crime. They see cyberspace as a business opportunity. Some hacker groups are loosely organized but working together to disrupt the infrastructure.
Mr. Schmidt outlined several programs and initiatives of his office:
Question: What is your vision for Continuous Monitoring?
Answer: It is possible to be FISMA-compliant and still insecure. The creation of the reports required by the law takes time and effort away from actually protecting the infrastructure. The goal now is to use continuous monitoring to deal with issues in real time.
Question: What are the challenges in getting service providers to allow third-party identifiers?
Answer: We hope that there are multiple drivers for federated IDs. One is a market driver for business: companies can reduce costs and lower risks by accepting trusted identifiers. We hope that innovators address some of the technical challenges. Finally, as consumers, we have to demand better IDs.
Question: Are we at the point where we need to create a new agency responsible for cybersecurity?
Answer: No. It is not necessary. What we need is coordination, not another branch of government. The Office of Cyber Coordinator is the right model to coordinate activities across government.
Summary by Christine Task.
The fireside chat was an open discussion among several important persons with very interesting positions in the security world. The conversation covered a broad range of topics, as each participant contributed their unique insight and perspective. The summary below will collect just the main points for easy review.
Present were (in seating order):
Dr. J.R. Rao, Manager of the Internet Security Group at IBM Research (abbreviated below as IBM)
Howard A. Schmidt, Cybersecurity Coordinator of the Obama Administration, Office of the U.S. President (abbreviated below as GOV)
Dr. Eugene Spafford, Executive Director of Purdue CERIAS (abbreviated below as SPAF)
Sam Curry, Chief Technology Officer of the Identity and Data Protection business unit and Chief Technologist for RSA, The Security Division of EMC (abbreviated below as RSA)
The first question addressed was: Why do commercial products still fail to adopt basic security practices (such as separation of privilege, limited connectivity, and minimization of function), even though their importance and efficacy have been well understood for decades?
RSA: Product designers aren’t security experts; security is usually added as an afterthought and considered an interruption to progress. Although there’s some market pressure for more secure products, there is incredible pressure to be the first to release a new product, and the long-term outlook gets forgotten. If contracts included penalties for developers who made obviously vulnerable products or did not properly integrate basic security measures into their products, the balance might be better.
IBM: Security is definitely an afterthought in most product design. On the other end of the scale, though, high-assurance ‘ivory tower’ systems exist, but are incredibly expensive to build. One aspect of convincing commercial interests to integrate security policies into their development is finding a good balance among what is effective, efficient, and economically feasible. Currently, companies with web-facing applications that are concerned about security often use off-the-shelf products to perform source-code scans. Unfortunately, these aren’t as helpful as they might be, even as afterthoughts. They often produce a flood of output, with little to indicate which faults are actually important, and as a result much of their advice may be disregarded.
SPAF: Some fixes are obvious and simple, like languages which prevent buffer overflows. Why aren’t they in use? The vast majority of people don’t make use of the explosion of features in their gadgets: why don’t product developers practice minimization of features? The problem is that there is basically no liability for security flaws. Potentially, we need to consider penalties for software companies whose security performance is extremely negligent.
GOV: Companies aren’t completely unaware of security concerns; delegation of privileges is much more widespread than it used to be. The difficulty may be that companies don’t understand which security policies are applicable to their products (“it’s secure, it has a password!”). Customers need to demand secure products, or else there’s no market pressure for companies to improve their records. A concern about government regulations, managing security from the top down, is that introducing lawyers limits innovation, and we can’t afford to have an economic disadvantage in the global economy. However, the “Power of Procurement” is a very valuable tool. The government penalizes its contractors/suppliers for obvious security flaws in the products they provide, and this forces higher standards to be adopted within those companies, which helps the standards spread out into the technology ecosystem. There has been visible progress in the past decade.
Next, Spafford asked about the possible worst-case consequences of our slow adoption of good security practices: Is a catastrophic event, a “cyber-security Pearl Harbor”, possible?
RSA: Every new technology brings concerns like this, and generally we prepare and the threat doesn’t come to pass. Of more concern are less glamorous, slower threats, which we are not defending against: like the involvement of organized crime in technical spheres.
GOV: We actually have been developing tools for a long time, within the DOD, to protect against catastrophic attacks, and we’re working on making those tools available for law enforcement and civilians now as well. What’s more difficult is protecting against these more long-term, subtle threats. Law enforcement has been trained to do computer forensics on localized, physical computers. How do they adapt when an intrusion investigation can easily become a global affair?
IBM: One of these subtle threats is intellectual property loss. It doesn’t take much to remove a company’s competitive edge, and that loss can eventually destroy the company. The FBI has been helpful in tracking IP threats throughout the world, but there are clearly still problems. Commercial tech developers are extremely worried about the security measures which protect their IP, and this may be a good vector for encouraging them to adopt better security practices generally.
This was followed by a slightly more personal question from Dr. Spafford, “What keeps Dr. Rao (IBM) up at night?”
IBM: Intellectual property loss; existing products aren’t sufficient protection. How quickly can an effective approach be developed and adopted?
GOV: A similar issue: the government was able to greatly reduce global money-laundering problems through diplomacy with other countries that were blindly enabling it for their own personal or national benefit. We’re hoping to form a similar global coalition to reduce IP theft: an agreement such that if someone steals a product you’ve invested deeply in developing, and pushes their version out the door before you, there will be sanctions, and there won’t be a market for the pirated product. Also, note that although CEOs of companies may be concerned about IP protection, the structure of companies often leaves no one actually in charge of managing it: auditors are concerned with the financial books rather than security.
RSA: In fact, the CFOs and audit committees have their own language, and aren’t likely to learn a separate language for security. For example, the word “risk” means very different things to the two groups. If security professionals want to be successful, they need to learn to speak business language; they can’t allow themselves to be separated into a pool of technology talent and kept away from the overall workings of the company.
This prompted the general question: How does a company or a government manage security concerns in a multi-national environment?
GOV: We work diplomatically with other countries on our common cyber-security issues, and our common desire to safely support multi-national companies that have concerns about IP protection.
IBM: We sell defensive products in 176 countries, never products to be used for offensive purposes. We never align with any government against any other.
RSA: We’re in an interesting situation as a multi-national company: we work with many, many different governments and thus have personnel with security clearances in a variety of countries. We use a “pools of trust” system to make certain sensitive information stays segregated within the company.
The speakers then responded to three questions which had been previously submitted by audience members:
How do we deal with the fact that the critical infrastructure we need to protect is often owned by a variety of small regional businesses?
GOV: Again, the power of procurement allows the government to help encourage high standards of security for the products which these smaller companies use.
IBM: The national labs and IBM have worked together with regional utilities to roll out an extremely secure, well-designed smart grid system. This is another way in which private-public partnerships can improve security generally.
SPAF: However, the government can’t cover every small utility. Really effective new security is often prohibitively expensive for these small businesses. We need to find ways for them to break needed improvements into a sequence of small, gradual changes and amortize the costs over time.
RSA: Even large utilities have very small IT departments, and often a large age and cultural gap between the old staff and the new tech experts. The two groups don’t communicate well, and incredibly valuable knowledge is being lost as people retire. This endangers the security of the entire system. Is there any way we can change the model/organization of these institutions to prevent this?
Will users, rather than the corporations they deal with, ever have direct control over their own privacy?
GOV: This is very important, and it needs to happen sooner rather than later. Unfortunately, we’ve already gone a long way down the wrong path, and it may be very difficult to get back.
Nine years ago, Dr. Spafford collaborated on a list of the [Grand Challenges for Cyber-security](https://www.cerias.purdue.edu/assets/pdf/bibtex_archive/01264859.pdf). What progress has been made?
SPAF: Progress has been made against epidemic attacks, such as flash worms. Now we’re dealing with slower penetration by bot-nets, and we’re getting better at fighting those as well. There is considerable work left to be done, in general, though.
IBM: There is industry inertia, but active work is being done on these.
RSA: These are very useful rallying points, things we should continue to work on. Curry once got a question from a German reporter at an RSA conference, “When will we solve this security thing?” It was his favorite question ever: it’s all, always, a work in progress. Right now, it’s very important that existing security is made effortless for users, so it’s commonly adopted.
GOV: We actually have a hard time comparing the costs and prosecution rates of these cyber-attacks to the costs of physical attacks, such as burglaries. Only 3% of cyber-attacks were prosecuted (in a recent year), but what percentage of burglaries are prosecuted? What’s the relative cost? In general, we need to educate people about simple ways of defending themselves.
SPAF: To achieve widespread adoption, security needs to be made effortless and economic. We can’t hope to succeed by telling people what “not” to do. We need to build security into products, so there’s no choice necessary: so users aren’t even aware it’s there.
Tuesday, April 3, 2012
Panel summary by Robert Winkworth.
The panel was moderated by Keith Watson, CERIAS, Purdue University.
In light of its unprecedented growth, wireless mobile communications remains a major focus of security research. The stated purpose of this panel was to address the challenges in securing data and processing, limiting communication to designated parties, protecting sensitive data from loss of device, and handling new classes of malware.
Professor Bagchi opens the discussion with these key points and predictions:
MITRE’s David Keppler joins the discussion with these thoughts:
CACI’s Jeremy Rasmussen contributes:
The audience submits questions:
Attendant: “What will it take to make mobiles as secure as desktops?”
David: “I would argue that the vulnerabilities of a handheld are actually no worse than those of a laptop. A proper risk assessment should be done for each. Expect that exploits will always be possible, but invest for them accordingly.”
Saurabh: “Protocols and architecture need to be standardized. This will be helpful to developers. And we need openness in standards.”
Attendant: “Does it seem inevitable that Android will allow lower-level access to the hardware in the future?”
Jeremy: “Yes, and that can benefit the user, who really should unlock the device and install a personalized solution. We must have root access to the phone to get better security. An app cannot protect the user from system abuses that occur at a lower level than the app.”
David: “I agree. What we must do is break the current security in order to rebuild it in a more robust way. There are also some underlying market issues at work here. Commercial products are unfortunately vendor-specific, but need to be standardized. How can this happen where there is DRM?”
Attendant: “What are the key differences in user experience between desktop and mobile?”
Saurabh: “Energy consumption, bandwidth, and limitations in the user interface.”
David: “Users trust mobiles MORE rather than less than their desktops. They have not grasped the magnitude of the mobile threat.”
Keith: “What advice would you have for CSO/CIO as they face these threats?”
Saurabh: “CSOs and CIOs don’t ask me for advice! [laughter] What I would recommend, though, is strong isolation between applications, and a means to certify them before loading.”
David: “There are some utilities available that employers can have users run if they’re going to be on a private network. Some risk is inevitable, though. There is no perfect solution.”
Jeremy: “Yes—NAC (Network Access Control) used to be required for user devices if they’d be allowed on a corporate network. We need that for mobiles, but I don’t see how it’s possible; it can be circumvented so easily.”
Tuesday, April 3, 2012
Panel Summary by Nabeel Mohamed
The panel was moderated by Joel Rasmus, CERIAS, Purdue University.
A quick review on Big Data:
Big Data represents a new era in data analysis in which the volume of data to analyze is so large that it cannot be handled by traditional database technologies and algorithms. The size of the data sets that need to be collected, stored, shared, analyzed, and/or visualized continues to grow, as information is produced at an unprecedented rate by ubiquitous mobile devices, RFID technologies, sensor networks, web logs, surveillance records, search queries, social networks, and so on. Increasing volume is only one challenge of big data; there are others. In fact, Gartner analyst Doug Laney defined the big data challenges and opportunities as the “3 Vs”:
Volume - refers to the increasing volume of data, as mentioned above.
Velocity - refers to the time constraints on collecting, processing, and using the data. A traditional algorithm that can process a small data set quickly may take days to process a large one. However, when there is a real-time need, as in national security, surveillance, and health care, taking days is no longer good enough.
Variety - refers to the increasing array of data types that need to be handled. It includes all kinds of structured and unstructured data: audio, video, images, transaction logs, web logs, web pages, emails, text messages, and so on.
First, each of the panelists gave their perspective and their experience with big data analytics.
William S. Cleveland, Shanti S. Gupta Professor of Statistics, Purdue University, described the challenges and experience of his research group in handling large volumes of data, and presented their divide and recombine (D&R) approach, which parallelizes processing by dividing the data into small subsets and applying traditional numeric and visualization algorithms to each subset. They exploit the parallelism exhibited by the data itself. Cleveland described their tool RHIPE, built on this concept and available to the public at www.rhipe.org. RHIPE is a merger of R, a free statistical analysis package, and Apache Hadoop, an open-source MapReduce framework.
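The D&R idea described above can be sketched in a few lines. The sketch below is illustrative only and is not RHIPE (which couples R with Hadoop): it divides a data set into subsets, applies a conventional statistic to each subset independently (the step that parallelizes in practice), and recombines the per-subset results into a single estimate. The function names are invented for this example.

```python
# Minimal divide-and-recombine (D&R) sketch: divide, analyze per subset,
# recombine. In a real D&R system the per-subset analyses would run in
# parallel across a cluster (e.g., as Hadoop map tasks).

from statistics import mean

def divide(data, subset_size):
    """Split the data into contiguous subsets of roughly equal size."""
    return [data[i:i + subset_size] for i in range(0, len(data), subset_size)]

def analyze(subset):
    """Apply a traditional method to one subset; here, the sample mean."""
    return mean(subset)

def recombine(results, subsets):
    """Recombine per-subset results, weighting by subset size."""
    total = sum(len(s) for s in subsets)
    return sum(r * len(s) for r, s in zip(results, subsets)) / total

data = list(range(1, 101))               # a small stand-in for a large data set
subsets = divide(data, 10)
results = [analyze(s) for s in subsets]  # embarrassingly parallel in practice
estimate = recombine(results, subsets)
print(estimate)                          # 50.5, the mean of 1..100
```

For a statistic like the mean, the recombined result is exact; for more complex fits, D&R typically yields an approximation whose accuracy depends on how the data are divided.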
Marc Brooks, Lead Information Security Researcher, MITRE Corporation, focused mainly on anomaly detection in large data sets. He raised the question of how one can detect anomalies without sufficient test data sets, which are, in his opinion, expensive to create. For this reason, Brooks sees a trend away from supervised learning toward unsupervised learning such as clustering. Most big data sources provide large amounts of unstructured data; we know how to handle structured data well because we already have a schema for it. He asked what the effective ways of handling unstructured data are, and argued that there should be a fundamental change in the way we model such data. He also touched on what it takes to be a data scientist, an increasingly attractive career path these days; he believes the skill set is a mixture of software engineering, statistics, and distributed systems.
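The unsupervised direction Brooks described can be illustrated with a minimal sketch: with no labeled training data, flag points that lie unusually far from the bulk of the data. The feature (bytes per session) and the threshold below are assumptions made for illustration, not anything Brooks specified.

```python
# Minimal unsupervised anomaly detection: score each value by its distance
# from the median, scaled by the median absolute deviation (MAD). The
# median/MAD score is used instead of mean/stdev because it stays robust
# in the presence of the very outliers we are trying to find.

from statistics import median

def anomalies(values, threshold=3.5):
    """Return values whose modified z-score exceeds the threshold."""
    med = median(values)
    mad = median(abs(v - med) for v in values)  # median absolute deviation
    return [v for v in values if abs(v - med) / (1.4826 * mad) > threshold]

# e.g., bytes transferred per user session; one session is wildly out of line
sessions = [120, 130, 125, 118, 122, 127, 121, 5000]
print(anomalies(sessions))  # [5000]
```

Real systems cluster many features at once rather than thresholding one, but the principle is the same: no labels, only a model of what “normal” looks like.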
Jamie Van Randwyk, Technical R&D Manager, Sandia National Laboratories, started off with the relativity of the term “big data”: for different organizations, big data means different sizes and complexities, especially regarding the volume of data that qualifies as big. Van Randwyk mentioned that while commercial entities such as Amazon, Microsoft, and Rackspace handle the big data needs of industry, Sandia mainly focuses on US government agencies. He raised the point that we use Hadoop and other technologies to perform analytics and visualization on large volumes of data, yet we still don’t know how to secure the data in these big data environments. Van Randwyk and his team deal mainly with cyber data, which is mostly unstructured, and he pointed out the challenge of analyzing large volumes of unstructured data given the lack of a schema.
Alok R. Chaturvedi, Professor, Krannert Graduate School of Management, Purdue University, began with the idea that one has to collect as much information as possible from multiple sources and make the actual information stand out. Chaturvedi briefly explained his group’s big data analytics work involving real-time monitoring of multiple markets and multiple assets. A challenge in doing so in the real world is that data is often inconsistent and fragmented. They build behavioral models based on data feeds from sensors, news feeds, surveys, and political, economic, and social channels, and use these models to perform macro market assessments by region in order to spot investment opportunities. Chaturvedi thinks that big data analytics will continue to play a key role in such analysis.
After the initial perspective short talks by each panelist, the floor was open to the questions from the audience.
Q: Is behavioral modelling effective? What are the challenges involved?
A: Panelists identified two ways in which behavior can change: adversarial activities, and perturbation of the data or the business itself. It is important to understand these two aspects and build behavioral models accordingly. Also, if a behavioral model does not keep up with the changes, it will be less effective at identifying the behaviors one is looking for. Some of the challenges involved are deciding what metrics to use, defining those metrics, understanding the context (data perturbation vs. malicious activity), and keeping the model updated. It is also important to attribute the correct causality to an event. For example, 9/11 was due to a security failure, not anything else.
Q: Do you need to have some expertise in the field in order to better utilize big data technologies to identify anomalies?
A: Yes. Big data analytics will point to some red flags, but you need to be knowledgeable in the subject matter in order to dig deeper and get more information.
Q: Is it practical to do host based modeling using big data technologies?
A: Yes, you have to restrict your domain of monitoring. For example, it may not be practical to do host based monitoring for the whole Purdue network.
Q: How do you do packet level monitoring if the data is encrypted?
A: Cleveland is of the view that one cannot do effective packet level monitoring if the data is encrypted. In their work, they assume that the packets are transmitted in cleartext.
Q: To what extent is intelligent response being worked out? Can you do it without the intervention of humans?
A: Even with big data analytics, there will be false positives. Therefore, we still need a human in the loop in order to pinpoint incidents accurately. These people should have backgrounds in computer security, algorithms, analysis, etc.
A challenge with current big data technologies like Hadoop is that near real-time analysis is still difficult.
Q: (panel to audience) What are your big data problems?
A: (An audience member) Our problem is scalability. There is nothing off the shelf that we can buy to meet our needs. We have to put a lot of effort into building these systems by putting various components together. Instead of spending time defending against attacks, we have to spend a lot of time on operational tasks.
Q: Is it better to have a new framework for big data for scientific data?
A: It is not the science per se that you have to look at; you have to look at the complexity and size of the data in order to decide. From an operational perspective, a definition/framework may not be important, but from a marketing perspective, it may be important. For example, defining the size of the data set could be potentially useful.
Q: We want to manage EHRs (electronic health records) for 60m people. Can these people be re-identified using big data technologies?
A: Even EHR data conforming to safe harbor rules, where 18-19 identifying elements are removed, may be re-identified. Safe harbor rules are not sufficient, nor are they necessary: they will protect most people, but not all, and you can protect data even without safe harbor. This is a very challenging problem, and CERIAS has an ongoing research project on it.
Q: Have you seen adversaries intentionally trying to manipulate big data so that they go undetected? Specifically have you seen adversaries that damage the system slowly to stay below the threshold level of detection and that damage very fast to overwhelm the system?
A: We have seen that adversaries understand your protocols, whether your packets are encrypted or not, and so on, so that they can behave like legitimate users. I have heard anecdotal stories of data manipulation in banks and other financial institutions, but can’t point to any specific incident.
Q: Oftentimes we have to reduce the scope when many parameters are to be analyzed, due to the sheer volume of data. How do you ensure that you still detect an anomaly (no false negatives)?
A: You have to analyze all the data otherwise it may result in false negatives.
Tuesday, April 3, 2012
Panel Summary by Matt Levendoski
The panel was moderated by Charles Killien, Computer Science, Purdue University.
Dr. Hal Aldridge, Director of Engineering at Sypris Electronics, opened today’s first panel on the currently popular topic of SCADA security. Dr. Aldridge first presented his current research interest: defining who takes true ownership of, and responsibility for, the security of our nation’s backbone infrastructure, our SCADA and control systems. An interesting question he posed was: what if the responsible party doesn’t have a well-defined background in the security realm?
Dr. Aldridge further delved into smart grids and the fact that they are everywhere. He discussed how scary it is to consider how much code is used to run the control systems of an automobile; in some respects, cars contain more code than a variety of our current fighter jets. He also teased about the concept of an Internet-based coffee maker. All concepts aside, these systems have their downsides, which appear in the form of security problems. Dr. Aldridge closed by saying that he greatly appreciates the interdisciplinary stance of CERIAS and how it allows for great innovation in industry and current academic research.
William Atkins, a Senior Member of Technical Staff at Sandia National Laboratories, followed with his perspective on the difference between SCADA and control systems. He focuses specifically on general computing systems security, and he introduced the term ‘cyber physical systems’. He described the recent trend calling for these systems to be interoperable, because customers don’t want to be locked into a single vendor for their solutions. He further stressed that this topic is vague and largely unknown, which has generated a lot of media attention, most notably around topics like the Stuxnet worm.
William further addressed the current trends of security as they relate to control systems. These systems are changing from a less manual or analog approach to a more automated and digital methodology. We want our systems to do more yet require less. This trend tends to bring about unforeseen consequences, especially when these systems hit an unknown state of inoperability. Additionally, all the hypothetical attacks being posed to the public are actually becoming a reality. Attackers now have the capability to purchase or acquire the hardware online via surplus sales, eBay, or the like.
William closed with his perspective on SCADA security and how the odds are asymmetrically stacked in favor of the offense versus the defense. Essentially, security tends to get in the way of security. The Stuxnet worm is a great example in that it exploited vulnerabilities in the access level of anti-virus software, which allowed for a lower-level approach to the attack.
Jason Holcomb is a Senior Security Consultant in Energy and Cyber Services at Lockheed Martin. He opened his panel discussion with an interesting story about how he got involved with SCADA security: he inadvertently introduced a denial-of-service condition in a SCADA system he was working on, which he then had to remediate.
Jason presented Lockheed’s current approach to the security threats within SCADA systems. Their current research and solutions look to bring some of the advantage back to the defense, a great contrast to the perspective that William Atkins presented earlier. Jason then introduced Lockheed’s Cyber Kill Chain.
Steven Parker is the Vice President of Technology Research and Projects with the Energy Sector Security Consortium. Steven stated that when it comes to control systems and SCADA, we don’t necessarily need to solve the hard problems; we should focus more on easy solutions. He then compared the security industry to the diet industry: the diet industry has dietitians, and we have CISSPs; they have nutritional labeling, and we have software assurance; everyone wants a no-effort weight-loss program, just as security wants an easy solution for everything; and lastly, the diet industry has a surgical procedure called gastric bypass, where the security industry has regulations and compliance. He closed with the notion that many of the challenges aren’t necessarily technical; they include economic strategies, human interactions, public policy, and legal issues.
Lefteri Tsoukalas is a Professor of Nuclear Engineering at Purdue University. Prof. Tsoukalas jumped right in with the statement that the energy markets are currently undergoing a phase transition: demand is no longer dampened by high prices, as resources have moved from abundance to scarcity. This is why energy allocation is key: we need to use our resources when energy prices are lower rather than during peak-cost timeframes. Prof. Tsoukalas also suggested that we take the same perspective as Europe and look into alternative resources. At this point in time we aren’t sitting as comfortably on our supply of energy resources as we were, say, 100 years ago.
Question 1: There is a lot of research in SCADA/Control Systems. How do we adapt our research to be more applicable to Control Systems?
Question 2: How do we get a handle on global regulations?
Question 3: What skills do students and staff need to be effective in this area?
Question 4: What type of attacks have you actually experienced?
Further discussion was taken from the following perspective: