Tuesday, April 5, 2011
Panel Members:
Panel Summary by Pratik Savla
Edward Talbot initiated the discussion by presenting his viewpoint on cyber security, describing himself as a seasoned practitioner in the field. His central concern is that systems have become too complicated to provide any assurance that they are free of vulnerabilities. The problem is asymmetric: an intruder may need only one door to penetrate a system, while the person managing the system must guard a large number of different doors. Any digital system can be hacked, and any digital system that can be hacked will be hacked if there is sufficient value in doing so. To frame the problem, Talbot used a fire-fighting analogy going back two centuries, when, on average, a U.S. city was completely gutted and destroyed by fire every five years. If the firefighters of that era were asked about their immediate needs, they would say more buckets were required; but if they were asked how to prevent such fires from happening again, they had no answer. Talbot placed the cyber security concern into three corresponding time frames: near term, mid term, and long term. The near term involves what to do today to contain the problem; the mid term emphasizes the importance of getting ahead of the game; and the long term involves the role of science, which in the fire analogy meant the development of a fire science program in academia. To summarize, he pointed out that the thinking that gets one into a problem is insufficient to get one out of it.
Talbot quoted a finding from the JASON report on the science of cyber security, which stated that the highest priority should be assigned to establishing research protocols that enable reproducible experiments; the point being that there is a science of cyber security. He concluded by comparing the current situation to the first step of a 12-step program (borrowing from Alcoholics Anonymous): admit that the situation is unmanageable, stop trying to manage it, and instead develop a basis for rethinking what one does.
Rogers focused on the question: do we have scientifically based foundations that can help answer some of these questions through research, and are we going in the right direction? This led to a more fundamental question: how do we define a scientific foundation, and what defines science? He highlighted axioms and principles common across disciplines: a body of knowledge, testable hypotheses, rigorous design and testing protocols and procedures, metrics and measurements, unbiased results and interpretation, informed conclusions, repeatability, and feedback into theory. The problems one encounters in cyber security include the absence of natural laws, man-made technologies in constant flux, differing research paradigms (observational, experimental, and philosophical), the lack of a common language, the limited reliability and reproducibility of metrics, differences in approach (applied versus basic), and the tendency to study symptoms rather than causes. Cyber security is informed by many disciplines, including physics, epidemiology, computer science, engineering, immunology, anthropology, economics, and the behavioral sciences.
The JASON report on the science of cyber security proposed strategies in areas such as modeling and simulation, where biological, decisional, inferential, medical, and behavioral models could all be drawn upon in building a scientific foundation. Rogers emphasized that cyber security problems lend themselves to a science-based approach, but stressed that there will be a scientific foundation for cyber security only if the work is done correctly and only if one is conscious of what constitutes a scientific foundation. Even just-in-time, near-term, and long-term solutions can rest on a scientific foundation.
He pointed out that currently the biggest focus is on predicting behavior: in other words, how do we predict what will happen 20 years from now if employee 'X' is hired today?
Shannon addressed the question: how do we apply the scientific method? He presented the software engineering process and discussed its components in terms of the issues each one addresses. First, what data do we have? What do we know? What can we rely on? What can we stand on that is reasonably solid? Second, why do we have data that is prone to exploitation? He listed candidate explanations such as a lack of technology, a lack of mature technology, a lack of education, and a lack of capacity, and concluded that these hypotheses do not stand the test of the data, since the data indicate that we have always had these problems. He then offered alternative hypotheses to consider, such as market forces, people, and networks. He stressed that solutions must be based on what people and systems actually do, not on what we wish they would do. The stumbling block here is the orthodoxy of cyber security: the illusion that simply telling people to do the right thing and to use the right technology will solve the problem. It is analogous to an alchemist claiming that merely telling lead to turn into gold will make it gold. We need to understand what is going on and what is really possible. The key message was that a science built on data involves much more than just theory.
Raskin took a more general view of cyber science by offering some of his thoughts on the subject. He said he did not agree with the “American” definition of science, which limits it to a small sub-list of disciplines where experiments can be run and immediate verification is possible; he considered that definition too narrow. He instead subscribed to the notion that any well-defined academic discipline is a science. He presented a schematic of the theory-building process, with components such as the phenomena (the purview of the theory), the theory itself, the methodology, and the description, a general philosophical term for the results. The theory is connected to the methodology, and a good theory indicates why and how it can guide the methodology. He asked why we are not questioning what we are doing. His first thought related to data provenance: why are you doing what you are doing? His second thought focused on how we deal with the different sciences that are all part of cyber science. A mechanism that can help address this is rigorous application: he disagreed with the notion that combining two fields without any import or export of sub-components leads to a worthy result. From the source field, components such as data, theory, and methods should be imported into the target field; only the problems of the source field should be excluded from the import. He further emphasized linking the source and target fields through a common application. He concluded that without a theory, one does not know what one is doing or why one is doing it. This does not imply that no theory exists; on the contrary, anything that is performed has an underlying theory, even if one has no clue what that theory is.
A question about complexity theory brought up an example of a bad scientific approach, in which the researcher adds more layers of complexity or keeps changing the research question but never questions the underlying theory, which may itself be flawed.
Tuesday, April 5, 2011
Panel Members:
Panel Summary by Nikhita Dulluri
In the first session of the CERIAS symposium, the theme of ‘Traitor Tracing and Data Provenance’ was discussed. The panelists spoke extensively about the various aspects of tracing the source of a given piece of data and of managing provenance data. The following is a summary of the panel's discussion.
With increasing amounts of data being shared among organizations such as health care centers, academic institutions, financial firms, and government agencies, there is a need to ensure the integrity of data so that decisions based on it are sound. Securing the data at hand does not suffice; it is also necessary to evaluate the source of the data for its trustworthiness. Issues such as which protection method was used, how the data was protected, and whether it was vulnerable to attack in transit may all influence how a user treats the data. It is also necessary to keep track of different types of data, which may be spread across various domains. The context of data usage, that is, why a user might want to access a particular piece of data and the intent behind that access, is likewise important information to track.
Establishing the provenance of data is important for evaluating its trustworthiness, but doing so may in turn pose a risk to privacy. In some systems, it may be important to hide the source of information in order to protect it. Moreover, data transfer does not necessarily happen on a file-to-file basis; the data may have been paraphrased. Data that has a particular meaning in one domain may mean something entirely different in another, and people may also give data away unintentionally. The question, then, is how to trace back to the original source of information. One suggested approach was to pay attention to the actual communication, to move beyond the regions where we are comfortable, and to put a human perspective on the data, for that is how we communicate.
Scale is one of the major issues in designing systems for data provenance. The problem can be solved effectively for a single system, but the more one tries to scale a solution up, the less effective it becomes. Deciding how much provenance is required is also not an easy question to answer, as one cannot assume to know in advance how much data a user will require; providing the same amount of information as in a previous transaction may yield more (or less) than what is actually needed.
On the question of how to set and regulate policies for data access, it is important to monitor rather than control access. Policies imposed at a higher level work well when there is a reasonable expectation that people will act according to them. It is also important not to be completely open about what information is tracked or monitored: a determined attacker could use that knowledge to find a way around the monitoring.
The issue of data provenance, and of building systems to manage it, is important in several different fields. In domains where conclusions are drawn from a set of data and any alteration of the data would change the decisions made, provenance is of critical importance; the Department of Defense and the military, health care institutions, finance, and control systems are some examples.
To conclude, the problem of data provenance, and of building systems to manage it, is not specific to one domain or one type of data. If the problem can be solved effectively in one domain, the solution can be extended and adapted to provide solutions in other domains as well.
Tuesday, April 5, 2011
Keynote Summary by Mark Lohrum
Neal Ziring, the current technical director for the Information Assurance Directorate at the NSA, was given the honor of delivering the opening keynote of the 2011 CERIAS Symposium on April 5th at Purdue University. He discussed trends in cyber threats from the 1980s to today and the shifts in defenses made in response to those threats. He noted that, as a society, we have built a great information network, but unless we can trust it and defend it against threats, we will not see its full potential. As an NSA representative, Ziring spoke primarily from the perspective of preserving national interests in information security.
Ziring discussed trends in threats to information security. In the 1980s, the scope of cyber threats was rather simple: opposing nations wished to obtain information from servers belonging to the U.S., and the NSA wished to stop them. Since then, threats have become far more complex. The opponents are no longer just opposing countries; they may be organized criminals, rogue hackers, hacktivists, and others. In years past, considerable expertise was required to mount an attack; now far less is needed, which results in more threat actors. Attacks also used to be unfocused: someone would write a virus and see how many computers in a network it could infect, almost as if it were a competition. Now attacks are far more focused on achieving a specific goal against a specific target. Ziring cited a statistic that around 75% of viruses are targeted at fewer than 50 individual computers. Experts in information security must understand the specific goals of a threat actor so that attacks can be predicted.
Ziring also discussed shifts in information security. The philosophy used to be simply to protect assets; now it also includes defending against known malicious code and hunting for threats not yet known. Another shift is that the NSA has become increasingly dependent on commercial products. In the past, defenses were built entirely internally, but that no longer works against today's ever-changing threats: commercial software advances far faster than internal products can be developed. Because every commercial product has shortcomings, the NSA uses a multi-tiered approach: where one commercial product fails to protect against a threat, another product should be able to counter it, and security software is layered in this way to fully protect assets.
A current concern in information security is the demand for mobility. Cell phones have become part of everyday life; as a society, we carry them everywhere. Because they are mobile networked computers, the potential shortcomings of their security are a concern, and if they are integrated with critical assets, a security hole is exposed. Cloud computing raises a similar concern: the integrity of information on servers that the NSA does not own must still be ensured.
Ziring brought up a couple of general points to consider. First, information security requires situational awareness. Defending critical information properly requires knowing its current status, and the status of the security system must be known consistently; many security systems are audited only every few years, but it may be better to check their status continuously. Second, organizations must be able to operate on a compromised network. The old philosophy was to recover from a network compromise and then resume activity; the new philosophy, because networks are so massive, is to be able to run operations while the network is in a compromised state.
Ziring concluded by discussing the need to create academic partnerships. Academic partnerships can help the NSA have access to the best researchers, newer standards, and newer technologies. Many of the current top secure systems would not have been possible without academic partnerships. It is impossible for the NSA to employ more people than the adversaries, but it is possible to outthink and out-innovate them.
I have not been blogging here for a while because of some health and workload issues. I hope to resume regular posts before too much longer.
Recently, I was interviewed about the current state of security. I think the interview came across fairly well, and captured a good cross-section of my current thinking on this topic. So, I'm posting a link to that interview here, with some encouragement for you to go read it as a substitute for me writing a blog post:
Complexity Is Killing Us: A Security State of the Union With Eugene Spafford of CERIAS
Also, let me note that our annual CERIAS Symposium will be held April 5th & 6th here at Purdue. You can register and find more information via our web site.
But that isn't all!
Plus, all of the above are available via RSS feeds. We also have a Twitter feed: @cerias. Not all of our information goes out on the net, because some of it is restricted to our partner organizations, but eventually the majority of it makes it out to one of the above outlets.
So, although I haven't been blogging recently, there has still been a steady stream of activity from the 150+ people who make up the CERIAS "family."
To disable IPv6 persistently, edit /etc/sysctl.conf and add the following lines, followed by a reboot:

net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 1

You can also do this immediately, which will be good only until you reboot (note: sudo alone doesn't work, you need to do "sudo su -"):

echo 1 > /proc/sys/net/ipv6/conf/all/disable_ipv6
echo 1 > /proc/sys/net/ipv6/conf/default/disable_ipv6
echo 1 > /proc/sys/net/ipv6/conf/lo/disable_ipv6
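
If you want to verify that IPv6 is really off, a quick check (assuming the standard sysctl and ip utilities are present on your system) is to query the settings back:

sysctl net.ipv6.conf.all.disable_ipv6    # should print "net.ipv6.conf.all.disable_ipv6 = 1"
ip addr | grep inet6                     # should print nothing once IPv6 is disabled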