The Center for Education and Research in Information Assurance and Security (CERIAS)

CERIAS Blog

Symposium Transcript: Complexity vs. Security—Choosing the Right Curve, Morning Keynote Address

Dr. Ronald W. Ritchey, Booz Allen Hamilton

Transcribed and edited by Jacques Thomas. The speaker was introduced by Joel Rasmus.

Dr. Ron Ritchey and Booz Allen Hamilton (BAH), the company he works for, have had a relationship with CERIAS for the last couple of years now. This has been a very good relationship, and one that has continued to grow. BAH does contracting for the government and for IATAC, which provides information assurance (IA) consulting and testing for government agencies and vendors. His work is of interest to many of us, offering the unique perspective of someone working inside the government. Dr. Ritchey, in addition to his duties with BAH and as chief scientist for IATAC, also occasionally teaches at George Mason University in Fairfax, Virginia. We want to thank Dr. Ritchey for readily accepting our invitation to speak at the Symposium on his first visit to Purdue. As many of you know, one of Dr. Ritchey's colleagues, Admiral Mike McConnell, was supposed to participate in yesterday's fireside chat. Unfortunately, he could not attend, and Dr. Ritchey kindly substituted for him. We thank Dr. Ritchey for going above and beyond his commitments to help make this symposium a success. Today's talk is one that Dr. Ritchey already had in the works and proposed to finalize for the Symposium: when we asked whether he would like to come speak, we did not have to prompt him. He said he had a talk he had been working on, and that this would be a good occasion to finish it. Its gist clearly matched CERIAS's interests in security, and we are sure it will raise some eyebrows in the audience.

Symposium Summary: Complexity vs. Security—Choosing the Right Curve, Morning Keynote Address

A keynote summary by Gaspar Modelo-Howard.

Dr. Ronald W. Ritchey, Booz Allen Hamilton

Ronald Ritchey is a principal at Booz Allen Hamilton, a strategy and technology consulting firm, and chief scientist for IATAC, a DoD initiative to provide authoritative cyber assurance information to the defense community. He spoke about software complexity and its relation to the number of vulnerabilities found in software.

Ritchey opened the talk by sharing his experience as a lecturer for a secure software development course he teaches at George Mason University. The objective of the course is to help students understand why the emphasis on secure programming is so important. Drawing on the course, he gave several examples of why secure programming is not easy to achieve: much of the code analysis needed to grade course projects requires manual evaluation, which makes the whole process slow; even students with good development skills usually have vulnerabilities in their code; and some students introduce vulnerabilities by calling secure-sounding libraries in insecure ways. These examples led Ritchey to pose the question: how hard can it be to write good, secure software?

Ritchey then moved on to discuss software complexity. He presented the following premise: software products tend toward increasing complexity over time. The reason is that to sell the next version of a program, the market expects more features than the previous version offered, and adding features requires more code. Software is getting bigger, and therefore more complex. In light of this scenario: Does complexity correlate with software faults? Can we manage complexity in large development projects? And should development teams explicitly limit complexity to what they have demonstrated they can manage?

Several security experts suggest that complexity increases security problems in software. Quoting Dan Geer, "Complexity is the enemy." But Ritchey noted that researchers are divided on the subject. Some agree that complexity is a source of vulnerabilities in code. The latest Scan Open Source Report[1] found a strong linear correlation between source lines of code (SLOC) and the number of faults, based on an analysis of 55 million SLOC across 250 open source projects. After analyzing the Mozilla JavaScript engine, Shin and Williams[2] suggest that vulnerable code is more complex than merely faulty code. Other researchers see no clear correlation. Ozment and Schechter[3] found none after analyzing the OpenBSD operating system, which is known for its developers' focus on security. Michael Howard of Microsoft also pointed out that even though Windows Vista's SLOC count is higher than XP's, Vista has seen a 50% reduction in its vulnerability count, which is attributed to Microsoft's secure development practices.

Regardless of the relationship between complexity and security, Ritchey noted that SLOC is likely a weak metric for complexity and suggested potential replacements in terms of code structure (cyclomatic complexity, depth of inheritance), computational requirements (space, time), and code architecture (number of methods per class, lack of cohesion of methods). Looking at popular programs, it is clear that all are becoming larger as new versions are released: Mac OS X v10.4 included 86M SLOC and Ubuntu Linux has 121M. Browser applications follow the same trend, with Internet Explorer v6 at 7M SLOC and Firefox v3 at 5M.
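As an aside on the structural metrics Ritchey mentioned, cyclomatic complexity can be approximated mechanically by counting the decision points in a function. The sketch below, which uses only Python's standard `ast` module and is not taken from the talk, illustrates the idea; real analyzers handle many more language constructs.

```python
# Rough, illustrative approximation of McCabe cyclomatic complexity for one
# Python function: 1 + the number of decision points in its body.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)

def cyclomatic_complexity(source: str, func_name: str) -> int:
    """Return an approximate cyclomatic complexity for one function."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == func_name:
            complexity = 1  # a straight-line function has complexity 1
            for child in ast.walk(node):
                if isinstance(child, DECISION_NODES):
                    complexity += 1
                elif isinstance(child, ast.BoolOp):
                    # 'a and b and c' adds one branch per extra operand
                    complexity += len(child.values) - 1
            return complexity
    raise ValueError(f"function {func_name!r} not found")

example = """
def classify(x, y):
    if x > 0 and y > 0:
        return "both positive"
    for i in range(10):
        if i == x:
            return "match"
    return "no match"
"""

print(cyclomatic_complexity(example, "classify"))  # prints 5
```

For the toy `classify` function, the outer `if` with its two-operand boolean condition, the `for` loop, and the nested `if` yield a complexity of 5.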
A considerable number of these products have more than doubled in size between versions: Windows NT4 has more than 11M SLOC while its successor, XP, has 40M, and Debian v3 has 104M while v4 jumped to 283M.

In light of the differing opinions and studies, Ritchey analyzed the Microsoft Windows operating system by counting the vulnerabilities listed in the National Vulnerability Database[4] for different versions of this popular system. No distinction was made between root-level compromises and other severity levels. The results showed that a large number of vulnerabilities were found shortly after the initial release of each Windows version. This trend reflects the initial interest of researchers in finding vulnerabilities before moving on to newer versions or different products. Ritchey also commented on the impact of the foundational (initial release) code, which seems to have a higher vulnerability rate than code added later through updates. In the graph of cumulative vulnerability count versus complexity (SLOC) that he showed, the lines trend upward, so it may well be true that complexity affects security. He cautioned, though, that these numbers must be judged carefully, since factors such as the quantity and quality of resources available to the development team, the popularity of the software, and operational and economic incentives can all influence them.

Throughout his talk, Ritchey emphasized that managing complexity is difficult. It requires a conscious cultural shift within the software development team to avoid and remove the faults that lead to security vulnerabilities. As a key point of the talk, a development team should at a minimum know how much complexity it can handle. Ritchey concluded that complexity does impact security and that the complexity found in code is increasing, at a plausible rate of 2x every 5 to 8 years. The foundational code usually contributes the majority of reported vulnerabilities. The ability to keep vulnerability rates from increasing is tied to the ability to either limit complexity or improve how we handle it. The speaker, who calls himself an optimist, believes that the shift from software as a product to software as a service is good for security, since it will promote sound software maintenance and move the industry away from adding features just to sell new versions.

## References

1. Coverity, Inc. Scan Open Source Report 2008. Available at [http://scan.coverity.com/](http://scan.coverity.com).
2. Shin, Y. and Williams, L.: Is complexity really the enemy of software security? In: 4th ACM Workshop on Quality of Protection, pp. 47-50. ACM, New York, NY, USA.
3. Ozment, A. and Schechter, S.: Milk or Wine: Does Software Security Improve with Age? In: 15th USENIX Security Symposium, pp. 93-104. USENIX, Berkeley, CA, USA.
4. National Institute of Standards and Technology. National Vulnerability Database. Available at [http://nvd.nist.gov](http://nvd.nist.gov).
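For readers who would like to repeat a rough version of the vulnerability-counting exercise described above, the sketch below uses the current NVD 2.0 REST API and its CPE prefix matching, which did not exist in this form at the time of the talk. The CPE prefixes are illustrative only; a careful study would need to handle product and edition naming much more precisely.

```python
# Rough sketch of counting NVD entries per product via the NVD 2.0 REST API.
# The CPE prefixes are illustrative; anonymous requests are rate limited, so
# an API key is advisable for anything beyond a handful of queries.
import time
import requests  # third-party: pip install requests

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

PRODUCTS = {
    "Windows XP":    "cpe:2.3:o:microsoft:windows_xp",
    "Windows Vista": "cpe:2.3:o:microsoft:windows_vista",
}

def count_cves(cpe_prefix: str) -> int:
    """Return the number of NVD entries whose CPE configuration matches the prefix."""
    resp = requests.get(
        NVD_API,
        params={"virtualMatchString": cpe_prefix, "resultsPerPage": 1},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["totalResults"]

for name, prefix in PRODUCTS.items():
    print(f"{name}: {count_cves(prefix)} CVEs listed")
    time.sleep(6)  # stay well under the anonymous rate limit
```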

Symposium Summary: Unsecured Economies Panel

A panel summary by Kripa Shankar.

Panel Members:

* Karthik Kannan, Krannert School of Management, Purdue University
* Jackie Rees, Krannert School of Management, Purdue University
* Dmitri Alperovitch, McAfee
* Paul Doyle, ProofSpace
* Kevin Morgan, Arxan Technologies

Adding a new dimension to the CERIAS 10th Annual Security Symposium, five panelists with varied backgrounds came together on the final day to share their work and experiences on "Unsecured Economies: Protecting Vital IP." Setting the platform for the discussion was this [report](http://resources.mcafee.com/content/NAUnsecuredEconomiesReport): "Together with McAfee, an international team of data protection and intellectual property experts undertook extensive research and surveyed more than 1,000 senior IT decision makers in the US, UK, Japan, China, India, Brazil and the Middle East regarding how they currently protect their companies' digital data assets and intellectual property. A distributed network of unsecured economies has emerged with the globalization of many organizations, leaving informational assets even more at risk to theft and misuse. This report investigates the cybercrime risks in various global economies, and the need for organizations to take a more holistic approach to vulnerability management and risk mitigation in this ever-evolving global business climate."

Karthik Kannan, Assistant Professor of Management Information Systems, CERIAS, Krannert School of Management, Purdue University, was the first to start the proceedings. He gave a brief overview of the report, which was the product of collaborative research by him, Dr. Jackie Rees, and Prof. Eugene Spafford. The motivation behind the work was that more and more information is becoming digital and traditional geographic boundaries are blurring; information is being outsourced to faraway lands, and as a result protecting against leaks is becoming harder and harder. Kannan put forth questions like "How do perceptions and practices vary across economies and cultures?" and cited an example from India, where salary is not treated as personal information and is shared and discussed informally. To answer such questions, a survey was devised, targeted at senior IT decision makers, chief information officers, and directors of firms across the globe. The US, UK, Germany, Brazil, China, and India were among the countries chosen, giving the survey the cultural diversity it needed. Adding further value was the variety of sectors covered: defense, retail, product development, manufacturing, and financial services. According to the results, the largest share of intellectual property (47%) originates from North America and Western Europe, and on average firms lost $4.6 million worth of IP last year. Kannan went on to explain how security is perceived in developing countries and discussed how respondents reacted to security investment during the downturn. Statistics such as 42% of respondents saying that laid-off employees are the biggest threat caused by the economic downturn showed that insider threats are on the rise. The study put forth many case studies showing that data thefts by insiders tend to have a greater financial impact, given insiders' high level of data access, and pose an even greater financial risk to corporations.
Jackie Rees, also an Assistant Professor of Management Information Systems, CERIAS, Krannert School of Management, Purdue University, took over where Kannan had left off and brought to light some of the stories that did not make it into the report. Rees explained the reasons the various sectors store information outside the home country: while the finance sector viewed it as safer to store data elsewhere, the IT, product development, and manufacturing sectors found it more efficient for the supply chain, and the retail and defense sectors felt better expertise was available elsewhere. On how much these sectors were spending on security, 67% of the finance industry said it was "just right," while 30% of retail felt it was "too little." The other results were varied but consistent with intuition; however, all sectors seemed to agree that the major threat to deal with was their own employees, and that the worst impact of a breach was on the reputation of the organization. Moving on to the global scene, where geopolitical perceptions have become a reality in information security policies, Rees shared that certain countries are emerging as clear sources of threats to sensitive data. She added that, according to respondents, Pakistan is seen as a big threat by most industries, with China and Russia also in the mix. Poor law enforcement, corruption, and lack of cooperation in these economies were cited as reasons they are emerging as threats.

Dmitri Alperovitch, Vice President of Threat Research at McAfee, began by expressing his concern that cybercrime is one of the headwinds hitting our economy. He pointed out that the economic downturn has resulted in less spending on security, and as a result increased vulnerabilities and laid-off employees are now serious threats. Elaborating, he added that most vulnerabilities are exploited by insiders, who not only know what is valuable but also know how to get it. Looking back, a worm such as Melissa, named after the attacker's favorite stripper, seems to have had far less malicious intent than the threats of today, which are virtually all financially motivated and often tied to money laundering. Citing examples, Alperovitch told stories of an organization in Turkey recently caught for credit and identity theft, of members of law enforcement being kidnapped, and of how Al-Qaeda and other terrorist groups were using such tools to finance their activities. Alperovitch stressed that this threat model is not understood by the industry, and hence the industry is not well protected.

Paul Doyle, Founder, Chairman & CEO of ProofSpace, began by thanking CERIAS and congratulating the researchers at McAfee for their contributions. Adding a new perspective to the discussion, Doyle proposed that there has not been enough control over data: data moves along the supply chain, but control does not move with it. Referring to yesterday's discussion on cloud computing, where it was pointed out that availability is a freebie, Doyle said the big challenge here is handling the integrity of data. Pressing the point, he added that data integrity is the lowest common denominator and also the least understood area in security. How do we know when a change has occurred? In the legal industry, there is a threat factor in the form of a cross-examining attorney.
What gives us certainty in other industries? We have not architected our systems to handle the legal threat vector; systems lack the controls and auditability needed for provenance and assured integrity. The trust anchor of time has to be explored: how we establish a trust anchor of time, and how confidentiality tools can help increase reliability, are important areas to work on.

Kevin Morgan, Vice President of Engineering, Arxan Technologies, began with an insight into how crime evolves in perfect synchrony with the socio-economic system. Every single business record is accessible in the world of global networking, and access enables crime. Sealing enterprise perimeters has failed, as there is no perimeter any more. Thousands and thousands of nodes execute business activity, and most of them (laptops and smartphones, for example) are mobile, which in turn means that data is mobile and perimeter-less. Boundary protection is not the answer; we have to assume that criminals have access to enterprise data and applications. Assets, data, and applications must be intrinsically secure, and the keys protecting them must be secure too. Technology can help a great deal in raising the bar for criminals, and recent trends are encouraging.

After the highly informative presentations, the panel opened up for questions for the next hour. A glimpse of the session can be found in the transcript below.

## Q&A Session: A transcript snapshot

Q: We are in the Midwest; no one is going to come after us. What should I, as a security manager, consider doing? How do you change the perception that organizations in "remote" locations are also subject to attack?

* Alperovitch: You are part of cyberspace, and if you have valuable information you will be targeted. Data manipulation is what one has to worry about the most.
* Morgan: Form red teams, perform penetration tests, and share the results with the company.
* Doyle: Employ allies and make sure you are litigation-ready. Build an ROI model and lower the total cost of litigation.

Q: CEOs consider cutting costs. They cut bodies. One of the biggest threats to security is letting people go. It's a paradox. How do we handle this?

* Kannan: We have not been able to put a dollar value on the loss of information. Lawrence Livermore National Lab has a paper on this issue which might be of interest to you.
* Rees: Try to turn it into an opportunity to manage information better by adding more controls.

Q: How do we make the case for why compliance is important?

* Doyle: One of our flaws as a professional community is that we are bad at formulating business cases. We have to take a leaf out of the book of Kevin (of Cisco), who frames security challenges as business proposals. To use an analogy, at the end of the day it is the brakes and suspension that decide the maximum speed of an automobile, not the engine or the aerodynamics. The question is: how fast can we go safely? Hence compliance becomes important.

Q: Where do we go from here to find out how data is actually being protected?

* Kannan: Economics and behavioral issues are increasingly important for information security. We need to build them into information security models.
* Rees: The governance structure of information must also be studied.
* Alperovitch: The study has identified those who may be impacted by the economy; we need to expose them to the problem. We also need to help law enforcement get information from the private sector, as the laws are not in place, and we need to figure out a way to motivate companies to share security information and threats with the community.
* Doyle: Stop thinking about security and start thinking about risk and risk management. Model the return-reward proposition in terms of risk.
* Morgan: We need to step up as both developers and consumers.

Q: The $4.6 million estimate: how was it estimated?

* Rees: We did a rolling average across the respondents, keeping in mind the assumption that people underestimate problems.

Q: Was IP so integral to a company's business model that its loss caused the company to go bust?

* Rees: We did not come across any direct examples of firms that failed because of IP loss.

Q: Could you suggest new processes to enforce the security of data?

* Doyle: We need to approach it from the other side: if we cannot stop attackers, how do we restrict and penalize them using the law?

Q: The infrastructure at Purdue and in the US has been around for a long time, and we have adapted and evolved to newer technologies. Other older organizations and developing countries still run older systems, and that actually seems to help them, as they are less exposed to new-age threats. What's your take on that?

* Kannan: True. We spoke to the CISO of a company in India; his issues were far fewer, as it was a company with legacy systems.
* Alperovitch: There is a paradigm shift in the industry. Security is now becoming a business enabler.

Symposium Summary: Distinguished Lecture

A summary written by Nabeel Mohamed.

The main focus of the talk was to highlight the need for "information-centric security" over the existing infrastructure-centric security. It was an interesting talk, since John Thompson of Symantec backed his thesis with real statistics. The following are some of the trends he pointed out from that research:

* Explosive growth of information: digital content in organizations grows by about 50% every year.
* Most of the confidential or sensitive information and trade secrets of companies are in the form of unstructured data such as emails, messages, and blogs.
* The growth of malicious code in the marketplace outpaces that of legitimate code.
* Attackers have found ways to get around network protection and reach sensitive or confidential information, often leaving hardly any trace. Attackers have also changed their motivation: they no longer seek publicity, and they try to hide every trace of their attacks.
* The threat landscape has changed markedly over the last ten years. Ten years ago there were only about five viruses or malicious attacks a day; now there are a staggering 15,000 a day.
* Research conducted by the Ponemon Institute asked laid-off employees whether they left with something from the company, and 60% said yes. Thompson thinks the figure could be even higher, since some employees may be unwilling to disclose it.

These statistics show that data is becoming more important than ever before. Given these trends, he argued that protecting infrastructure alone is not sufficient and that a shift in the paradigm of computing and security is essential: we need to change the focus from infrastructure to information. He identified three elements of the new paradigm:

1. It should be risk-based.
2. It should be information-centric.
3. It should be managed well, over a well-managed infrastructure.

Thompson advocated adopting a risk-based, policy-based approach to managing data. A typical organization today has strong policies on how to manage its infrastructure, but not an equally strong set of policies to manage the information that is critical to the business itself. He pointed out that it is high time organizations assess the risk of losing or leaking the different types of information they hold and devise policies accordingly. We need to quantify the risk and protect the data that would cause the most damage if compromised; identifying what we most want to protect is important, since we cannot protect everything adequately. While the risk assessment should be information-centric, security cannot be achieved by encryption alone. Encryption can certainly help protect data, but organizations need to take a holistic approach in which management (of data, keys, configurations, patches, and so on) is a critical aspect. He argued that it is impossible to secure information without knowledge of the content and without good policies on which to base organizational decisions, reiterating that "you cannot secure what you do not manage." To reinforce the claim, he pointed out that 90% of attacks could have been prevented had the systems that came under attack been well managed (the Slammer attack, for example). Such management involves maintaining proper configurations and applying critical updates, which most of the vulnerable organizations failed to do. In short, well-managed systems could mitigate many attacks. Towards the end of his talk, he shared his views on better security in the future.
He predicted that "reputation-based security" solutions would augment current signature-based anti-virus mechanisms in mitigating threats. In his opinion, reputation-based security produces a much more trusted environment by taking users' past actions into account, and he argued that this approach need not create privacy issues if we appropriately rethink how we define privacy and what counts as sensitive. He raised an interesting question: "Do we have a society that is sensitive to and understands what security is all about?" He insisted that unless we address the societal and social issues related to security, technology alone is not sufficient to protect our systems. We need to create a security-aware society and an environment in which students can learn computing "safely"; this will lead us to embed safe computing into day-to-day life. He called for a national approach to security and law enforcement, saying it is utterly inappropriate to handle data breach notification on a state-by-state basis. He also called for an information-based economy in which all entities share information about attacks, and for an information-centric approach to security. He mentioned that Symantec is already sharing threat information with other companies, but that federal agencies are hardly sharing any threat information; we need greater collaboration through public-private partnerships.
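To make the reputation-based idea mentioned above a little more concrete, the toy sketch below scores a file from a few simple signals such as community prevalence, age, and publisher trust. This is purely an illustration of the general concept, not Symantec's actual algorithm; the signals, weights, and threshold are invented.

```python
# Toy illustration of reputation-based file scoring, as contrasted with
# signature matching. Signals, weights, and threshold are invented; real
# systems use far richer telemetry.
from dataclasses import dataclass

@dataclass
class FileTelemetry:
    prevalence: int          # how many endpoints in the community have seen this file
    age_days: int            # how long the file has been observed in the wild
    signed_by_trusted: bool  # carries a signature from a trusted publisher

def reputation_score(t: FileTelemetry) -> float:
    """Return a score in [0, 1]; higher means more trustworthy."""
    score = 0.0
    score += min(t.prevalence / 10_000, 1.0) * 0.5  # widely seen files earn trust
    score += min(t.age_days / 365, 1.0) * 0.3       # long-lived files earn trust
    score += 0.2 if t.signed_by_trusted else 0.0    # trusted signer earns trust
    return score

def verdict(t: FileTelemetry, block_below: float = 0.3) -> str:
    return "block" if reputation_score(t) < block_below else "allow"

# A brand-new, unsigned file seen on only a handful of machines is treated as
# suspect even though no signature for it exists yet.
print(verdict(FileTelemetry(prevalence=3, age_days=1, signed_by_trusted=False)))          # block
print(verdict(FileTelemetry(prevalence=250_000, age_days=400, signed_by_trusted=True)))   # allow
```

The point of contrast with signature-based scanning is that the verdict comes from observed context rather than from a match against a database of known-bad samples.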

Symposium Summary: Fireside Chat

A panel summary by Utsav Mittal.

Panel Members:

* Eugene H. Spafford, CERIAS
* Ron Ritchey, IATAC
* John Thompson, Symantec

It is an enlightening experience to listen to some of the infosec industry's most respected and seasoned professionals sitting around a table to discuss information security. This time it was Eugene Spafford, John Thompson, and Ron Ritchey; the venue was the Lawson Computer Science Building, and the event was a fireside chat held as part of the CERIAS 10th Annual Security Symposium.

Eugene Spafford started the discussion by stating that security is a continuous process, not a goal, and compared security with naval patrolling. Spaf said that security is all about managing and reducing risks on a continuous basis. According to him, a lot of stress is placed nowadays on data leakage. This is undoubtedly one of the major concerns today, but it should not be the only one: when people focus on data leakage instead of addressing the core of the problem, which is the insecure design of systems, they get attacked, which gives rise to an array of problems. He further added that the losses from cyber attacks are comparable to the losses incurred in Hurricane Katrina, yet not much is being done to address the problem. This is partly because losses from cyber attacks, except for a few major ones, occur in small amounts that aggregate to a huge sum. With regard to the recent economic downturn, Spaf commented that many companies are cutting their security budgets, which is a huge mistake. According to Spaf, security is an invisible but vital function whose real presence and importance is not felt until an attack occurs and the assets turn out not to be protected.

Ron Ritchey stressed the issues of data and information theft. He said that the American economy is largely a design-based economy: many cutting-edge products are researched and designed in the US by American companies and then manufactured in China, India, and other countries. The fact that the US is a design economy underscores the importance of information security for US companies and the need to protect their intellectual property and other information assets. He said that attacks are getting more sophisticated and targeted, and that malware is being carefully socially engineered. He also pointed out the need to move from signature-based malware detection to behavior-based detection.

John Thompson arrived late because his jet was not allowed to land at the Purdue airport due to high winds. He introduced himself, tongue in cheek, as the CEO of a "little" company named Symantec in Cupertino. Symantec is a global leader in security, storage, and systems management solutions, and one of the world's largest software companies, with more than 17,500 employees in more than 40 countries. Thompson gave some very interesting statistics about the current information security and attack landscape. He said that about 10 years ago, when he joined Symantec, the company received about five new attack signatures each day. Today that number stands at about 15,000 new signatures each day, with the average attack affecting only 15 machines. He further added that attack vectors change every 18 to 24 months, and that criminals are making extensive use of new techniques and technologies to come up with new and challenging attacks.
He mentioned that attacks today are highly targeted, intelligently socially engineered, focused on covertly stealing information from a victim's computer, and designed to silently cover their tracks. He admitted that, due to the increasing sophistication and complexity of attacks, it is getting more difficult to rely solely on signature-based attack detection, and he stressed the importance of behavior-based detection techniques. With regard to the preparedness of government and law enforcement, he said that law enforcement is not yet skilled enough to deal with these kinds of cyber attacks. He said that in the physical world people have natural instincts about danger; that instinct needs to be developed for the cyber world, which can be just as dangerous, if not more so.
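As an illustration of the signature-versus-behavior distinction that both Ritchey and Thompson raised, the sketch below flags a process that modifies an unusually large number of distinct files within a short window, a crude ransomware-like behavioral signal, without consulting any signature database. The event format, window, and threshold are invented for this sketch and are not drawn from the talks.

```python
# Toy behavior-based detector: flag any process that touches too many distinct
# files within a sliding time window, regardless of whether its binary matches
# a known signature. Thresholds and the event format are invented.
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_DISTINCT_FILES = 50

class BehaviorMonitor:
    def __init__(self):
        # per-process deque of (timestamp, path) events inside the window
        self._events = defaultdict(deque)

    def record_write(self, pid: int, timestamp: float, path: str) -> bool:
        """Record a file write; return True if the process looks suspicious."""
        events = self._events[pid]
        events.append((timestamp, path))
        # drop events that have fallen out of the window
        while events and timestamp - events[0][0] > WINDOW_SECONDS:
            events.popleft()
        distinct_files = {p for _, p in events}
        return len(distinct_files) > MAX_DISTINCT_FILES

monitor = BehaviorMonitor()
suspicious = False
for i in range(60):  # one process rewriting 60 different files in under a second
    suspicious = monitor.record_write(pid=4242, timestamp=1.0 + i * 0.01,
                                      path=f"/home/user/docs/file{i}.txt")
print("suspicious" if suspicious else "normal")  # prints "suspicious"
```

A real behavioral engine would combine many such signals and weigh false positives carefully; the point here is only that the verdict comes from what the process does, not from what its binary looks like.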
