Wednesday, March 31, 2010
- David Bell, Retired, Co-author Bell-La Padula Security Model
- Joe Pekny, Purdue University
- Kenneth Brancik, Northrop Grumman
- Petros Mouchtaris, Telcordia
Summary by Utsav Mittal
The panel was started by Petros Mouchtaris. He said that applying for funding is not that bad: although a researcher collects many rejections, once the funding comes through it gives him considerable control over the areas he wants to work in. In the last 10 years most of their funding came from DARPA; initially the funding was for small, long-term projects. He said that a smaller, long-term project gives more time to foster basic research on abstract ideas.
Joe Pekny, who has worked in Discovery Park for about 10 years, said that the fundamental principle of generating funding is that “research follows impact.” The difference between getting and not getting funding lies in the researcher’s ability to relate his potential and his ability to provide impact. He also talked about research opportunities in electronic medical records and about privacy issues in the video surveillance that is now widely used.
He mentioned some tactics that help to monetize research impact:
Leverage: He mentioned that everyone wants a big, long-running grant, but that is not always possible, so the researcher should leverage whatever opportunities he has to gain the biggest advantage.
Interdisciplinary: He said that this is important, as many problems we face today are complex in nature and no single idea can crack them, so smart minds from different areas should work on them together.
Minimalistic: Joe said that a minimalistic team should be assembled to crack the problem; there should not be too many people working on the project.
Relationships: Joe stressed the importance of fostering long-standing relationships for generating funding.
Entrepreneurship: Joe mentioned that money never comes in the form that a person wants it to, so a researcher should have the spirit of entrepreneurship.
Operations v. Philanthropy: He meant that if an organization believes the researcher has the potential to solve an operations problem, it will shell out billions to fund the work. If, on the other hand, it does not believe in that potential, it may still give money as philanthropy.
Vision: Joe said that an enduring, fundamental, overarching vision is needed for a researcher to be successful. A researcher should bring creativity and innovation to every situation.
Kenneth Brancik shared his experiences with research funding over the last 30 years. He related his life experience and how it helped increase his “situational awareness.” He said that technology is an enabler for business, and that we should think outside the box and maintain situational awareness related to cyber security. He said that a researcher, in order to understand complex cyber security problems, should:
- Think out of the box
- Understand the business impact related to it.
- Use a wide angle lens to look at the picture.
David Bell started his talk by quoting Mark Twain and remarking on people being lost in the “PowerPoint Age,” which cracked the audience up. David shared the experiences he had working with ARPA and other federal agencies, and mentioned various projects such as “Blacker.” He noted that earlier research was “tethered research”: people were not very sure what they were working on; all they knew was that they were working on some advanced technology. His current take on federal funding is that it has dropped from 1.3% to 1%, and that a lot still needs to be done in the area of cyber security.
Wednesday, March 31, 2010
Summary by Robert Winkworth
“Everything I Needed to Know About Security I Learned in 1974”
Security luminary David Bell concluded this year’s Information Security Symposium with a lecture in which he argued that while the speed and size of computers have changed greatly across the decades, the principles underlying security have been remarkably constant.
With the exception of one noted MULTICS covert channel hack, the speaker asserted that no fundamentally new innovation in computer security appeared from 1974 until 2005 (when he retired). Dr. Bell had done a great deal of conceptual modeling, particularly near the beginning of his career; this, he explained, influenced his later work in security. In 1971, Bell, having read many classic MULTICS papers, felt even then that “all the good stuff” had already been done and made public. He recalled, with some amusement, that government facilities did not always share his awareness of these facts. Material freely available in research libraries, when cited in military security reports, often became classified, as though somehow it might be made secret anew.
Commenting on the 1972 Anderson Report, Dr. Bell noted that a core collection of only about a dozen critical infiltration tactics proved successful in almost every documented penetration test. Clearly, by abstracting these procedures into general categories of attack, we could better understand and predict them. So Bell was called upon to produce a mathematical model of computer security, but no other details of his assignment were specified. This, he explained, turns the technical process of testing and setting conditions in the machine into a cultural process of negotiating policies: “security” is not meaningful until defined, and likewise, threats to security must be discussed before we can discuss their remedies. General principles of a security model are not useful until somehow applied, and Bell prefers to see these concrete examples before signing off on a policy, however academically sound it may seem.
Along with Len La Padula, David Bell is probably most widely recognized for his contribution to the Bell-La Padula Model of secure systems. This widely influential set of conceptual tools appears frequently in the fundamentals of IA curricula at Purdue and probably throughout the world.
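For readers unfamiliar with the model, its two core mandatory-access rules can be illustrated with a minimal sketch (not taken from the talk; the integer-level encoding and function names here are illustrative assumptions, and the real model also includes discretionary access and category sets):

```python
# Minimal sketch of the two central Bell-La Padula rules, assuming
# security levels are plain integers (higher = more sensitive).
# Real BLP also uses need-to-know categories, omitted here for brevity.

def can_read(subject_level: int, object_level: int) -> bool:
    """Simple security property: a subject may not read above its level
    ("no read up")."""
    return subject_level >= object_level

def can_write(subject_level: int, object_level: int) -> bool:
    """*-property: a subject may not write below its level
    ("no write down"), preventing leaks to lower classifications."""
    return subject_level <= object_level

UNCLASSIFIED, CONFIDENTIAL, SECRET = 0, 1, 2

print(can_read(SECRET, CONFIDENTIAL))   # reading down is allowed
print(can_write(SECRET, CONFIDENTIAL))  # writing down is forbidden
```

Together the two rules guarantee that information can only flow upward in classification, which is the confidentiality property the model was designed to enforce.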
Our host was critical of those that see security as a personnel problem, noting that this approach fails to recognize the technical weaknesses that remain regardless of the people involved. And coordinating the technology is possible; Bell shows us computer systems that have never suffered a documented breach and never required a security patch. Unfortunately, the process of replacing an existing infrastructure is difficult, particularly for an entrenched bureaucracy, so the challenge facing many security modelers is producing a plan that outlines not only the destination but all the intermediary steps necessary to transform an existing system to one that approaches the level of security desired.
Many evaluators are assigned to networks the technology of which they cannot explain. Since they cannot articulate an effective policy for interactions between such a network and its trusted neighbors, a common reaction to this is to simply isolate them. As internetworking becomes pervasive, however, this cannot remain a practical strategy. Networks must be connected, but such connections introduce weaknesses if they are not thoroughly documented and regulated. How we can possibly manage the explosive complexity of internetworks remains a daunting question.
“We are not safe and secure today,” concludes our eminent guest. Those that claim otherwise are “either misinformed or lying.” Bell called upon us to implement more of the sound ideas in information assurance that hitherto have existed only as concept, and to fully acknowledge the extent to which models such as BLP have not been fully embodied.
Gene Spafford was on hand for the session, and asked for Dr. Bell’s comments on the software solutions of Rogers and Green Hills (two of the best-rated security platforms). Bell found both quite sound. He was concerned, however, that neither had achieved the market “traction” that he would like to see, and he provided some examples of how each could be more effectively introduced to companies that might use them in live networks.
As of March 31, 2010, the media presented in this lecture is available.
Wednesday, March 31, 2010
Summary by Gaspar Modelo-Howard
Day two opened with a keynote from Under Secretary Beers, who has had a long and interesting career of over 34 years, including military service and work as a staff member of the National Security Council under four U.S. Presidents. During his talk, he provided an introduction to the National Protection and Programs Directorate (NPPD) and DHS, discussed the importance and role of cyber security in protecting the overall security of the United States, described how DHS is continually evolving to meet the changing landscape and its mission, and covered current challenges and problems faced by NPPD.
Under Secretary Beers began with a discussion of the responsibilities of DHS, and NPPD in particular. DHS has five goals or missions, listed here in no particular order: (1) counterterrorism, (2) securing U.S. borders, (3) immigration, (4) response to disasters, and (5) cyber security. This last goal refers to protecting cyberspace for the civilian side of government and working with the private sector to achieve physical Critical Information Infrastructure Protection (CIIP).
DHS is a fairly new department, formed in late 2002, so it is currently embarking on a transformation of its workforce. The main reason is that a number of professional disciplines were brought together to start the Department, but at the time there were very few professionals available to staff it; so it is an evolving organization. Currently, NPPD has an equal number of private contractors and federal employees working in the Directorate, but there are several initiatives to fill more permanent positions. In terms of cyber security, the Department is looking to hire 1,000 cyber security people in the next 3 years. They also expect to increase the NPPD cyber security workforce to 260 by the end of FY 2010.
Under Secretary Beers mentioned that a difficulty faced when hiring cyber security specialists is that academic institutions do not currently produce enough graduates to meet federal demand. This statement considers that not all of the needs are for purely technical positions. Much to the surprise and amusement of the audience, the Under Secretary mentioned that there are not enough lawyers in DHS: it takes a long time for DHS leaders to get legal advice on some topics because there are more questions than the lawyers can answer. Some of this would also be rectified by better laws relating to cyber security.
Generally speaking, DHS, and NPPD in particular, are looking to draw knowledge and experience from the math, science and cyber security communities to build a strong federal department. The DHS objective is to forge stronger links with educational institutions such as Purdue University, to better prepare itself to deal with cyber security matters.
During his presentation, Under Secretary Beers made an important point that helps define the national cyber security strategy: 85% of cyberspace in the U.S. exists outside the government. That is why the Directorate works closely with the private sector. For example, the Office of Infrastructure Protection (IP) takes 18 critical sectors of the American economy (water, power, finance, etc.) and works with them to develop security plans (standards, strategies, best practices) and improve preparedness to respond to emergencies. Mr. Beers also stressed the role cyber security plays within DHS, as it touches every other part of the mission; cyber security cuts across sectors, for example between the communications and information sectors.
The Under Secretary noted that cyber threats are increasing on a daily basis, and that they also include physical attacks because of the potential impact those can have on cyberspace. He shared two examples: (1) a bond trading company that had to evacuate during the first World Trade Center attack in 1993, and (2) the train derailment and fire in Baltimore in 2001. In the first story, the investment company had to evacuate the World Trade Center but did not have backup systems off-site; it took a presidential order to allow them to re-enter the building, since the fire marshal had prohibited anyone from doing so. In the train story, the fire disrupted communication links running through the same tunnel where the disaster occurred. Those cables were major Internet links, and their loss slowed down service around the U.S.
NPPD cyber security daily operations include monitoring attacks, protecting the .gov domain and monitoring Internet connections to and from government networks. US-CERT, the cyber security operational arm within NPPD, uses the Einstein intrusion detection program to carry out these responsibilities. (I think it was cool that he mentioned Einstein, as high-ranking U.S. Government officials usually avoid such topics.) Mr. Beers also noted that under President Obama’s 60-day cyber security review, DHS had to create a Computer Emergency Response Team (CERT) plan to deal with cyber security threats and crises. This has been done, and it involved government at different levels (federal, state, local) and the private sector. Also, last October DHS opened the National Cybersecurity and Communications Integration Center to improve national efforts to address threats and incidents affecting U.S. critical cyber infrastructure.
To finish his presentation, the Under Secretary talked about several of the current and future cyber security challenges faced by DHS. First, they are currently working on developing systems that make it possible for different cyber security players to share information. This is a common problem when requesting or managing information from different sources, for example the private sector, because such information is highly sensitive to its owner. Second, DHS is also increasingly responsible for cyber security awareness and outreach initiatives. They are working with academic institutions to foster and identify potential government employees. Third, in terms of global involvement, US-CERT is partnering with similar institutions in other countries to work on international incidents and to create stronger ties. DHS is fully aware of the interconnectivity of networks, regardless of physical location. It actively participates in the annual Meridian Conference for international CIIP collaboration and invites representatives of foreign countries to their biennial Cyber Storm exercises.
In the Q&A session, a member of the audience asked Mr. Beers if he could prioritize DHS cyber security needs in terms of human capital. This is important, as cyber security is an interdisciplinary field and there is a need for professionals with technical and non-technical backgrounds. Mr. Beers listed three needs: (1) people with a computer science background to operate the cyber security centers; (2) people with system design and administration skills; and (3) people with a business background to deal with contracting issues and the proficiency to understand technical requirements. This last group is important, as the government has a responsibility to define requirements and objectives as clearly and specifically as possible so other sectors can determine how to comply. He then mentioned that the government might have to start training centers, as there are not enough graduates coming from college.
As a follow-up to his comment on cyber-security-savvy lawyers, he was asked if the real problem is that the U.S. does not have the appropriate laws to protect its cyber infrastructure, and whether DHS is advocating for new legal frameworks. Mr. Beers agreed that a better legal framework is required and that DHS is indeed advocating for this to happen. In a later question, he also pointed out that the legal and cyber security communities need to further discuss issues affecting both sides, and that such exchanges should also happen outside the government (because of restrictions a federal employee might face by law).
The next two questions were about international efforts taken by DHS, citing that the United Nations is working on developing cyber security laws and best practices. The Under Secretary mentioned that DHS cannot work at the international level and that the time has come for the State Department to step up.
A question was then asked regarding the difficulties that arise when the physical and cyber security communities interact. Mr. Beers noted that this is a recurring but expected problem when working with entities from the public and private sectors. Sometimes they find cases where both exist under one directorate, but in general this is not the case, and it is part of the evolution of security.
A member of the audience asked for a briefing on current and future strategies with U.S. Cyber Command and the NSA. The Under Secretary mentioned that major elements of the collaboration are still under development. There are discussions about having DHS deputies and employees at Cyber Command and the NSA, and vice versa.
A final question compared the costs of training employees in cyber security with the costs of scholarships, suggesting the second option might be cheaper and that there might therefore be an incentive to increase the number of scholarships. Mr. Beers agreed with the suggestion and said DHS is looking into additional opportunities to fund students and institutions, but was also quick to point out that not every cyber security professional has to come from an academic setting.
Overall, it was an interesting talk by the Honorable Beers, providing an overview of the structure, mission and challenges faced by NPPD and DHS. He stressed the importance of cyber security as part of the primary mission of the Department and the relevance of working with different partners to successfully achieve that mission.
Tuesday, March 30, 2010
- Mike McConnell, Booz Allen Hamilton
- Rand Beers, DHS
- Eugene H. Spafford, CERIAS
Summary by Derril Lucci
The fireside chat saw Admiral Mike McConnell, the Honorable Rand Beers, and Professor Eugene Spafford discuss some of the issues in security today. One of the first topics covered was how technology will change business and society. Admiral McConnell made a point of mentioning that once every 50 years or so, a new technology comes along that revolutionizes the way things are done; his examples included the cotton gin and the textile industry. Another topic discussed was the need for a new Internet, meaning an Internet whose traffic can go through a trusted third party; this new idea, they believe, would make for a safer Internet. This led to a debate about the innovation of cyberspace versus security: security can be viewed as a restriction on the innovation of cyberspace, because it involves a tradeoff between standards and regulation. Admiral McConnell also discussed a potential threat to our banks. He said that every day, $7 trillion is moved by two banks in New York City. If these transmissions were ever interrupted, coupled with a well-timed terrorist attack, it could topple both the U.S. and global banking industries. This is why both Admiral McConnell and Secretary Beers have lobbied for the government to set up a plan to prevent this. However, they both stressed that the U.S. government has a history of dragging its feet on this matter, and they feel that the U.S. will not do anything until the event has already occurred. Furthermore, Secretary Beers called for academic institutions to come together and decide where we want to go as a network/cyber security community. Admiral McConnell said that it is up to future generations to devise schemes to lower the risk of attacks by those who wish to change the world order.
Tuesday, March 30, 2010
- Nicolas Christin, Carnegie Mellon University
- Cassio Goldschmidt, Symantec Corporation
- Aaron Massey, North Carolina State University
- Melissa Dark, Purdue University
Summary by Preeti Rao
Tuesday afternoon’s panel discussion on March 30, 2010, at the Eleventh Annual CERIAS Symposium was on Information Security Ethics. The panel consisted of four pioneers from academia and industry: Nicolas Christin from Carnegie Mellon University, Cassio Goldschmidt from Symantec Corporation, Aaron Massey from North Carolina State University and Melissa Dark from Purdue University.
Melissa Dark introduced the panel and put forth the thought that information security ethics is a really messy topic because it involves a variety of stakeholders. Identifying all the stakeholders and their competing interests, and then balancing those interests, is not an easy trade-off; there are a number of incentives and disincentives to be considered. Information security ethics is most interesting when discussed with respect to concrete scenarios, and the panel chose to do that.
The first presentation was from Nicolas Christin and he presented on Peer-to-Peer Networks, Incentives and Ethics.
He started off by talking about peer-to-peer (P2P) networks in general: their interdisciplinary nature, their benefits and their costs. He noted that P2P traffic is a very sizable load, accounting for 30 to 70% of Internet traffic. P2P networks carry a bad reputation because of the dissemination of copyrighted materials, but they have numerous benefits too: software distributors save on infrastructure by distributing free and proprietary software to legitimate users through P2P networks. Another advantage is censorship resilience.
Christin identified five stakeholders in P2P networks and discussed their ethical dilemmas and competing interests: end users, content providers or copyright holders, electronics manufacturers, software developers and Internet service providers (ISPs). While end users tend to download content for free, content providers and copyright holders worry about unauthorized replication of their content. Electronics manufacturers benefit from digital media portability on P2P networks; devices like iPods would not have been this successful if people did not get music for free or at very low cost. Software developers potentially benefit from increased P2P use. ISPs face interesting ethical dilemmas: while they benefit from the increased bandwidth usage of users downloading content, a number of those users are committing copyright infringement over the very bandwidth the ISPs provide. Sometimes ISPs also assist content providers’ companies. He gave Comcast as a telling example: is it ethical to download TV shows using Comcast’s Internet service, or should one watch the shows using Comcast’s cable TV service?
He summarized the competing interests and ethical dilemmas of these stakeholders: end users produce and download infringing content; the content industry poisons P2P networks and launches denial-of-service attacks on P2P hosts; ISPs advertise access to movies, promising users that they will get that access, and then filter out BitTorrent traffic; and electronics manufacturers advertise the ripping and copying capabilities of their devices.
He left the audience with a set of intriguing questions. Is downloading content ethical or unethical? How do we decide what is ethical and unethical in Information Security? What are the criteria to be applied to make this decision? Are the decisions ever ethically justified? The bottom line is the unclear set of incentives.
The second presentation was on Responsibility for the Harm and Risk of Software Security Flaws by Cassio Goldschmidt.
He identified five stakeholders in analyzing the situation of software security flaws. The stakeholders were Independent Software Vendors (ISVs), Users, Government, Software Vulnerabilities and Security Researchers.
He cited Microsoft as an example of an ISV and noted how users always blame ISVs for faulty software. For software companies, the weakest links are software developers and software testers. ISVs are doing a lot to build secure software: they have started training classes teaching how to write secure code and how to secure every stage of the SDLC and test life cycle. But software by nature is vulnerable, no matter what. Users buy software because of its features; when a user is ready to buy software, there is no way he can tell whether that software is secure. Goldschmidt argued that managing software security is very difficult when one cannot compare whether one piece of software is more secure than another; hence we cannot expect users to buy and use “secure software.” There are many non-technical users who do not know the importance of software or system security. Users certainly have something to do with software vulnerabilities.
He talked about security researchers and vulnerability disclosures. There are conflicting interests and possible risks when security researchers disclose software vulnerabilities: before doing a full disclosure, one needs to think about how people and the media would take advantage of it. He cited the example of Microsoft’s “Patch Tuesday” and the “Exploit Wednesday” that follows. Sometimes companies buy products because of strategic partnerships, long-term relationships, money, etc.; the decision is not always based on security.
The government has a role to play in promoting software security, but if it enacts laws to enforce software security, there will be serious financial consequences for ISVs. For example, the software development process would become very expensive for start-ups. He concluded that enacting laws for software security is hard.
He summarized: software is dynamic, and people have yet to agree on what software even is. Some call it a product; some call it a service; some even call it free speech, because it has a language and an associated grammar. The problem of software security is very complex, and it needs attention and awareness.
The third presentation was from Aaron Massey on Behavioral Advertising Ethics.
Behavioral advertising, which targets custom-made advertisements to users based on their behavior profiles, uses technologies like cookies, web bugs and deep packet inspection. Massey opined that behavioral advertising ethics is interesting because it overlaps the advertising, privacy and technology domains. He gave examples of ethical dilemmas in each:
Advertising: Is it ethical to target ads based on a user’s profile or history? For example, a door-to-door salesman poses questions to customers to learn their preferences and suggests products based on the gathered information.
Privacy: For example, a Facebook program tracked user A’s online shopping history and displayed ads on the homepage of user B (a friend of user A) suggesting the product bought by user A. Is this a privacy breach for user A?
Technology: Where does the ethical value lie? Is it in the technology itself, in the use of the technology, or in its design? As an example, take a hammer: it can be used in a constructive or a destructive way, and its design does not restrict the purpose of its usage.
Considering these questions when building a behavioral advertising technology, is there a way we can make it secure without compromising the utility of the technology?
Melissa Dark summed up the panel presentations around the three keys of information security ethics: the stakeholders, their competing interests and tradeoffs, and the incentives and disincentives. She noted that incentives and disincentives reflect long-standing norms and expectations; we need to think about how those norms and expectations affect ethics, and how our mindsets affect the larger ethical debate. She then opened the floor for questions.
Questions and Discussions
Question 1: With online shopping and ethics, users usually do not have many options: either you buy the product or you leave it (for example, the Facebook scenario discussed earlier). In such situations, if you disagree with the ethics, how can you effect change? Most companies just have ethics externally imposed on them.
Melissa Dark: The masses can make use of consumerism and market forces. She mentioned that there are 45 state data breach disclosure laws, but no single federal law in the U.S. for handling data breach disclosures. Using the right language to talk about information security is very important.
Victor Raskin: Supported Melissa on that and said the language and framework used to talk about information security are very important.
Eugene Spafford: Awareness is equally important for software security. Our current mission should be to make security visible.
Audience: Informal collective action (for example, the blogosphere) is very powerful and can be used as a weapon against unethical actions.
Aaron Massey: Danger and the slippery slope are recurring connotations in ethics.
Question 2: What are the roles of users and government in realizing information security? In Australia, ISPs now restrict end users’ access to certain resources, because a recent law put liability on the ISPs to take corrective action; the end users are merely notified.
Nicolas Christin: There are similar laws on P2P networks. But again, managing the tradeoff between ISPs and users is critical: users can easily conceal their actions, and ISPs have to make a decision about restricting their users. Ethical and legal dilemmas arise because the legal scholars who write the laws usually have no technology background.
Eugene Spafford: It is hard to strike the right balance and create good laws.
Question 3: Educational institutions are not doing a good job of teaching how to write secure software. What should an institution do to provide good security education?
Melissa Dark: Public institutions have a lot of masters to serve; they take taxpayer money and are under many obligations. Yet the security education curriculum is being modified and improved constantly, and there has been tremendous growth in the past decade. There is still a lot more to be done for security education.
Audience: College education happens just once, but industry education and training need to be constantly revised.
Nicolas Christin: Should security education be industry driven or college driven? In college education, the main goal is to train students to get good jobs, and universities respond to market demands. Selling security, and security education, is hard. Knowing how to write secure code takes a lot of training and experience, and for a new graduate the most important thing is to secure a job, which need not be a secure-software-coding job.
Aaron Massey: Even before security education: what is security? How do you measure security? Should you concentrate on secure programming, testing or design?
Eugene Spafford: Purdue CERIAS is doing a great job of providing security education. But still, a lot of awareness is needed.
Question 4: What is ethical software or ethical coding? Does society have a role to play in making software ethical?
Aaron Massey: Society is addressing ethical questions. For example, the FTC is holding workshops on how to treat privacy online. There is no single solution yet.
Question 5: What are the best practices from other disciplines that can be adopted into infosec ethics? Do other disciplines have a generic framework?
Aaron Massey: Healthcare legislation such as HIPAA is evolving and is a good domain to look at for a generic framework; investigations are ongoing in this regard. A professional code of ethics applies to a profession, but the information security profession, its demands and its roles are not yet clearly defined.
Question 6: How does ethics depend on the perception of truth? How can advertising be a win-win situation if advertising is just informational and not manipulative? Does anyone read the privacy policies, where the information is present but not consumable?
Audience: An idea based on the agricultural domain: suppose companies identified themselves as data-collection-free and certified that they do not collect information about people; would that help?