The Center for Education and Research in Information Assurance and Security (CERIAS)

Panel #2: Infosec Ethics (Symposium Summary)

Tuesday, March 30, 2010

Panel Members:

  • Nicolas Christin, Carnegie Mellon University
  • Cassio Goldschmidt, Symantec Corporation
  • Aaron Massey, North Carolina State University
  • Melissa Dark, Purdue University

Summary by Preeti Rao

Tuesday afternoon's panel discussion at the Eleventh Annual CERIAS Symposium, held March 30, 2010, was on Information Security Ethics. The panel consisted of four experts from academia and industry: Nicolas Christin from Carnegie Mellon University, Cassio Goldschmidt from Symantec Corporation, Aaron Massey from North Carolina State University, and Melissa Dark from Purdue University.

Melissa Dark introduced the panel and put forth the thought that information security ethics is a messy topic because it involves a variety of stakeholders. Identifying all the stakeholders and their competing interests, and then balancing those interests, is not easy, and there are a number of incentives and disincentives to be considered. Information security ethics is most interesting when discussed with respect to concrete scenarios, and that is what the panel chose to do.

The first presentation was from Nicolas Christin and he presented on Peer-to-Peer Networks, Incentives and Ethics.

He started off by talking about peer-to-peer (P2P) networks in general: their interdisciplinary nature, their benefits, and their costs. He noted that P2P traffic represents a very sizable load, with an estimated 30 to 70% of Internet traffic coming from P2P networks. P2P networks carry a bad reputation because they are used to disseminate copyrighted material, but they have numerous benefits too: software distributors save on infrastructure by distributing free and proprietary software to legitimate users through P2P networks, and P2P networks also provide censorship resilience.

Christin identified five stakeholders in P2P networks and discussed their ethical dilemmas and competing interests: end users, content providers or copyright holders, electronics manufacturers, software developers, and Internet service providers (ISPs). End users tend to download content for free, while content providers and copyright holders worry about unauthorized replication of their content. Electronics manufacturers benefit from the portability of digital media on P2P networks; devices like iPods would not have been as successful if people had not been able to get music for free or at very low cost. Software developers potentially benefit from increased P2P use. ISPs face particularly interesting ethical dilemmas: they benefit from the increased bandwidth usage of users downloading content, yet many of those users are infringing copyright by downloading content for free over the very bandwidth these ISPs provide. Sometimes ISPs also assist content-provider companies. He gave Comcast as a good example: is it ethical to download TV shows using Comcast's Internet service, or to watch those shows using Comcast's cable TV service?

He summarized the competing interests and ethical dilemmas of these stakeholders on P2P networks as follows:

  • End users produce and download infringing content.
  • The content industry poisons P2P networks.
  • The content industry launches denial-of-service attacks on P2P hosts.
  • ISPs advertise access to movies, promising users they will get that access, and then filter out BitTorrent traffic.
  • Electronics manufacturers advertise the ripping and copying capabilities of their devices.

He left the audience with a set of intriguing questions. Is downloading content ethical or unethical? How do we decide what is ethical and unethical in information security? What criteria should be applied to make this decision? Are the decisions ever ethically justified? The bottom line, he argued, is that the set of incentives involved is unclear.

The second presentation was on Responsibility for the Harm and Risk of Software Security Flaws by Cassio Goldschmidt.

He identified five stakeholders in analyzing responsibility for software security flaws: independent software vendors (ISVs), users, government, security researchers, and the software vulnerabilities themselves.

He used Microsoft as an example of an ISV and noted how users always blame ISVs for faulty software. For software companies, the weakest links are software developers and software testers. ISVs are doing a lot to build secure software: they have started training classes to teach how to write secure code and how to secure every stage of the development and test life cycles. But software is, by nature, vulnerable no matter what. Users buy software because of its features; when a user is ready to buy software, there is no way to tell whether that software is secure. Goldschmidt argued that managing software security is very difficult when one cannot compare which of two pieces of software is more secure; hence we cannot expect users to buy and use “secure software.” There are also many non-technical users who do not understand the importance of software or system security. Users, too, share some responsibility for software vulnerabilities.

He then talked about security researchers and vulnerability disclosure. There are conflicting interests and real risks when security researchers disclose software vulnerabilities; before making a full disclosure, one needs to think about how people and the media would take advantage of it. He cited the example of Microsoft's “Patch Tuesday” and the “Exploit Wednesday” that follows it. He also noted that companies sometimes buy products because of strategic partnerships, long-term relationships, money, and so on; the decision is not always based on security.

Government has a role to play in promoting software security, but if the government enacts laws to enforce it, ISVs will face serious financial consequences; for example, the software development process would become very expensive for start-ups. He concluded that legislating software security is hard.

He summarized: software is dynamic, and people have yet to agree on what software even is. Some call it a product, some call it a service, and some even call it free speech because it has a language and an associated grammar. The problem of software security is very complex, and it needs attention and awareness.

The third presentation was from Aaron Massey on Behavioral Advertising Ethics.

Behavioral advertising, which targets custom-made advertisements to users based on their behavioral profiles, uses technologies such as cookies, web bugs, and deep packet inspection (a minimal sketch of the tracking mechanism appears after the examples below). Massey argued that behavioral advertising ethics is interesting because it overlaps the advertising, privacy, and technology domains, and he gave examples of ethical dilemmas associated with each:

  • Advertising: Is it ethical to target ads based on a user's profile or history? For example, a door-to-door salesman asks customers questions to learn their preferences and then suggests products based on the gathered information.

  • Privacy: For example, a Facebook program tracked user A's online shopping history and displayed ads on the homepage of user B (a friend of user A) suggesting the product user A had bought. Is this a privacy breach for user A?

  • Technology: Where does the ethical value lie? Is it in the technology itself, in the use of the technology, or in its design? As an example, take a hammer: it can be used constructively or destructively, and its design does not restrict the purpose for which it is used.

Considering these questions when building behavioral advertising technology: is there a way to make it secure without compromising its utility?
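
As an aside on the mechanics Massey alluded to, a web bug is typically just an invisible image served from a tracker's domain. The sketch below is a hypothetical, minimal Flask handler (not anything presented by the panel) showing how serving a 1x1 pixel can set and read a tracking cookie, so that visits from different embedding pages can be linked into one behavioral profile.

```python
# Hypothetical sketch of a "web bug" tracker: a 1x1 transparent GIF served
# from a tracker domain, setting a persistent cookie so that requests coming
# from different embedding sites can be linked to the same visitor.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

# A 1x1 transparent GIF (43 bytes), the classic web-bug payload.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00"
         b",\x00\x00\x00\x00\x01\x00\x01\x00\x00"
         b"\x02\x02D\x01\x00;")

@app.route("/pixel.gif")
def pixel():
    # Reuse the visitor's tracking ID if the cookie is present; otherwise mint one.
    visitor_id = request.cookies.get("track_id") or uuid.uuid4().hex
    # The Referer header reveals which embedding page triggered the request,
    # so hits from unrelated sites can be tied to the same visitor ID.
    print(f"visitor={visitor_id} page={request.headers.get('Referer')}")
    resp = make_response(PIXEL)
    resp.headers["Content-Type"] = "image/gif"
    resp.set_cookie("track_id", visitor_id, max_age=60 * 60 * 24 * 365)
    return resp

if __name__ == "__main__":
    app.run(port=8080)
```

Any page that embeds an image tag pointing at this endpoint would, in this sketch, contribute to the visitor's cross-site profile; that is precisely where the privacy questions above arise.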

Melissa Dark summed up the panel presentations in terms of three keys for information security ethics: the stakeholders, their competing interests and trade-offs, and the incentives and disincentives. She noted that incentives and disincentives are rooted in long-standing norms and expectations, and that we need to think about how those norms and expectations affect ethics and how our mindsets affect the larger ethical debate. She then opened the floor for questions.

Questions and Discussions

Question 1: With online shopping, users often do not have many options: either you buy the product or you leave it (for example, the Facebook scenario discussed earlier). In such situations, if you disagree with a company's ethics, how can you effect change? Most companies only have ethics imposed on them externally.

Aaron Massey: Privacy policies are in place, and the FTC enforces them. If a company violates its privacy policy, you cannot sue the company as an individual, but you can file a complaint with the FTC, which will review the company's business practices and take any necessary action. Companies like Facebook and Google work with the FTC from the beginning to get things right.

Melissa Dark: The public can make use of consumerism and market forces. She mentioned that there are 45 state data breach disclosure laws but no single federal law in the US for handling data breach disclosure. Using the right language to talk about information security is very important.

Victor Raskin: Supported Melissa on that point, saying that the language and framework used to talk about information security are very important.

Eugene Spafford: Awareness is equally important for software security. Our current mission should be to make security visible.

Audience: Informal collective action (for example, the blogosphere) is very powerful and can be used as a weapon against unethical actions.

Aaron Massey: In ethics, that carries the connotation of danger and a slippery slope.

Question 2: What are the roles of users and government in realizing information security? In Australia, ISPs are now restricting end users' access to certain resources because a recent law put liability on the ISPs to take corrective action; the end users are merely notified.

Nicolas Christin: There are similar laws concerning P2P networks. But again, managing the trade-off between ISPs and users is critical: users can easily conceal their actions, and ISPs have to decide whether to restrict their users. Ethical and legal dilemmas arise because the legal scholars who write these laws usually have no technology background.

Eugene Spafford: It is hard to strike the right balance and create good laws.

Question 3: Educational institutions are not doing a good job of teaching how to write secure software. What should an institution do to provide good security education?

Melissa Dark: Public institutions have many masters to serve; they take taxpayer money and are under many obligations. Yet the security education curriculum is being modified and improved constantly, and there has been tremendous growth in the past decade. Still, much more remains to be done for security education.

Audience: College education happens only once, but industry education and training need to be constantly revised.

Nicolas Christin: Should security education be industry driven or college driven? In college education the main goal is to train students to get good jobs, and universities respond to market demands. Selling security and security education is hard. Knowing how to write secure code takes a lot of training and experience, and for a new graduate the most important thing is to secure a job, not necessarily a secure-coding job.

Aaron Massey: Even before security education comes the question: what is security? How do you measure it? Should you concentrate on secure programming, testing, or design?

Eugene Spafford: Purdue's CERIAS is doing a great job of providing security education, but a lot more awareness is still needed.

Question 4: What is ethical software or ethical coding? Does society have a role to play in making it ethical?

Aaron Massey: Society is addressing ethical questions. For example, the FTC is holding workshops on how to treat privacy online. There is no single solution yet.

Question 5: What are the best practices from other disciplines that can be adopted into infosec ethics? Do other disciplines have a generic framework?

Aaron Massey: Healthcare legislation such as HIPAA is still evolving, and a generic framework is a good thing to look for; investigations along these lines are ongoing. A professional code of ethics applies to a profession, but the information security profession, its demands, and its roles are not yet clearly defined.

Question 6: How does ethics depend on the perception of truth? How can advertising be a win-win situation if it is just informational and not manipulative? Does anyone read privacy policies, where the information is present but not readily consumable?

Aaron Massey: Research is being done, and people are coming up with “nutrition labels” for privacy policies: an alternative way of understanding a privacy policy instead of reading a lot of policy text.

Audience: An idea borrowed from the agricultural domain: suppose companies identify themselves as data-collection-free and certify that they do not collect information about people; would that help?

Nicolas Christin: There are companies that publish their privacy practices in machine-readable form so that you do not have to read the whole document. Companies are trying different approaches to making privacy policies readable.
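
The idea behind machine-readable privacy practices can be illustrated with a small, entirely hypothetical sketch: a site declares its data practices in a structured form, and a user agent compares them against the user's stated preferences instead of requiring the user to read the full policy text. The policy format and field names below are illustrative assumptions, not any real standard.

```python
# Hypothetical sketch: compare a site's declared data practices (published in
# a simple structured form) against a user's preferences. The format and field
# names are illustrative only, not a real machine-readable policy standard.
SITE_POLICY = {
    "data_collected": ["email", "browsing_history"],
    "purposes": ["advertising", "analytics"],
    "retention_days": 365,
    "shared_with_third_parties": True,
}

USER_PREFERENCES = {
    "never_collect": {"browsing_history"},
    "disallowed_purposes": {"advertising"},
    "max_retention_days": 90,
    "allow_third_party_sharing": False,
}

def policy_conflicts(policy, prefs):
    """Return human-readable conflicts between a policy and a user's preferences."""
    conflicts = []
    for item in policy["data_collected"]:
        if item in prefs["never_collect"]:
            conflicts.append(f"collects disallowed data: {item}")
    for purpose in policy["purposes"]:
        if purpose in prefs["disallowed_purposes"]:
            conflicts.append(f"uses data for disallowed purpose: {purpose}")
    if policy["retention_days"] > prefs["max_retention_days"]:
        conflicts.append("retains data longer than allowed")
    if policy["shared_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        conflicts.append("shares data with third parties")
    return conflicts

if __name__ == "__main__":
    for c in policy_conflicts(SITE_POLICY, USER_PREFERENCES):
        print("conflict:", c)
```

In this sketch the user never reads the policy text at all; the agent surfaces only the practices that conflict with the user's preferences, which is the convenience Christin described.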
