This material is based upon work supported by the National Science Foundation under Grant No. 0113725. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

By: Victor Akatsa, Rachel Borhauer, Carl Botan, Jan-Jo Chen, Gregory Conti, Donald Costello, Melissa Dark, Peg Ertmer, Ahmad Ghafarian, Steven Huss-Lederman, Regina S. Lightfoot, Richard Lovely, Dee Medley, Kamesh Namuduri, Srini Ramaswamy, Victor Raskin, Ramen Saheb, Frederick Schmitt, Eugene Spafford, Paul Thompson, Jan Wilms, Jesse Yu

The report is the product of a workshop entitled "Protecting Information in the Computer and Beyond." The workshop was organized by the Center for Education and Research in Information Assurance and Security at Purdue University and sponsored by the National Science Foundation, DUE project #0113725.

The goal of the project is to improve the instructional capacity of computer science and security faculty with respect to emerging sociological, ethical, linguistic, communication, and educational issues associated with information assurance and security. Information assurance and security is a ubiquitous and growing concern for many, if not all, administrative, industrial, academic, political, and economic entities and activities. As information technology advances reach more of the population, these security needs will expand. These entities are becoming increasingly reliant upon information systems that are designed, developed, and deployed based upon their technical capabilities, such as convenience, speed, and power. These technical capabilities are made possible through scientific research and experimentation. While scientific discovery may be empirical in nature, the value of what it discovers is not. This value is based on fundamental beliefs as they are conceptualized and formulated in the disciplines of political science, sociology, philosophy, linguistics, communication, education, and others. Technological discoveries, coupled with the pervasive need for information assurance and security, make addressing the implications of these values a critical priority. We face the immense challenge of educating the next generation of computer scientists, technicians, and users in security practices that are both effective and consonant with the values of the society in which they are to be used. Hence, the interdisciplinary focus adopted here is deemed critical at a time when technical capabilities have advanced more rapidly than consideration of their social effects.

To improve instructional capacity in these areas, this project focuses on faculty and curriculum development. We have developed an institute accommodating approximately 50 participants (25 faculty members per year for two years). A major outcome of the institute will be the development, by participating faculty with the help of subject matter experts, of a portfolio of educational materials for use in both their own and other educational programs. The resulting educational materials will be made available online to faculty outside the institute in various forms, including suggested model curricula, suggested course outlines, and classroom exercises at several levels. This report details the output of the first year of the institute.

Twenty-two educators participated in the first year of the institute (see Appendix A for a list of participants and their institutions). Faculty with research and teaching experience in information assurance and security and the disciplines of communication, education, linguistics, and philosophy presented theory and research in their respective fields. The workshop topics included:

  • The capacity to do great good or great harm in professional life, and mindfulness of these capabilities and responsibilities.
  • Relationships among the generation, flow, use, and control of information and the essence of organizations, as well as conceptions of self and identity.
  • Applications of ontology, connotation, intention, language use and abuse, deliberate concealment and accidental disclosure, encoding and decoding, and natural language processing to information security.
  • Natural language and meaning, meaning as information, and the implications of language and meaning for information security.
  • Communication as a social act, and the role and purpose of information technology in enabling such social acts.
  • How affecting communication has the potential to change the essence of organizations, as well as how we think about ethics.
  • Teaching through exposition and inquiry.
  • Curriculum development and integration.
Because there was an emphasis on curriculum development, sessions were planned for all participants to engage in analysis of the role and purpose of ethics, social and professional issues, and linguistics in the computer science curriculum. To create a framework for analysis, we used the ACM Computing Curricula 2001 Steelman Draft. We analyzed, identified, and drafted a set of topics that would be appropriate for integration throughout the undergraduate computer science curriculum. It is important to note that the goal of our workshop was not to reach consensus among the group about which social and professional issues topics should be included in a computer science curriculum. Nor were we attempting to prescribe how these topics should be included in the curriculum. Rather, we set out to generate numerous ideas about how to integrate ethical, social, and linguistic issues into the computer science curriculum. Because linguistics was the last topic on the workshop schedule, the group made less progress analyzing how to integrate linguistics into the undergraduate computer science curriculum. However, considerable progress was made in analyzing and identifying ways to integrate ethical and social issues into the undergraduate computer science curriculum. The following questions were used to facilitate the analysis process.

  1. What is ethically significant about computers and computing?
  2. What are ethical problems/issues in computing?
  3. What are social problems/issues in computing?
  4. What are the security and information protection issues in computing?
  5. How do we convey these issues (Items 1-4) to the next generation of software/system professionals so as to increase their awareness, establish a sense of responsibility, and, ideally, improve the quality of the products they produce?
In answering these questions, we aggregated the computing curriculum into three main areas: 1) Introductory Courses (Discrete Structures, Programming Fundamentals, Algorithms and Complexity), 2) Software and Programming (Programming Languages, Graphics and Visual Computing, Intelligent Systems, Software Engineering), and 3) Hardware and Systems (Architecture and Organization, Operating Systems, Human Computer Interaction, Information Management, Net-Centric Computing). Participants joined one of the three groups based on their teaching expertise and interest (see Appendix B for a list of participants by group). Each group generated a list of recommendations regarding how these topics could be integrated as modules throughout the undergraduate computer science curriculum (see Appendix C). The recommendations were then converted to a curriculum framework that includes learning objectives (Appendix D). The framework is organized by 1) cluster (introductory courses, software and programming, and hardware and systems), 2) issues within the cluster, and 3) learning objectives for each issue.

Based on the curriculum framework developed in the workshop, participants have signed up to author a number of instructional case studies and accompanying instructors' guides. The case studies will be 2-4 page instructional stories designed to help students uncover ethical and social issues in computer science. The instructors' guides will include facilitation questions and debriefing guidelines to help instructors use the case studies effectively. Case studies will be written and piloted in Spring 2002. It is anticipated that these case studies will be available for dissemination in the Fall of 2002.

Appendix A

Workshop Participants

Victor Akatsa Chicago State University
Rachel Borhauer United States Military Academy
Carl Botan Purdue University
Jan-Jo Chen Chicago State University
Gregory Conti United States Military Academy
Donald Costello University of Nebraska - Lincoln
Melissa Dark Purdue University
Peg Ertmer Purdue University
Ahmad Ghafarian North Georgia College & State University
Steven Huss-Lederman Beloit College
Regina S. Lightfoot Hood College
Richard Lovely John Jay College
Dee Medley Augusta State University
Kamesh Namuduri Clark Atlanta University
Srini Ramaswamy Tennessee Tech University
Victor Raskin Purdue University
Ramen Saheb Georgia Southwestern University
Frederick Schmitt College of Marin
Eugene Spafford Purdue University
Paul Thompson Purdue University
Jan Wilms Union University
Jesse Yu College of St. Elizabeth

Appendix B

Working Groups

Introductory Courses
Victor Akatsa Chicago State University
Rachel Borhauer United States Military Academy
Donald Costello University of Nebraska - Lincoln
Steven Huss-Lederman Beloit College
Dee Medley Augusta State University
Frederick Schmitt College of Marin

Software and Programming
Jan-Jo Chen Chicago State University
Gregory Conti United States Military Academy
Ahmad Ghafarian North Georgia College & State University
Kamesh Namuduri Clark Atlanta University
Srini Ramaswamy Tennessee Tech University
Jesse Yu College of St. Elizabeth

Hardware and Systems
Regina S. Lightfoot Hood College
Richard Lovely John Jay College
Ramen Saheb Georgia Southwestern University
Jan Wilms Union University

The following individuals circulated throughout the groups:
Carl Botan Purdue University
Melissa Dark Purdue University
Peg Ertmer Purdue University
Victor Raskin Purdue University
Paul Thompson Purdue University

Appendix C

Introductory Courses (Discrete Structures, Programming Fundamentals, Algorithms and Complexity).

What is ethically significant?
  1. Make students aware of the ethical aspects of computing.
    1. Awareness of the role of ethics in decision making and professional practice (case study).
    2. Awareness of the consequences of ethical and unethical behavior.
    3. Awareness of the ACM code of ethics.
  2. Students should be able to identify the importance, complexity, consequences, and obligations of computing professionals.
    1. Awareness of their competing roles with respect to their employers, themselves, society, and their profession.
    2. Identify stakeholders in an issue and our obligations to them.
    3. Identify ethical choices.
    4. Articulate ethical tradeoffs in a technical situation (case study).
  3. Students need to be aware of the impact of computers in society and the wide effects outside their own profession.
    1. Describe positive and negative ways in which computers alter the social interactions of people.
    2. Identify global impacts on diverse cultures.
    3. Discuss implications of differences in access to and utilization of computing resources.
  4. Professional obligations in producing a work product. Explain the role of:
    1. problem set definition and system design.
    2. honesty in analysis and feasibility.
    3. identification of risks and liabilities.
    4. testing in professional practice.
    5. team and individual responsibilities.
  5. Identify the concrete actions that professionals can take when faced with ethical dilemmas.
What are ethical problems/issues?
  1. Writing obscure or undocumented code (an illustrative sketch follows this list).
  2. Conflict of interest/dilemmas between individual needs, organizational goals, and societal implications.
  3. Vulnerabilities of advanced technological societies.
  4. When, why, and through what process to release software systems.
  5. Scope and meaning of correctness in software.
  6. Building reasonable safeguards.
  7. Reasonability of personal assessment.
  8. Addressing personal responsibility within team environments.
  9. Intellectual property.
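To make item 1 above ("Writing obscure or undocumented code") concrete for instructors, the following short Java sketch is one hypothetical way to contrast obscure and documented code. The InterestCalc class, its method names, and the fixed 7% rate are invented for illustration and are not drawn from the workshop materials.

```java
// Hypothetical classroom illustration: two functionally identical methods.
public class InterestCalc {

    // Obscure: cryptic names, a magic number, no statement of intent or units.
    static double f(double a, int b) {
        return a * Math.pow(1.07, b);
    }

    /**
     * Documented: returns the balance after compounding annually at a fixed 7% rate.
     * @param principal starting balance in dollars
     * @param years     whole years of compounding (must be >= 0)
     */
    static double balanceAfterYears(double principal, int years) {
        final double ANNUAL_RATE = 0.07;
        if (years < 0) {
            throw new IllegalArgumentException("years must be non-negative");
        }
        return principal * Math.pow(1.0 + ANNUAL_RATE, years);
    }

    public static void main(String[] args) {
        System.out.println(f(1000.0, 10));                  // what does this mean?
        System.out.println(balanceAfterYears(1000.0, 10));  // intent is clear
    }
}
```

In a classroom exercise, students might be asked which version they would be willing to maintain, and what obligations the author of the first version owes to later readers of the code.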
Software and Programming (Programming Languages, Graphics and Visual Computing, Intelligent Systems, Software Engineering).

What are ethical and social issues in software and programming?
  1. The need for accountability.
    1. Students will take ownership and be able to stand up/speak out, give credit/acknowledgement to other people's work, protect property (do not facilitate cheating; do not let others have easy access to your work), and come up with a group name (do not communicate using members' names in the group; use a code name to protect your group's identity).
    2. Students will be able to specify strengths and weaknesses in relevant professional codes as expressions of professionalism and guides to decision making.
    3. Students will analyze a case study from various perspectives (individual, corporation, and ideal, i.e., a code of ethics treated as a maximum code), testing students' ability to use the right model and/or tool for the job.
    4. Analyze examples of code that demonstrate the infeasibility of complete testing; give practical measures that can minimize risks/errors in code (for example, boundary value analysis and equivalence class partitioning); and develop test plans or test cases (counterexamples showing that the code handles errors or fails gracefully). A hypothetical classroom sketch follows this list.
    5. Correctness, reliability, and safety: provide a case study for reliability (for example, a vending machine that must work all the time, not just sometimes), and have students realize that software can be harmful (for example, medical imaging software).
  2. The social context and the digital divide. Social responsibility.
    1. Interpret the social context of a particular implementation. Perform analysis of the social impact of software. Write a statement role-playing various users (e.g., wheelchair-bound, non-English-speaking, color-blind). Require the use of readme files, help files, and documented code, and ensure virus-free software.
    2. Identify assumptions and values embedded in a particular design. Make sure you have readme files, and note the libraries, platforms, and files you use in your program. Digitally sign software to ensure the authenticity of code. Require registration or confirm ownership via a key.
  3. What they do now leads to the future.
    1. Examine historical examples of software risk: their context, origin, intended benefits versus harm, and unintended consequences (e.g., Three Mile Island; the Mars Climate Orbiter, lost because metric and English units were mixed).
    2. Discuss implications of software complexity with regard to future trends (e.g., Windows 2000 versus Linux; reducing complexity). Students need to write clean code; for example, variables in Java should not be declared haphazardly far from where they are used (a scoping sketch follows this list). Use Windows debugging code as an example of clean code.
    3. Examples from industry: observe change; software should be extensible (e.g., a trojaned system; government regulation changes; you write software that works for 5,000 employees, the company grows to 6,000 employees, and your software should not collapse).
  4. Plagiarism.
    1. Link to institutional policy or school code and enforce it.
    2. Distinguish between copyrights, patents, and trade secrets (write a disclaimer for a dot-com website; read the Java copyright from Sun, the GNU copyright, and Sun/Palm nondisclosure agreements; visit a local IBM office or other company to see an example of nondisclosure).
    3. Debate software piracy from the perspective of multiple stakeholders (MP3s and the Napster discussion, DVDs, software that facilitates or checks for copyright violation, DVD-unlocking software, overseas copyright policy, where it sometimes does not even exist, and international practices on copying books).
  5. Team building - the individual vs. the team.
    1. Peer evaluation, responsibilities, commitment to the team plan, and meetings.
  6. Sensitivity to social/culture/gender issues.
    1. Objectionable material, publicize policies (if they even exist).
  7. Risks and liabilities - benefit/harm analysis.
    1. Discuss time-critical and safety-critical software. Review examples of time-critical and safety-critical software (for example, Ada as used by the military).
    2. Articulate viewpoints on technology, the role of technology to better society within the context of benefits, harm, and competing interests. (Bring a computer center director to talk about their policies).
  8. Appropriateness of projects.
    1. Define appropriate (for what and for whom), e.g., a program that checks the validity of passwords: it is appropriate for systems administrators to write and run, but is it appropriate for students?
    2. Dangerous software should be developed and tested in a controlled environment, for example a flight control system or a hospital heart-monitoring system, so that the developer begins to understand the consequences.
  9. Intellectual freedom vs. appropriate behavior.
  10. Privacy.
    1. Summarize the legal bases for the right to privacy and freedom of expression in one's own nation and how those concepts vary from country to country.
    2. Describe current computer-based threats to privacy.
    3. Explain how the Internet may change the historical balance in protecting freedom of expression (spam software; the Putnam Pit newspaper, a free newsletter whose publisher wanted access to government websites' cookies to see what websites government employees were visiting, and the court said no; cookie use).
  11. Surveillance.
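Item 1.4 above names boundary value analysis, equivalence class partitioning, and test cases as practical measures. The following Java sketch is one hypothetical classroom illustration of those techniques; the letterGrade method, its cutoffs, and the chosen test values are assumptions made for the example rather than anything prescribed by the workshop.

```java
// Hypothetical sketch of boundary value analysis and equivalence class testing.
public class GradeBoundaryDemo {

    // Maps a score in [0, 100] to a letter grade; rejects anything else.
    static char letterGrade(int score) {
        if (score < 0 || score > 100) {
            throw new IllegalArgumentException("score out of range: " + score);
        }
        if (score >= 90) return 'A';
        if (score >= 80) return 'B';
        if (score >= 70) return 'C';
        if (score >= 60) return 'D';
        return 'F';
    }

    static void check(int score, char expected) {
        char actual = letterGrade(score);
        System.out.println("score " + score + " -> " + actual
                + (actual == expected ? " (ok)" : " (FAIL, expected " + expected + ")"));
    }

    public static void main(String[] args) {
        // Boundary values: scores on and immediately around each cutoff.
        int[] boundaries = {0, 59, 60, 69, 70, 79, 80, 89, 90, 100};
        char[] expected  = {'F', 'F', 'D', 'D', 'C', 'C', 'B', 'B', 'A', 'A'};
        for (int i = 0; i < boundaries.length; i++) {
            check(boundaries[i], expected[i]);
        }

        // Equivalence classes: one representative from each valid partition.
        check(30, 'F');
        check(65, 'D');
        check(75, 'C');
        check(85, 'B');
        check(95, 'A');

        // Invalid class: the method should fail cleanly, not silently.
        try {
            letterGrade(101);
            System.out.println("FAIL: out-of-range score was accepted");
        } catch (IllegalArgumentException e) {
            System.out.println("out-of-range score rejected (ok)");
        }
    }
}
```

The point of such an exercise is not the grading function itself but the discipline of probing each boundary and each equivalence class, including the invalid ones, before declaring the code correct.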
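Item 3.2 above also asks students to write clean code, with variable declarations as the example. The fragment below is a minimal before/after sketch, assuming a simple sum-of-squares task invented for illustration; it contrasts declaring variables far from their use with keeping each declaration in the narrowest scope.

```java
// Hypothetical before/after sketch of variable scoping in Java.
import java.util.Arrays;
import java.util.List;

public class ScopeDemo {

    // "Before": everything declared up front, far from its single use.
    static int sumOfSquaresBefore(List<Integer> values) {
        int total;
        int square;
        int v;
        total = 0;
        for (int i = 0; i < values.size(); i++) {
            v = values.get(i);
            square = v * v;
            total += square;
        }
        return total;
    }

    // "After": each variable lives only where it is needed, so a reader
    // can see its whole lifetime at a glance.
    static int sumOfSquaresAfter(List<Integer> values) {
        int total = 0;
        for (int v : values) {
            total += v * v;
        }
        return total;
    }

    public static void main(String[] args) {
        List<Integer> sample = Arrays.asList(1, 2, 3, 4);
        System.out.println(sumOfSquaresBefore(sample)); // 30
        System.out.println(sumOfSquaresAfter(sample));  // 30
    }
}
```

Both methods return the same result; the design point being illustrated is purely about readability and maintainability.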
Hardware and Systems (Architecture and Organization, Operating Systems, Human Computer Interaction, Information Management, Net-Centric Computing).

What are ethical and social issues in hardware and systems?
  1. Hardware.
    1.1 Testing.
      1. identify accountability issues in testing hardware.
      2. identify actions that professionals can take when testing hardware.
      3. identify ethical choices in hardware testing.
    1.2 Design.
      1. identify the need for accountability in hardware design.
      2. identify the ways that computers alter the social relations of people specifically in the context of hardware design.
      3. identify actions that professionals can take during hardware design.
      4. identify ethical choices in hardware design.
      5. identify stakeholders and our obligations with relation to hardware design.
      6. discuss the implications of hardware design complexity and future trends.
      7. discuss privacy implications with regard to hardware design.
    1.3 Administration.
      1. identify actions that professionals can take in hardware administration.
      2. identify ethical choices in hardware administration.
      3. identify stakeholders in hardware administration issues and our obligations to them.
  2. Databases.
    2.1 Testing.
      1. identify accountability issues in testing databases.
      2. identify actions that professionals can take when testing databases.
      3. identify ethical choices in database testing.
    2.2 Design.
      1. identify the need for accountability in database design.
      2. identify the ways that computers alter the social relations of people specifically in the context of database design.
      3. identify actions that professionals can take during database design and ethical choices in database design.
      4. identify stakeholders and our obligations with relation to database design.
      5. discuss the implications of database design complexity and future trends.
      6. discuss privacy implications with regard to database design.
    2.3 Administration.
      1. identify the need for accountability in database administration.
      2. identify ways in which database administration can alter the social relations of people.
      3. identify actions that professionals can take in database administration and ethical choices in database administration.
      4. identify stakeholders in database administration issues and our obligations to them.
      5. identify implications of administering complex databases with regard to future trends, and discuss privacy implications of administering databases.
  3. Human Computer Interaction.
    3.1 Testing.
      1. identify accountability issues in testing human computer interaction.
      2. identify actions that professionals can take with regard to testing human computer interaction.
      3. identify ethical choices in testing human computer interaction.
    3.2 Design.
      1. identify the need for accountability in the design of human computer interaction.
      2. identify the ways that computers alter the social relations of people, specifically in the context of designing human computer interaction.
      3. identify actions that professionals can take with regard to designing human computer interaction.
      4. identify ethical choices in designing human computer interaction.
      5. identify stakeholders and our obligations with regard to designing human computer interaction.
      6. discuss the design implications of human computer interaction complexity and future trends.
  4. Networks and Infrastructures.
    4.1 Testing.
      1. identify accountability issues with regard to testing networks and infrastructures.
      2. identify actions that professionals can take with regard to testing networks and infrastructures.
      3. identify ethical choices in testing networks and infrastructures.
    4.2 Design.
      1. identify the need for accountability in network/infrastructure design.
      2. identify the ways that network/infrastructure design can alter the social relations of people.
      3. identify actions that professionals can take during network/infrastructure design.
      4. identify ethical choices in network/infrastructure design.
      5. identify stakeholders and our obligations with relation to network/infrastructure design.
      6. discuss the implications of network/infrastructure design complexity and future trends.
      7. discuss privacy implications with regard to network/infrastructure design.
    4.3 Administration.
      1. identify the need for accountability in network / infrastructure administration.
      2. identify ways in which network/infrastructure administration can alter the social relations of people.
      3. identify actions that professionals can take in network/infrastructure administration.
      4. identify ethical choices in network/infrastructure administration.
      5. identify stakeholders in network/infrastructure administration issues and our obligations to them.
      6. identify implications of administering complex networks/infrastructures with regard to future trends.
  5. Open versus Closed Systems.
    5.1 Design.
      1. identify actions that professionals can take in designing open versus closed systems.
      2. identify ethical choices in designing open versus closed systems.
      3. identify stakeholders and our obligations to them when designing open versus closed systems.
      4. identify the implications of system complexity with regard to future trends when designing open versus closed systems.
  6. Architecture.
    6.1 Testing.
      1. identify accountability issues with regard to testing architectures.
      2. identify actions that professionals can take with regard to testing architectures.
      3. identify ethical choices in testing architectures.
    6.2 Design.
      1. identify actions that professionals can take in designing architectures.
      2. identify ethical choices in designing architectures.
      3. identify stakeholders and our obligations to them when designing architectures.
      4. identify the implications of system complexity with regard to future trends when designing architectures.
  7. Operating Systems.
    7.1 Testing.
      1. identify accountability issues in testing operating systems.
      2. identify actions that professionals can take with regard to testing operating systems.
      3. identify ethical choices in testing operating systems.
    7.2 Design.
      1. identify the need for accountability in the design of operating systems.
      2. identify the ways that computers alter the social relations of people specifically in the context of designing operating systems.
      3. identify actions that professionals can take with regard to designing operating systems.
      4. identify ethical choices in designing operating systems.
      5. identify stakeholders and our obligations with regard to designing operating systems.
      6. discuss the design implications of operating system complexity and future trends.
    7.3 Administration.
      1. identify the need for accountability when administering operating systems.
      2. identify actions that professionals can take when administering operating systems.
      3. identify privacy implications that should be considered when administering operating systems.

Appendix D

The framework below lists, for each cluster, the issues and their learning objectives, together with the authors who signed up to develop materials for each objective.

Introductory Courses (IC)
  1. Make students aware of the ethical aspects of computing.
    a. identify the role of ethics in decision making and professional practice (Rachel Borhauer, Srini Ramaswamy)
    b. identify the consequences of ethical and unethical behavior (Rachel Borhauer, Srini Ramaswamy)
    c. explain the ACM code of ethics (Ahmad Ghafarian, Jesse Yu)
  2. Identify the importance, complexity, consequences, and obligations of computing professionals.
    a. awareness of their competing roles with respect to employers, themselves, society, and their profession (Regina Lightfoot)
    b. identify stakeholders in an issue and our obligations to them (Regina Lightfoot)
    c. identify ethical choices
    d. articulate ethical tradeoffs in a technical situation (Srini Ramaswamy)
  3. The impact of computers in society and the wide effects outside computing professions.
    a. describe positive and negative ways in which computers alter the social interactions of people (Victor Akatsa, Srini Ramaswamy)
    b. identify global impacts on diverse cultures (Victor Akatsa)
    c. discuss implications of differences in access to and utilization of computing resources (Victor Akatsa)
  4. Professional obligations in producing a work product.
    a. explain the role of problem set definition and system design (Don Costello)
    b. explain the role of honesty in analysis and feasibility (Rachel Borhauer, Dee Medley)
    c. explain the role of identifying risks and liabilities (Melissa Dark)
    d. explain the role of testing in professional practice (Jan Wilms)
    e. explain the role of team and individual responsibilities (Ahmad Ghafarian)
    f. identify the concrete actions that professionals can take when faced with ethical dilemmas


Software and Programming (S/P)
  1. The need for accountability.
    a. take ownership and be able to stand up/speak out, give credit/acknowledgement to other people's work, and protect property (Srini Ramaswamy, Kamesh Namuduri)
    b. specify strengths and weaknesses in relevant professional codes as expressions of professionalism and guides to decision making (Kamesh Namuduri)
    c. analyze accountability from various perspectives (individual, corporation, society, user, etc.) (Kamesh Namuduri, Srini Ramaswamy, Melissa Dark)
    d. analyze examples of code that demonstrate the infeasibility of complete testing, give practical measures that can minimize risks/errors in code, and develop test plans or test cases (Ahmad Ghafarian, Kamesh Namuduri)
    e. discern the importance of correctness, reliability, and safety in programming (Kamesh Namuduri)
  2. Social context, social responsibility, and the digital divide.
    a. interpret the social context of a particular implementation
    b. perform analysis of the social impact of software
    c. write a statement role-playing various users (Greg Conti)
    d. require the use of readme files, help files, and documented code, and ensure virus-free software
    e. identify assumptions and values embedded in a particular design
  3. Future implications of current practice.
    a. examine historical examples of software risk, their context, origin, intended benefits versus harm, and unintended consequences (Melissa Dark)
    b. discuss implications of software complexity with regard to future trends (Kamesh Namuduri)
  4. Plagiarism.
    a. distinguish between copyrights, patents, and trade secrets (Jesse Yu)
    b. debate software piracy from the perspective of multiple stakeholders
  5. Teamwork.
    a. students should be aware of the need to work effectively within a team (Ahmad Ghafarian, Srini Ramaswamy)
  6. Sensitivity to social/cultural/gender issues.
    a. identify objectionable material
  7. Risks and liabilities.
    a. discuss time-critical and safety-critical software (Srini Ramaswamy, Don Costello)
    b. articulate viewpoints on technology and the role of technology to better society within the context of benefits, harm, and competing interests (Srini Ramaswamy)
  8. Appropriateness of student projects.
    a. define appropriate (for what and for whom) (Greg Conti)
    b. discuss intellectual freedom and appropriate behavior
  9. Privacy.
    a. summarize the legal basis for the right to privacy and freedom of expression in one's own nation (Jennifer Redmon)
    b. discuss how privacy concepts vary from country to country (Jennifer Redmon)
    c. explain how the Internet may change the historical balance in protecting freedom of expression (Dee Medley)


Hardware and Systems
Hardware: Testing
  1. identify accountability issues in testing hardware,
  2. identify actions that professionals can take when testing hardware,
  3. identify ethical choices in hardware testing
Design
  1. identify the need for accountability in hardware design,
  2. identify the ways that computers alter the social relations of people specifically in the context of hardware design,
  3. identify actions that professionals can take during hardware design,
  4. identify ethical choices in hardware design,
  5. identify stakeholders and our obligations with relation to hardware design,
  6. discuss the implications of hardware design complexity and future trends,
  7. discuss privacy implications with regard to hardware design.
Administration
  1. identify actions that professionals can take in hardware administration,
  2. identify ethical choices in hardware administration,
  3. identify stakeholders in hardware administration issues and our obligations to them.
Authors: Don Costello, Jan Wilms
Databases: Testing
  1. identify accountability issues in testing databases,
  2. identify actions that professionals can take when testing databases,
  3. identify ethical choices in database testing.
Design
  1. identify the need for accountability in database design,
  2. identify the ways that computers alter the social relations of people specifically in the context of database design,
  3. identify actions that professionals can take during database design and ethical choices in database design,
  4. identify stakeholders and our obligations with relation to database design,
  5. discuss the implications of database design complexity and future trends,
  6. discuss privacy implications with regard to database design.
Administration
  1. identify the need for accountability in database administration,
  2. identify ways in which database administration can alter the social relations of people,
  3. identify actions that professionals can take in database administration and ethical choices in database administration,
  4. identify stakeholders in database administration issues and our obligations to them,
  5. identify implications of administering complex databases with regard to future trends, and discuss privacy implications of administering databases.
Author: Regina Lightfoot
Human Computer Interaction: Testing
  1. identify accountability issues in testing human computer interaction,
  2. identify actions that professionals can take with regard to testing human computer interaction,
  3. identify ethical choices in testing human computer interaction.
Design
  1. identify the need for accountability in the design of human computer interaction,
  2. identify the ways that computers alter the social relations of people specifically in the context of designing human computer interaction,
  3. identify actions that professionals can take with regard to designing human computer interaction,
  4. identify ethical choices in designing human computer interaction,
  5. identify stakeholders and our obligations with regard to designing human computer interaction,
  6. discuss the design implications of human computer interaction complexity and future trends.
Author: Don Costello
Networks and Infrastructures: Testing
  1. identify accountability issues with regard to testing networks and infrastructures,
  2. identify actions that professionals can take with regard to testing networks and infrastructures,
  3. identify ethical choices in testing networks and infrastructures.
Design
  1. identify the need for accountability in network/infrastructure design,
  2. identify the ways that network/infrastructure design can alter the social relations of people,
  3. identify actions that professionals can take during network/infrastructure design,
  4. identify ethical choices in network/infrastructure design,
  5. identify stakeholders and our obligations with relation to network/infrastructure design,
  6. discuss the implications of network/infrastructure design complexity and future trends,
  7. discuss privacy implications with regard to network/infrastructure design.
Administration
  1. identify the need for accountability in network / infrastructure administration,
  2. identify ways in which network/infrastructure administration can alter the social relations of people,
  3. identify actions that professionals can take in network/infrastructure administration,
  4. identify ethical choices in network/infrastructure administration,
  5. identify stakeholders in network/infrastructure administration issues and our obligations to them,
  6. identify implications of administering complex networks/infrastructures with regard to future trends.
Authors: Ramen Saheb, Don Costello
Open versus Closed Systems: Design
  1. identify actions that professionals can take in designing open versus closed systems,
  2. identify ethical choices in designing open versus closed systems,
  3. identify stakeholders and our obligations to them when designing open versus closed systems,
  4. identify the implications of system complexity with regard to future trends when designing open versus closed systems.
Author: Ramen Saheb
Architecture: Testing
  1. identify accountability issues with regard to testing architectures,
  2. identify actions that professionals can take with regard to testing architectures,
  3. identify ethical choices in testing architectures.
Design
  1. identify actions that professionals can take in designing architectures,
  2. identify ethical choices in designing architectures,
  3. identify stakeholders and our obligations to them when designing architectures,
  4. identify the implications of system complexity with regard to future trends when designing architectures.
 
Operating Systems: Testing
  1. identify accountability issues in testing operating systems,
  2. identify actions that professionals can take with regard to testing operating systems,
  3. identify ethical choices in testing operating systems.
Design
  1. identify the need for accountability in the design of operating systems,
  2. identify the ways that computers alter the social relations of people specifically in the context of designing operating systems,
  3. identify actions that professionals can take with regard to designing operating systems,
  4. identify ethical choices in designing operating systems,
  5. identify stakeholders and our obligations with regard to designing operating systems,
  6. discuss the design implications of operating system complexity and future trends.
Administration
  1. identify the need for accountability when administering operating systems
  2. identify actions that professionals can take when administering operating systems
  3. identify privacy implications that should be considered when administering operating systems.