Security Policy


A Security Policy defines what is permitted and what is denied on a system. There are two basic philosophies behind any security policy:

  1. Prohibitive, where everything that is not expressly permitted is denied.
  2. Permissive, where everything that is not expressly denied is permitted (a short sketch contrasting the two follows this list).
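
To make the contrast concrete, here is a minimal sketch (in Python, with made-up operation names and lists) of how the two philosophies might be expressed as access checks: the prohibitive check permits only operations on an explicit allow list, while the permissive check denies only operations on an explicit deny list.

    # Hypothetical illustration: the lists and operation names are invented,
    # not taken from any real policy.
    PERMITTED = {"read_own_files", "send_mail"}     # prohibitive policy: all that is allowed
    DENIED = {"read_other_files", "modify_logs"}    # permissive policy: all that is forbidden

    def prohibitive_check(operation):
        # Everything that is not expressly permitted is denied.
        return operation in PERMITTED

    def permissive_check(operation):
        # Everything that is not expressly denied is permitted.
        return operation not in DENIED

    # An operation the policy authors never anticipated is denied under the
    # first philosophy and allowed under the second.
    print(prohibitive_check("run_new_tool"))   # False
    print(permissive_check("run_new_tool"))    # True

Note that under the prohibitive check every new capability must be explicitly added to the policy before it can be used, which is exactly what makes that philosophy safer but less convenient.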

Generally, a site that is more paranoid (concerned) about security will take the first option. A policy will be in place which details exactly what operations are allowed on a system. Any operation that is not detailed in this policy document will be considered illegal on the system. As you can imagine, this lends itself well to a military mindset, and these types of policy are rare in civilian establishments.

More in tune with the spirit of computing is the second philosophy. Historically, computer users have tried to use a machine's potential to its fullest, even if this meant bending the rules slightly. Anything was acceptable as long as the job got done at the end of the day, nobody got hurt, and everyone had fun in the process.

Unfortunately, this philosophy does not work well in today's computing environments. Users do not always arrive at a system having learned to respect other users' privacy. The competitive spirit tends to cloud the ethical issues involved in meddling with another user's files. Indeed, outright sabotage may be the norm in some environments where competition for survival has gone to the extreme.

Most users will behave according to a set of "societal" rules. These rules encourage them to respect each other's privacy and work environments. Such a population of users has a working alliance based on trust, and trust is easy to subvert. A population of trusting users is easily invaded by a malicious user, intent on misusing any system in his or her path.

In both these examples, a well-known, documented, and enforced set of rules would maintain every user's privacy and integrity. The rules must be enforceable, and they must be seen to be enforced; there is no point in making rules that cannot be. As someone once said:

"justice must be done, and it must be seen to be done"

Elements of a System's Security

A computer system can be considered as a set of resources which are available for use by authorized users. A paper by Donn Parker [Parker:94] outlines six elements of security that must be addressed by a security administrator. It is worth evaluating any tool by determining how it addresses these six elements; a small checklist sketch follows the list.

  1. Availability - the system must be available for use when the users need it. Similarly, critical data must be available at all times.
  2. Utility - the system, and data on the system, must be useful for a purpose.
  3. Integrity - the system and its data must be complete, whole, and in a readable condition.
  4. Authenticity - the system must be able to verify the identity of users, and the users should be able to verify the identity of the system.
  5. Confidentiality - private data should be known only to the owner of the data, or to a chosen few with whom the owner shares the data.
  6. Possession - the owners of the system must be able to control it. Losing control of a system to a malicious user affects the security of the system for all other users.
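
As a rough illustration of this kind of evaluation, the following sketch (again in Python; the tool name and the notes are placeholders invented for the example) records, for each of the six elements, whether and how a given tool addresses it:

    # Hypothetical example: "example-checker" and its notes are placeholders,
    # not an assessment of any real tool.
    PARKER_ELEMENTS = ["availability", "utility", "integrity",
                       "authenticity", "confidentiality", "possession"]

    def evaluate_tool(name, notes):
        # Report which of the six elements the tool addresses, based on the notes supplied.
        print("Evaluation of", name)
        for element in PARKER_ELEMENTS:
            note = notes.get(element)
            status = "addressed" if note else "not addressed"
            print("  %-15s %s%s" % (element, status, " - " + note if note else ""))

    evaluate_tool("example-checker", {
        "integrity": "verifies file checksums against a stored baseline",
        "authenticity": "records which user ran each check",
    })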


Diego Zamboni
Last modified: Mon Sep 20 13:17:09 EST 1999