The Center for Education and Research in Information Assurance and Security (CERIAS)

CERIAS Blog

Speculations on Teaching Secure Programming

I have taught secure programming for several years, and along the way I developed a world view of how teaching it differs from teaching other subjects. Some of the following are inferences from uncontrolled observations; others are simply opinions or mere speculation. I present this world view here, hoping that it will generate some discussion and that flaws in it will be corrected.

Like other fields, software security can be studied from several different angles, such as secure software engineering, secure coding at a technical level, architecture, procurement, configuration, and deployment.  As in other fields, effective software security teaching depends on the audience—its needs, its current state and capabilities, and its potential for learning.  Learning techniques such as repetition are useful, and students can ultimately benefit from organized, abstracted thought on the subject.  However, teaching software security is different from teaching other subjects because it is not just teaching facts (data), “how to” (skills), and theories and models (knowledge), but also a mindset and the capability to repeatably derive and achieve a form of wisdom in varied, even new situations.  It’s not just a question of the technologies used or the degree of technological acumen, but of behavioral psychology, economics, motivation, and humor.

Behavioral Psychology—Security is somewhat of a habit, an attitude, a way of thinking and of life.  You won’t become a secure programmer just because you learned of a new vulnerability, exploit, or security trick today, although it may help and have a cumulative effect.  Attacking requires opportunistic, lateral, experimental thinking with exciting rewards upon success.  It somewhat resembles the capability to create humor by taking something out of the context for which it was created and subjecting it to new, unexpected conditions.  I am also sometimes surprised by the amount of perseverance and dedication attackers demonstrate.  Defending requires vigilance and systematic, careful, most often tedious labor and thought, which are rewarded slowly by “uptime” or long-term peace.  The two are different, yet understanding one is a great advantage to the other.  To excel at both simultaneously is difficult, requires practice, and is probably not achievable by everyone.  I note that undergraduate computer science rewards passing tests, including, sometimes, software tests provided with assignments; these are closer to immediate rewards upon success or immediate failure, with no long-term consequences or requirements.  On top of that, assignments are most often evaluated solely on achieving functionality, not on preventing unintended side effects or disallowing other behaviors.  I suspect that this produces graduates with learned behaviors unfavorable to security.  The problem with behaviors is that you may know better than what you’re doing, but you do it anyway.  Economics may provide some limited justification.

Economics—Many people know that doing things securely is “better,” and that they ought to, but it costs.  People are “naturally optimizing” (lazy)—they won’t do something if there’s no perceived need for it, or if they can delay paying the costs or ultimately pay only the necessary ones (“late security,” as in “late binding”).  This is where patches stand;  vulnerability disclosures and patches are remotely possible costs to be weighed against the perceived opportunity costs of delays and additional production expenses.  Isolated occurrences of exploits and vulnerability disclosures may be dismissed as bad luck, accidents, or something that happens only to other projects.  Intense scrutiny of some projects may be necessary to demonstrate to a product’s team that its software engineering methods and security results are flawed.  There is plenty of evidence that these attempts at evading costs don’t work well and often backfire.
Even when change is desired, students can graduate with negligible knowledge of the best practices presented in the 2007 SOAR on Software Security Assurance.  Computer science programs are strained by the large amount of knowledge that needs to be taught; perhaps software engineering should be spun off, just as electrical engineering was spun off from physics.  Companies that need software engineers, and ultimately our economy, would be better served by that than by getting students who were just told to “go and create a program that does this and that.”  While I was revising these thoughts, “CrossTalk” published some opinions on the use of Java for teaching computer science, under a title lamenting “where are the software engineers of tomorrow?”  I think that there is simply not enough teaching time to educate people to become both good computer scientists and good software engineers, and the result satisfies neither need.  Even if new departments aren’t created, two different degrees should probably be offered.

Motivation—For many people, attempts to teach software security go in one ear and out the other unless consequences are demonstrated.  Most people need to be shown the exploits that a flaw enables before they believe it is a serious flaw.  This resembles how a kid may ignore warnings about burns and hot things until a burn is experienced.  Even as teenagers and adults, every summer some people have to re-learn why sunscreen is needed, while the possibility of skin cancer remains too remote a consideration for others.  So, security teaching needs to contain a lot of anecdotes and examples of bad things that happened.  I like to show real code in class and analyze the mistakes that were made;  that approach seems to get the interest of undergraduates.  At a later stage, this evolves from “security prevents bad things” to “with security you can do this safely.”  Discussing current events in class keeps secure programming topical and can make it even more interesting and exciting.
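
As an illustration of the kind of mistake such an in-class analysis covers, consider a minimal, hypothetical C fragment (it is not taken from any real product); the flaw and the safer alternative are the point of the discussion:

    /* Hypothetical example: the program "works" for expected input, but a
       long command-line argument overflows the fixed-size buffer, a classic
       unchecked-copy flaw (CWE-120). */
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char *argv[])
    {
        char name[32];

        if (argc < 2)
            return 1;

        strcpy(name, argv[1]);   /* flaw: no bounds check on attacker-controlled input */

        /* A safer version bounds the copy and guarantees termination:
           snprintf(name, sizeof(name), "%s", argv[1]);                  */

        printf("Hello, %s\n", name);
        return 0;
    }

A functionality-only grading scheme, as described above, would give this program full marks: it does what was asked, and the unintended behavior appears only with input the grader never tried.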

Repetition—Repeated experiences reinforce learning.  Security-focused code scanners repeat and reinforce good coding practice, as long as the warnings are not allowed to be ignored.  Code audits reinforce the message, this time coming from peers, and so add peer pressure and the risk of shame.  They are great in a company, but I am ambivalent about using code audits by other students, due to the risk of humiliation—humiliation is not appropriate while learning, for many reasons.  Also, the students doing the audit are, almost by definition, not competent yet, and I’m not sure how I would grade the activity.  Code audits by the teacher do not scale well.  That leaves scanners.  I have been looking into this and have tried some commercial code scanners, but what I’ve seen so far are systems that are unmanageable for classroom use and that don’t catch some of the flaws I wish they would.

Organization and abstraction—Whereas showing exploits and attacks is good for beginners, more advanced students will want to move away from blacklists of things not to do (e.g., “Deadly Sins”) toward good practices, assurance, and formal methods.  I made a presentation on the subject almost two years ago.

In conclusion, teaching secure programming differs from teaching typical subjects in how the knowledge is used:  it needs to change behaviors and attitudes, and it benefits from different tools and activities.  It is also interesting in how it connects with morality.  While these characteristics aren’t unique in the entire body of human knowledge, they present interesting challenges.

ReAssure Version 1.01 Released

As the saying goes, version 1.0 always has bugs, and ReAssure was no exception.  Version 1.01 is a bug-fix release for broken links and the like;  there were no security issues.  Download the source code in Ruby here, or try it there.  ReAssure is the virtualization (VMware and UML) experimental testbed built for containment and networking security experiments.  Two computers are reserved for creating and updating images, and of course you can use VMware appliances.  The other 19 computers are hooked to a Gbit switch configured on the fly according to the network topology you specify, with images transferred, set up, and started automatically.  Remote access is through ssh for the host OS, and through NX (think VNC) or the VMware console for the guest OS.

Another untimely passing

[tags]obituary,cryptography,Bob Baldwin,kuang, CBW,crypt-breaker’s workbench[/tags]

I learned this week that the information security world lost another of our lights in 2007: Bob Baldwin. This may have been more generally known, but a few people I contacted were also surprised and saddened by the news.

His contributions to the field were wide-ranging. In addition to his published research results, he also built tools that a generation of students and researchers found to be of great value. These included the Kuang tool for vulnerability analysis, which we included in the first edition of COPS, and the Crypt-Breaker’s Workbench (CBW), which is still in use.

What follows is a (slightly edited) obituary sent to me by Bob’s wife, Anne. There was also an obituary in the fall 2007 issue of Cryptologia.

Robert W Baldwin

May 19, 1957 – August 21, 2007

Robert W. Baldwin of Palo Alto passed away at home with his wife at his side on August 21, 2007. Bob was born in Newton, Massachusetts and graduated from Memorial High School in Madison, Wisconsin and Yorktown High School in Arlington, Virginia. He attended the Massachusetts Institute of Technology, where he received BS and MS degrees in Computer Science and Electrical Engineering in 1982 and a Ph.D. in Computer Science in 1987. A leading researcher and practitioner in computer security, Bob was employed by Oracle, Tandem Computers, and RSA Security before forming his own firm, PlusFive Consulting. His most recent contribution was the development of security engineering for digital theaters. Bob was fascinated with cryptology and made frequent contributions to Cryptologia as an author, reviewer, and mentor.

Bob was a loving and devoted husband and father who touched the hearts and minds of many. He is well remembered for his positive attitude and everlasting smile. Bob is survived by his wife, Anne Wilson; two step-children, Sean and Jennifer Wilson of Palo Alto; and his two children, Leila and Elise Baldwin of Bellevue, Washington. He is also survived by his parents, Bob and Janice Baldwin of Madison, Wisconsin; his siblings: Jean Grossman of Princeton, N.J., Richard Baldwin of Lausanne, Switzerland, and Nancy Kitsos of Wellesley, MA; and six nieces and nephews.

In lieu of flowers, gifts in memory of Robert W. Baldwin may be made to a charity of the donor’s choice, to the Recht Brain Tumor Research Laboratory at Stanford Comprehensive Cancer Center, Office of Medical Development, 2700 Sand Hill Road, Menlo Park, CA 94025, Attn: Janice Flowers-Sonne, or to the loving caretakers at the Hospice of the Valley, 1510 E. Flower Street, Phoenix, AZ 85014-5656.

 

Passing of a Pioneer

On November 18, 2007, noted computer pioneer James P. Anderson, Jr., died at his home in Pennsylvania. Jim, 77, had finally retired in August.

Jim, born in Easton, Pennsylvania, graduated from Penn State with a degree in Meteorology. From 1953 to 1956 he served in the U.S. Navy as a Gunnery Officer and later as a Radio Officer. This latter service sparked his initial interest in cryptography and information security.

Jim was unaware in 1956, when he took his first job at Univac Corporation, that his career in computers had begun. He was hired by John Mauchly to program meteorological data, and Dr. Mauchly soon became a family friend and mentor. In 1959, Jim went to Burroughs Corporation as manager of the Advanced Systems Technology Department in the Research Division, where he explored issues of compilation, parallel computing, and computer security. While there, he conceived of, and was one of the patent holders of, one of the first multiprocessor systems, the D-825. After serving as manager of Systems Development at Auerbach Corporation from 1964 to 1966, Jim formed an independent consulting firm, James P. Anderson Company, which he maintained until his retirement.

Jim's contributions to information security involved both the abstract and the practical. He is generally credited with the invention and explication of the reference monitor (in 1972) and audit trail-based intrusion detection (in 1980). He was involved in many broad studies in information security needs and vulnerabilities. This included participation on the 1968 Defense Science Board Task Force on Computer Security that produced the "Ware Report", defining the technical challenges of computer security. He was then the deputy chair and editor of a follow-on report to the U.S. Air Force in 1972. That report, widely known as "The Anderson Report", defined the research agenda in information security for well over a decade. Jim was also deeply involved in the development of a number of other seminal standards, policies and over 200 reports including BLACKER, the TCSEC (aka "The Orange Book"), TNI, and other documents in "The Rainbow Series".

Jim consulted for major corporations and government agencies, conducting reviews of security policy and practice. He had long-standing consulting arrangements with computer companies, defense and intelligence agencies, and telecommunication firms. He was a mentor and advisor to many in the community who went on to prominence in the field of cyber security. Jim is well remembered for his very practical and straightforward analyses, especially in his insights about how operational security lapses could negate strong computing safeguards, and about the poor quality design and coding of most software products.

Jim eschewed public recognition of his many accomplishments, preferring that his work speak for itself. His accomplishments have long been known within the community, and in 1990 he was honored with the NIST/NCSC (NSA) National Computer Systems Security Award, generally considered the most prestigious award in the field. In his acceptance remarks Jim observed that success in computer security design would be when its results were used with equal ease and confidence by average people as well as security professionals - a state we have yet to achieve.

Jim had broad interests, deep concerns, great insight and a rare willingness to operate out of the spotlight. His sense of humor and patience with those earnestly seeking knowledge were greatly admired, as were his candid responses to the clueless and self-important.

With the passing of Jim Anderson the community has lost a friend, mentor and colleague, and the field of cyber security has lost one of its founding fathers.

Jim is survived by his wife, Patty, his son Jay, daughter Beth and three grandchildren. In lieu of other recognition, people may make donations to their favorite charities in memory of Jim.

[Update 01/03/2008 from Peter Denning:]

I noted a comment that Jim is credited with the reference monitor. He told me once that he credits that to a paper I wrote with Scott Graham for the 1972 SJCC and said that paper was the first he'd seen using the actual term. I told him that I got the concept (not the term) from Jack Dennis at MIT. Jack probably got it from the ongoing Project MAC discussions. Where it came from before that, I do not know. It might be better to say that Jim recognized the fundamental importance of reference monitor for computer security practice and stumped endlessly for its adoption.

Computer Security Outlook

Recently, the McAfee Corporation released their latest Virtual Criminology Report.  Personnel from CERIAS helped provide some of the research for the report.
The report makes interesting reading, and you might want to download a copy.  You will have to register to get a copy, however (that’s McAfee, not CERIAS).

The editors concluded that there are three major trends in computer security and computer crime:

  1. An increasing level and sophistication of nation-state-sponsored espionage and (some) sabotage.
  2. An increasing sophistication in criminal threats to individuals and businesses.
  3. An increasing market for exploits and attack methods.

Certainly, anyone following the news and listening to what we’ve been saying here will recognize these trends.  All are natural consequences of increased connectivity and increased presence of valued information and resources online, coupled with weak security and largely ineffectual law enforcement.  If value is present and there is little or no protection, and if there is also little risk of being caught and punished, then there is going to be a steady increase in system abuse.

I’ve posted links on my tumble log to a number of recent news articles on computer crime and espionage.  It’s clear that there is a lot of misuse occurring, and that we aren’t seeing it all.

[posted with ecto]