I have a set of keywords registered with Google Alerts that result in a notification whenever they show up in a new posting. This helps me keep track of some particular topics of interest.
One of them popped up recently with a link to a review and some comments about a book I co-authored (Practical Unix & Internet Security, 3rd Edition). The latest revision is over 6 years old but still seems to be popular with many security professionals; some of the specific material is out of date, but much of the general material is still applicable and likely will be for many years to come. At the time we wrote the first edition there were only one or two other books on computer security, so we included extra material to make it a useful text and reference.
In general, I don't respond to reviews of my work unless there is an error of fact, and not always even then. If people like the book, great. If they don't, well, they're entitled to their opinions -- no matter how ignorant and ill-informed they may be.
This particular posting included reviews from Amazon that must have been posted about the 2nd edition of the book, nearly a decade old, although their dates as listed on this site make it look like they are recent. I don't recall seeing all of the reviews before this.
One of the responses in this case was somewhat critical of me rather than the book: the review by James Rothschadl. I'm not bothered by his criticism of my knowledge of security issues. Hackers who specialize in the latest attacks generally dismiss anyone not versed in their tools as ignorant, so I have heard this kind of criticism before. It is still the case that the "elite" hackers who specialize in the latest penetration tools think they are the best informed about all things security. Sadly, some decision-makers believe this too, much to their later regret, usually because they depend on penetration analysis as their primary security mechanism.
What triggered this blog posting was when I read the comments that included the repetition of erroneous information originally in the book Underground by Suelette Dreyfus. In that book, Ms. Dreyfus recounted the exploits of various hackers and miscreants -- according to them. One such claim, made by a couple of hackers, was that they had broken into my account circa 1990. I do not think Ms. Dreyfus sought independent verification of this, because the story is not completely correct. Despite this, some people have gleefully pointed this out as "Spaf got hacked."
There are two problems with this tale. First, the computer account they broke into was on the CS department machines at Purdue. It was not a machine I administered (nor one for which I had administrator rights) -- it was a shared faculty machine. Thus, the perps succeeded in getting into a machine run by university staff that happened to have my account name but which I did not maintain. That particular instance came about because of a machine crash: the staff restored the system from an older backup tape, not realizing that a security patch applied between the backup and the crash needed to be reapplied after the restore.
But that isn't the main problem with this story: rather, the account they broke into wasn't my real account! My real account was on another machine that they didn't find. Instead, the account they penetrated was a public "decoy" account that was instrumented to detect such behavior, and that contained "bait" files. For instance, the perps downloaded a copy of what they thought was the Internet Worm source code. It was actually a copy of the code with key parts missing, and some key variables and algorithms changed such that it would partially compile but not run correctly. No big deal.
Actually, I got log information on the whole event. It was duly provided to law enforcement authorities, and I seem to recall that it helped lead to the arrest of one of them (but I don't recall the details about whether there was a prosecution -- it was 20 years ago, after all).
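For readers curious what "instrumented" can mean in practice, here is a toy sketch of the simplest version of the idea: watch the access times on bait files and report whenever someone reads them. The paths are invented for illustration, this is not the setup I actually used, and on filesystems mounted noatime the check below would see nothing.

```python
#!/usr/bin/env python3
"""Toy sketch of bait-file monitoring; paths and intervals are hypothetical."""
import os
import time

BAIT_FILES = ["/home/decoy/worm_src.tar.Z"]   # hypothetical bait file(s)


def snapshot(paths):
    """Record the last access time of each bait file that exists."""
    return {p: os.stat(p).st_atime for p in paths if os.path.exists(p)}


def watch(paths, interval=60):
    """Poll the bait files and report any that were read since the last check."""
    seen = snapshot(paths)
    while True:
        time.sleep(interval)
        current = snapshot(paths)
        for path, atime in current.items():
            if atime > seen.get(path, 0):
                print(f"{time.ctime()}: bait file read: {path}")
        seen = current


if __name__ == "__main__":
    watch(BAIT_FILES)
```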
At least 3 penetrations of the decoy account in the early 1990s provided information to law enforcement agencies, and also inspired my design of Tripwire. I ran decoys for several years (and may be doing so to this day). I always had a separate, locked-down account for personal use, and even now I keep certain sensitive files encrypted on removable media that is only mounted when the underlying host is offline. I understand the use of defense-in-depth, and the use of different levels of protection for different kinds of information. I have great confidence in the skills of our current system admins; still, I administer a second set of controls on some systems. But I also realize that those defenses may not be enough against really determined, well-resourced attacks. So, if someone wants to spend the time and effort to get in, fine, but they won't find much of interest -- and they may be providing data for my own research in the process!
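As an aside for anyone who hasn't used Tripwire: its core idea is a trusted baseline of cryptographic checksums for important files, periodically compared against the live filesystem so that unexpected changes stand out. The sketch below shows just that bare idea; the watched paths and baseline location are invented for illustration, and a real deployment keeps the baseline signed and stored where an intruder can't rewrite it.

```python
#!/usr/bin/env python3
"""Minimal sketch of hash-based file integrity checking (the idea behind Tripwire)."""
import hashlib
import json
import os
import sys

BASELINE = "baseline.json"                    # hypothetical place to keep the trusted hashes
WATCHED = ["/etc/passwd", "/etc/hosts"]       # example files to monitor


def digest(path):
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def init():
    """Record a trusted baseline of hashes for the watched files."""
    baseline = {p: digest(p) for p in WATCHED if os.path.exists(p)}
    with open(BASELINE, "w") as f:
        json.dump(baseline, f, indent=2)


def check():
    """Compare current hashes against the baseline and report differences."""
    with open(BASELINE) as f:
        baseline = json.load(f)
    for path, old in baseline.items():
        if not os.path.exists(path):
            print(f"MISSING  {path}")
        elif digest(path) != old:
            print(f"CHANGED  {path}")


if __name__ == "__main__":
    # Run with "init" once on a known-good system, then run "check" thereafter.
    if len(sys.argv) > 1 and sys.argv[1] == "init":
        init()
    else:
        check()
```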
So, here we are, in November already. We've finished up with National Cyber Security Awareness Month — feel safer? I was talking with someone who observed that he remembered "National Computer Security Day" (started back in the late 1990s), which then became "National Computer Security Week" for a few years. Well, the problems didn't go away when everyone started to call it "cyber," so we switched to a whole month, but only of "awareness." This is also the "Cyber Leap Ahead Year." At the same rate of progress, we'll soon have "The Decade of Living Cyber Securely." The Hundred Years' War comes to mind for some reason, but I don't think our economic system will last that long with losses mounting as they are. The Singularity may not be when computers become more powerful than the human mind, but will be the point at which all intellectual property, national security information, and financial data have been stolen and are no longer under the control of their rightful owners.
Overly gloomy? Perhaps. But consider that today is also the 21st anniversary of the Morris Internet Worm. Back then, it was a big deal because a few thousand computers were affected. Meanwhile, today's news has a story about the Conficker worm passing the 7 million host level, and growing. Back in 1988 there were about 100 known computer viruses. Today, most vendors have given up trying to measure malware as the numbers are in the millions. And now we are seeing instances of fraud based on fake anti-malware programs being marketed that actually infect the hosts on which they are installed! The sophistication and number of these things are increasing non-linearly as people continue to try to defend fundamentally unsecurable systems.
And as far as awareness goes, a few weeks ago I was talking with some grad students (not from Purdue). Someone mentioned the Worm incident; several of the students had never heard of it. I'm not suggesting that this should be required study, but it is indicative of something I think is happening: the overall awareness of security issues and history seems to be declining among the population studying computing. I did a quick poll, and many of the same students only vaguely recalled ever hearing about anything such as the Orange Book or Common Criteria, about covert channels, about reference monitors, or about a half dozen other things I mentioned. Apparently, anything older than about 5 years doesn't seem to register. I also asked them to name 5 operating systems (preferably ones they had used), and once they got to 4, most were stumped (Windows, Linux, MacOS and a couple said "Multics" because I had asked about it earlier; one young man smugly added "whatever it is running on my cellphone," which turned out to be a Windows variant). No wonder everyone insists on using the same OS, the same browser, and the same 3 programming languages — they have never been exposed to anything else!
About the same time, I was having a conversation with a senior cyber security engineer of a major security defense contractor (no, I won't say which one). The engineer was talking about a problem that had been posed in a recent RFP. I happened to mention that it sounded like something that might be best solved with a capability architecture. I got a blank look in return. Somewhat surprised, I said "You know, capabilities and rings — as in Multics and System/38." The reaction to that amazed me: "Those sound kinda familiar. Are those versions of SE Linux?"
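For anyone who got the same blank look from the term: a capability is an unforgeable token that both names a resource and bounds what its holder may do with it -- possession of the token is the authorization, so there is no separate access-control list to consult. The toy Python sketch below illustrates only the concept; it is not how Multics or the System/38 did it, and Python cannot make the token truly unforgeable.

```python
"""Toy illustration of the capability idea; the class and resource names are invented."""


class Capability:
    """Bundles a resource with the set of operations its holder may invoke on it."""

    def __init__(self, resource, rights):
        self._resource = resource
        self._rights = frozenset(rights)

    def invoke(self, op, *args):
        # Authorization is decided solely by what this token grants.
        if op not in self._rights:
            raise PermissionError(f"capability does not grant '{op}'")
        return getattr(self._resource, op)(*args)


class Document:
    def __init__(self, text):
        self.text = text

    def read(self):
        return self.text

    def write(self, new_text):
        self.text = new_text


if __name__ == "__main__":
    doc = Document("design notes")
    read_only = Capability(doc, {"read"})     # hand this to a less-trusted component

    print(read_only.invoke("read"))           # allowed: the token grants "read"
    try:
        read_only.invoke("write", "tampered")
    except PermissionError as err:
        print("denied:", err)                 # "write" was never granted
```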
Sigh. So much for awareness, even among the professionals who are supposed to be working in security. The problems are getting bigger faster than we have been addressing them, and too many of the next generation of computing professionals don't even know the basic fundamentals or history of information security. Unfortunately, the focus of government and industry seems to continue to be on trying to "fix" the existing platforms rather than solve the actual problems. How do we get "awareness" into that mix?
There are times when I look back over my professional career and compare it to trying to patch holes in a sinking ship while the passengers are cheerfully boring new holes in the bottom to drop in chum for the circling sharks. The biggest difference is that if I were on the ship, at least I might get a little more sun and fresh air.
More later.