Posts by spaf

Initial Thoughts on the RSA 2015 Conference

Once again I have submitted myself to a week of talks, exhibits, walking, meetings, drinking, meetings, and more with 40,000 close associates (with one more day of it tomorrow). It’s the annual RSA conference in San Francisco. I’ve been to about 8 of them, including the last 5.

Prior to starting this entry, I reread my blog post from after the 2014 RSA Conference. Not a lot has changed, at least as far as talks and exhibits. Pretty much everything I wrote last year is still accurate, so you can read that first. There were a few differences, and I’ll describe the prominent ones below.

Once again, I got pulled into meetings and conversations, so I didn’t attend as many of the talks as I really wanted. I caught portions of several, and I was impressed with more this year than last — I sensed less marketing. Thus, kudos to the program committee (and speakers). I am sorry I didn’t get to hear more of the talks. I hope they were recorded for us to view later.

Foremost differences from last year occurred outside the Moscone Center and on the exhibit floor — there was no boycott against RSA about alleged NSA collaboration, and the conference organizers adopted a policy against “booth babes” — yay! I don’t think I need to write about things that weren’t there this year, but I will say a big “thank you” to the RSA Conference team for the latter — it was a very welcome change.

  1. Last year’s big buzz phrase was “threat intelligence,” with “big data” coming in second. This year, it was “IoT,” with maybe “cloud” as second. I didn’t see much mention of “big data” in the materials or at the booths. There was some use of the term in presentations, however.
  2. Out of 400 booths, I really only saw 2 or 3 totally new concepts. All the other products and services on display were either holdovers from prior years, or variations on older ideas.
  3. Many of the booth personnel were more cynical than last year about the conference, the field, their products, etc. This marks an interesting change: in prior years I barely detected cynicism.
  4. There seemed to be a little more international representation than last year — companies originating in other countries (Germany, Japan, China, Sweden, Korea, Taiwan, and Israel are ones I can recall).

I still did not speak in a session (even as a fill-in), it still costs quite a bit to attend, and I still didn’t see many academics I knew.

I saw only 3 products that were devoted to building secure systems — everything else was patching, monitoring, remediation, and training. That continues to be depressing.

It was still the case that there was limited emphasis on, or solutions for, privacy.

Andy Ellis provided me shielding for my badge so I could avoid being scanned onto mailing lists. I told the people at most booths that my badge was shielded, but they tried anyhow. Some would try repeatedly, then tell me they couldn’t scan my badge. Duh! I just told you that! However, in every case, they still gave me a T-shirt or other swag.

Speaking of swag, this year, the top 3 raffle items were drones, Go-Pro cameras, and iWatches.

A few booths were very aggressive in trying to scan people. It almost felt like desperation. I had to duck and weave (not easy with a cracked rib) to avoid a few of those people and get past their booths. It felt like being in a video game.

This year, more vendors seemed willing to talk about donating their products to our (CERIAS) teaching and research labs. That is really promising, and helps our students a lot. (And, hint — it provides great visibility for the products, so you vendors can still do it!)

So, if I find the conference a little depressing, why do I still go? As I noted last year, besides hearing about trends and getting a stock of T-shirts, it is a great opportunity to see friends and acquaintances I don’t get to see that often otherwise because I have limited time and funds for travel. (And yes, Indiana is at the center of the known universe, but few flights stop here.) I have had some great conversations with these people — thought leaders and deep thinkers across the spectrum of infosec/cyber/etc.

Actually, it occurred to me over drinks that if I wanted to cause maximum disruption, I could have infected these highly-connected people with some awful disease, and within 72 hours they would have infected almost everyone in the field who has some level of clue. Luckily for the world, they only had to put up with my presence for a few minutes each, and that isn’t contagious.

Here’s a partial list of the people I was happy to see (there were more, but this is who I can remember right now — my apologies to anyone I missed; plus, I may see more in the closing session tomorrow): Candy Alexander, Becky Bace, Robert Bigman, Bob Blakley, Josh Corman, Sam Curry, Jack Daniel, Michelle Dennedy, Matt Devost, Whit Diffie, Andy Ellis, Karen Evans, Dickie George, Greg Hoglund, Brian Honan, Alex Hutton, Andrew Jaquith, Toney Jennings, John Johnson, Gene Kim, Brian Krebs, Penny Leavy, Martin Libicki, Rich Marshall, Gary McGraw, Martin McKeay, Carey Nachenberg, Wendy Nather, Davi Ottenheimer, Andy Ozment, Kevin Poulsen, Paul Rosenzweig, Scott Rotondo, Marc Sachs, Howard Schmidt, Bruce Schneier, Corey Schou, Winn Schwartau, Chenxi Wang, Mark Weatherford, Bob West, Ira Winkler, and Amit Yoran.

Yes, I do know a rather eclectic set of people. Their karma must be bad, because they also know me.

Speaking of karma, I’m already planning to go to RSA 2016.

Buy a book for entertainment and for charity

I’ve known Carey Nachenberg, a Fellow at Symantec, for many, many years. He’s one of the driving forces behind Symantec’s anti-malware software. He’s creative and passionate about cyber security. He’s also an avid rock climber, a teacher, and several other things that make him an interesting person to know.

Now Carey is also a published author of fiction: the adventure novel The Florentine Deception.

I can recommend the book for several reasons. First, it’s an engaging story, with several convincing core plot devices — Carey has taken several of his passions and woven them together into the story. Second, all the proceeds go to charities. Carey has selected several worthwhile causes, and the more books people buy, the more the charities benefit. And third, there is this really odd coincidence that ties Carey’s plot to something a cyber security researcher actually wrote about 20 years ago and describes in the Foreword. Carey intended the book as fiction, but it could also be a cautionary tale…or a somewhat embellished version of something frightening that really happened?

As a freshman outing in fiction, the book could have used a little more editing, but it still provides a good read. As a tale of unexpected consequences, it really nails one of several cyber issues that have received insufficient consideration. And as an effort to support some worthwhile causes, how can it possibly be ignored?

I encourage you to visit the website for the book, and follow one of the links to purchase a copy. Then enjoy the read, and think about what The Florentine Deception might really mean.

CERIAS 2015 Symposium Now Online!

The 2015 CERIAS symposium — held March 24 & 25, 2015 — was wonderful! We had a great array of speakers and panels, and one of our largest audiences in years. The talks were fascinating, the panels provocative, and the student research exciting (as usual).

Featured speakers included Sam Curry, CTO and CSO, Arbor Networks; Deborah Frincke, Director of Research, NSA/CSS; and Michelle Dennedy, VP & CPO, McAfee/Intel Security.

If you were there and want to hear a repeat of a talk, or if you didn’t make it to the symposium and want to hear what went on, visit our website. We have videos of all the talks and panels plus links to the student research posters and other materials. Similar materials from our 2014 symposium are still online, too!

We haven’t yet set the dates for the 2016 CERIAS Symposium, but stay tuned for that.

What is wrong with all of you? Reflections on nude pictures, victim shaming, and cyber security

[This blog post was co-authored by Professor Samuel Liles and Spaf.]

Over the last few days we have seen a considerable flow of news and social media coverage of the unintended exposure of celebrity photographs (e.g., here). Many (most?) of these photos were of attractive females in varying states of undress, and this undoubtedly added to the buzz.

We have seen commentary from some in the field of cybersecurity, as well as more generally-focused pundits, stating that the subjects of these photos “should have known better.” These commentators claim that it is generally known that passwords/cloud storage/phones have weak security, so the victims only have themselves to blame.

We find these kinds of comments ill-informed, disappointing, and in many cases, repugnant.

First, we note that the victims of these leaks were not trained in cyber security or computing. When someone falls ill, we don’t blame her for not having performed studies in advanced medicine. When someone’s car breaks down, we don’t blame him for failing to have a degree in mechanical engineering. Few people are deeply versed in fields well outside their profession.

The people involved in these unauthorized exposures apparently took the prudent measures they were instructed to take on the systems as they were designed. As an example, the passwords used must have passed the checks in place or the users would not have been able to set them. It doesn’t matter if we approve of the passwords that passed those automated checks -- they were appropriate to the technical controls in place. What the people stored, how they did it, or the social bias against their state of dress has nothing to do with this particular issue.

Quite simply, the protection mechanisms were not up to the level of the information being protected. That is not the users’ fault. They were using market standards, represented as being secure. For instance, it is admitted that Apple products were among those involved (and that is the example in some of our links). People have been told for almost a decade that the iOS and Apple ecosystem is much more secure than other environments. That may or may not be true, but it certainly doesn’t take into account the persistent, group effort that appears to have been involved in this episode, or some of the other criminal deviants working in groups online. We have met a few actresses and models, and many young people. They don’t think of risk in the same way security professionals do, and having them depend on standard technology alone is clearly insufficient against such a determined threat.

Consider: assume you live in a nice house. You’ve got windows, doors, and locks on those windows and doors. You likely have some kind of curtains or window coverings. Even if the house has no yard, if you close your curtains we accept that as a request for privacy. If I walk up on the sidewalk and attempt to peer into your windows, that is being a “peeping tom.” Even though I might have every right to stand on the sidewalk, face the direction I’m looking, and stop or pause, I do not have the right to violate your privacy.

Consider: Your house has a nice solid door with a nice lock. That lock likely has orders of magnitude less entropy than a password. Every night you walk through your house, lock your doors, and you go to sleep believing you are likely safe. Yet, that lock and that door will not stop a group of determined, well-equipped attackers or likely even slow them down. The police will not arrive for some amount of time and effective self-protection against unexpected provocation by a gang is uncertain, at best. As a polite and law-abiding society, we respect the door and the lock, and expect others to do the same. We understand that the door and lock keep honest people honest. They set a barrier to entry for criminals. Burglaries still happen and we use the power of law to enact punishment against criminals, although many crimes go unsolved.
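To put that entropy comparison in rough numbers, here is a back-of-the-envelope sketch; the pin count, depths per pin, and password character set are illustrative assumptions, not measurements of any particular lock or password policy:

```python
import math

def entropy_bits(num_combinations: int) -> float:
    """Entropy, in bits, of a secret chosen uniformly from this many possibilities."""
    return math.log2(num_combinations)

# Assumed: a common pin-tumbler door lock with 5 pins and 6 usable depths per pin.
lock_keyspace = 6 ** 5                           # 7,776 distinct keyings
lock_bits = entropy_bits(lock_keyspace)          # about 12.9 bits

# Assumed: a modest 8-character password over ~94 printable ASCII characters.
password_keyspace = 94 ** 8
password_bits = entropy_bits(password_keyspace)  # about 52.4 bits

print(f"door lock: {lock_bits:.1f} bits")
print(f"password:  {password_bits:.1f} bits")
```

Even under these toy assumptions the password has roughly forty more bits of keyspace than the lock, which is the point: we respect the door and the lock because of social and legal norms, not because they are hard to defeat.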

If an unauthorized entry to your house occurs, whether by picking the lock, climbing through the window, or discovering a loose set of boards in the wall, we would never blame you, the victim — it is clear that the person who entered, unbidden, was to blame. Some of our peers would try to blame the companies that made the locks and windows, rather than acknowledge the bad intent of the burglar. Too many people in information security tend to think we can always build better locks, or that having “white hats” continually picking locks somehow will lead to them being unpickable. Many are so enamored of the technology of those locks and the pastime of picking them that they will blame the victim instead of anything to do with the production or design of the locks themselves. (This is not a new phenomenon: Spafford wrote about this topic 22 years ago.)

One fundamental error in that thinking is that all locks are, by design, meant to be opened. Furthermore, the common thinking ignores the many failures (and subsequent losses) that would occur before any "super lock" might appear. We also observe that few will be able to afford any super-duper unpickable locks, or carry the 20-pound key necessary to operate them. Technological protections must be combined with social and legal controls to be effective.

This leads to our second major point.

Imagine if that burglary occurred at your house, and you suffered a major loss because the agent of the crime discovered your artwork or coin collection. We would not dream of blaming you for the loss, somehow implying that you were responsible by having your possessions stored in your house. If somebody were to say anything like that, we would reproach them for blaming/shaming the victim. Society in general would not try to castigate you for having possessions that others might want to steal.

Unfortunately, many computer professionals (and too many others, outside the profession) have the mindset that crimes on computers are somehow the fault of the victim (and this has been the case for many years). We must stop blaming the victims in cases such as this, especially when what they were doing was not illegal. We see criticism of the victims’ activities instead of the privacy invasion itself -- blaming and shaming no less atrocious than the way rape victims are treated, which is also usually badly, unfortunately.

If we give users lousy technology and tell them it is safe, they use it according to directions, and they do not understand its limitations, they should not be blamed for the consequences. That is true of any technology. The fault lies with the providers and those who provide vague assurances about it. Too bad we let those providers get away with legally disclaiming all responsibility.

We are sympathetic to the argument that these exposures of images should perhaps be considered as a sex crime. They were acts of taking something without permission that violated the privacy and perceptions of safety of the victims, for the sexual gratification and sense of empowerment of the perpetrators (and possibly other reasons as well). Revenge porn, stalking, assault, and rape are similar...and we should not blame the victims for those acts, either. The sexually-themed abuse of female journalists and bloggers is also in this category -- and if you aren't aware of it, then you should be: women who write things online that some disagree with will get threats of violence (including rape), receive deluges of abusive and pornographic messages and images, and be called incredibly offensive names...sometimes for years. It is beyond bullying and into something that should be more actively abhorred.

Some male members of the cyber security community are particularly bad in their treatment of women, too.

Between the two of us, Sam and Spaf, we have served as professors, counselors, and one of us (Liles) as a law enforcement officer; we have well over 50 years of combined professional experience with both victims and the bad behavior of their abusers. We have counseled female students and colleagues who have been stalked and harassed online for years. They keep encountering law enforcement officials and technologists who ask "What are you doing to encourage this?" None of them encourage it, and some live in real terror 24x7 of what their stalkers will do next. Some have had pictures posted that are mortifying to them and their loved ones; they have lost jobs, had to move, withdrawn from online fora, lost relationships, and even changed their names and those of their children. This can last for years.

Sexual offenders blame the victim to absolve themselves of responsibility, and thus, guilt. "She was acting suggestively," "she was dressed that way," etc. If the people around them chime in and blame the victim, they are excusing the criminal -- they are reinforcing the idea that somehow the victim provoked it and the abuser "obviously couldn't help himself.” They thus add unwarranted guilt and shame to the victim while excusing the criminal. We generally reject this with offenses against children, realizing that the children are not responsible for being abused. We must stop blaming all adult victims (mostly female, but others also get abused this way), too.

Victim blaming (and the offensively named slut shaming -- these aren't "sluts," they are victimized women) must STOP. Do you side with privacy rights and protection of the public, or with the rapists and abusers? There is no defendable middle ground in these cases.

We are also horrified by the behavior of some of the media surrounding this case. The crimes have been labeled as leaks, which trivializes the purposeful invasion and degradation performed. Many outlets provided links to the pictures, as did at least one well-known blogger. That illustrates everything wrong about the paparazzi culture, expressed via computer. To present these acts as somehow accidental (“leak”) and blame the victims not only compounds the damage, but glosses over the underlying story — this appears to be the result of a long-term criminal conspiracy of peeping toms using technologies to gather information for the purpose of attacking the privacy of women. This has allegedly been going on for years and law enforcement has apparently had almost no previous interest in the cases — why isn’t that the lead story? The purposeful exploitation of computer systems and exposure of people's private information is criminal. Some pundits only began to indicate concern when it was discovered that some of the pictures were of children.

It is clear we have a long way to go as a society. We need to do a better job of building strong technology and then deploying it so that it can be used correctly. We need to come up with better social norms and legal controls to hold miscreants accountable. We need better tools and training for law enforcement to investigate cyber crimes without also creating openings for them to be the ones who are regularly violating privacy. We need to find better ways of informing the public how to make cyber risk-related decisions.

But most of all, we need to find our collective hearts. Instead of idealizing and idolizing the technology with which we labor, deflecting criticisms for faults onto victims and vendors, we need to do a much better job of understanding the humanity — including both strengths and weaknesses — of the people who depend on that technology. The privacy violations, credit card fraud, espionage, harassment, and identity thefts all have real people as victims. Cyber security is, at its core, protecting people, and the sooner we all take that to heart, the better.

Videos from the 15th Annual CERIAS Symposium

We are now releasing videos of our sessions at this year’s CERIAS Symposium from late March.

We had a fascinating session with David Medine, chair of the PCLOB discussing privacy and government surveillance with Mark Rasch, currently the CPO for SAIC. If you are interested in the issues of security, counterterrorism, privacy, and/or government surveillance, you will probably find this interesting:

We are also making available videos of some of our other speakers — Amy Hess, Exec. Deputy Director of the FBI; George Kurtz, President & CEO of CrowdStrike; Josh Corman, CTO of Sonatype; and two of our other panel sessions:

(You have to put up with my introductions of the speakers, but into every life a little rain must fall.)

That was the 15th Annual CERIAS Symposium. Planning for the 16th Symposium is underway for March 24 & 25, 2015.

Update on “Patching is Not Security”

A few weeks ago, I wrote a post entitled “Patching Is Not Security.” Among other elements, I described a bug in some Linksys routers that was not patched and was supporting the Moon worm.

Today, I received word that the same unpatched flaw in the router is being used to support DDOS attacks. These are not likely to be seen by the owners/operators of the routers because all the traffic involved is external to their networks — it is outbound from the router and is therefore “invisible” to most tools. About all they might see is some slowdown in their connectivity.

Here are some of the details, courtesy of Brett Glass, the ISP operator who originally found the worm on some customer routers; I have replaced hostnames with VICTIM and ROUTER in his account:

Today, a user reported a slow connection and we tapped in with a packet sniffer to investigate. The user had a public, static IP on a Linksys E1000, with remote administration enabled on TCP port 8080. The router was directing SYN floods against several targets on the Telus network in Canada. For example:

10:00:44.544036 IP ROUTER.3070 > VICTIM.8080: Flags [S],
seq 3182338706, win 5680, options [mss 1420,sackOK,TS val 44990601 ecr 0,nop,scale 0], length 0
10:00:44.573042 IP ROUTER.3071 > VICTIM.8080: Flags [S],
seq 3180615688, win 5680, options [mss 1420,sackOK,TS val 44990603 ecr 0,nop,scale 0], length 0
10:00:44.575908 IP ROUTER.3077 > VICTIM.8080: Flags [S],
seq 3185404669, win 5680, options [mss 1420,sackOK,TS val 44990604 ecr 0,nop,scale 0], length 0
10:00:44.693528 IP ROUTER.3072 > VICTIM.8080: Flags [S],
seq 3188188011, win 5680, options [mss 1420,sackOK,TS val 44990616 ecr 0,nop,scale 0], length 0
10:00:44.713312 IP ROUTER.3073 > VICTIM.http: Flags [S],
seq 3174550053, win 5680, options [mss 1420,sackOK,TS val 44990618 ecr 0,nop,scale 0], length 0
10:00:45.544854 IP ROUTER.3078 > VICTIM.http: Flags [S],
seq 3192591720, win 5680, options [mss 1420,sackOK,TS val 44990701 ecr 0,nop,scale 0], length 0
10:00:45.564454 IP ROUTER.3079 > VICTIM.http: Flags [S],
seq 3183453748, win 5680, options [mss 1420,sackOK,TS val 44990703 ecr 0,nop,scale 0], length 0
10:00:45.694227 IP ROUTER.3080 > VICTIM.http: Flags [S],
seq 3189966250, win 5680, options [mss 1420,sackOK,TS val 44990716 ecr 0,nop,scale 0], length 0
10:00:45.725956 IP ROUTER.3081 > VICTIM.8080: Flags [S],
seq 3184379372, win 5680, options [mss 1420,sackOK,TS val 44990719 ecr 0,nop,scale 0], length 0
10:00:45.983883 IP ROUTER.3074 > VICTIM.8080: Flags [S],
seq 3186948470, win 5680, options [mss 1420,sackOK,TS val 44990745 ecr 0,nop,scale 0], length 0
10:00:46.985034 IP ROUTER.3082 > VICTIM.http: Flags [S],
seq 3194003065, win 5680, options [mss 1420,sackOK,TS val 44990845 ecr 0,nop,scale 0], length 0

In short, the vulnerability used by the "Moon" worm is no longer being used just to experiment; it's being used to enlist routers in botnets and actively attack targets.

One interesting thing we found about this most recent exploit is that the DNS settings on the routers were permanently changed. The router was set to use domain name servers at the addresses


The "Moon" worm was completely ephemeral and did not change the contents of flash memory (either the configuration or the firmware). The exploit I found today changes at least the DNS settings.
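A capture like the one above lends itself to simple automated triage: group outbound packets by source and destination host, and flag any pair showing a burst of bare SYNs. A minimal sketch in Python (the line format is modeled on the tcpdump output above; the flood threshold is an arbitrary illustration, not a tuned value):

```python
import re
from collections import Counter

# Matches tcpdump-style lines such as:
#   10:00:44.544036 IP ROUTER.3070 > VICTIM.8080: Flags [S],
SYN_LINE = re.compile(r'IP (\S+)\.\d+ > (\S+)\.\S+: Flags \[S\]')

def count_syns(lines):
    """Count pure-SYN packets per (source, destination) host pair."""
    counts = Counter()
    for line in lines:
        match = SYN_LINE.search(line)
        if match:
            src, dst = match.groups()
            counts[(src, dst)] += 1
    return counts

def flag_floods(counts, threshold=10):
    """Return the host pairs whose SYN count meets the threshold."""
    return {pair: n for pair, n in counts.items() if n >= threshold}
```

A real deployment would use a pcap library and track completed handshakes rather than scrape text, but even this crude count makes the pattern in the capture above jump out.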

Shame on Belkin for dragging their feet on getting a fix out to the public. But more to the point, this is yet another example why relying on patching to provide security is fundamentally a Bad Thing.

Why We Don’t Have Secure Systems Yet, Introduction

Over the past couple of months I’ve been giving an evolving talk on why we don’t yet have secure systems, despite over 50 years of work in the field. I first gave this at an NSF futures workshop, and will give it a few more times this summer and fall.

As I was last reviewing my notes, it occurred to me that many of the themes I’ve spoken about have been included in past posts here in the blog, and are things I’ve been talking about for nearly my entire career. It’s disappointing how little progress I’ve seen on so many fronts. The products on the market, and the “experts” who get paid big salaries to be corporate and government advisors and who get the excessive press coverage, also serve to depress.

My current thinking is to write a series of blog posts to summarize my thinking on this general topic. I’m not sure how many I’ll write, but I have a list of probable topics already in mind. They break out roughly into (in approximate order of presentation):

  • Definition & metrics
  • History
  • Changes in technology
  • Research & Development
  • Legacy and Inertia
  • Bad practices
  • Media & milieu focus
  • Funding
  • Law enforcement
  • National policies
  • International issues

Each of these will be of moderate length, with some references and links to material to read. If you’re interested in a preview, I recommend looking at some of my recent talks archived on YouTube, some of my past blog posts here, and oral histories of various pioneers in the field of infosec done by the Babbage Institute (including, perhaps, my own).

I’ll start with the first posting sometime in the next few days, after I get a little more caught up from my vacation. But I thought I’d make this post, first, to solicit feedback on ideas that people might like me to add to the list.

My first post will be about the definition of security — and why part of the problem is that we can’t very well fix something that we can’t reliably define and thus obviously don’t completely understand.

Patching is Not Security

I have long argued that the ability to patch something is not a security “feature” — whatever caused the need to patch is a failure. The only proper path to better security is to build the item so it doesn’t need patching — so the failure doesn’t occur, or has some built-in alternative protection.

This is, by the way, one of the reasons that open source is not “more secure” simply because the source is available for patching — the flaws are still there, and often the systems don’t get patched because they aren’t connected to any official patching and support regime. Others may be in locations or circumstances where they simply cannot be patched quickly — or perhaps not patched at all. That is also an argument against disclosure of some vulnerabilities unless they are known to be in play — if the vulnerability is disclosed but cannot be patched on critical systems, it simply endangers those systems. Heartbleed is an example of this, especially as it is being found in embedded systems that may not be easily patched.

But there is another problem with relying on patching: the responsible parties may be unable or unwilling to provide a patch, and that is especially troubling when the vulnerability is being actively exploited.

In late January, a network worm was discovered that was exploiting a vulnerability in Linksys routers. The worm was reported to the vendor and some CERT teams. A group at the Internet Storm Center analyzed the worm, and named it TheMoon. They identified vulnerabilities in scripts associated with Linksys E-series and N-series routers that allowed the worm to propagate, and for the devices to be misused.

Linksys published instructions on their website to reduce the threat, but it is not a fix, according to reports from affected users — especially for those who want to use remote administration. At the time, a posting at Linksys claimed a firmware fix would be published “in the coming weeks.”
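For owners who cannot wait on the vendor, the most useful interim check is whether the router’s remote-administration port is reachable from the outside at all, since that interface is reportedly how the worm gets in. A minimal sketch of such a probe (the address and port below are placeholders, not a statement about any specific device):

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder address): run this from OUTSIDE your own network,
# substituting your router's public IP, to see whether remote admin is exposed.
# if port_is_open("203.0.113.1", 8080):
#     print("Remote administration is reachable; disable it in the router settings.")
```

If the port answers, disabling remote administration (or filtering the port upstream) removes the known infection vector until a real firmware fix appears.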

Fast forward to today, three months later, and a fix has yet to be published, according to Brett Glass, the discoverer of the original worm.

Complicating the fix may be the fact that Belkin acquired Linksys. Belkin does not have a spotless reputation for customer relations; this certainly doesn’t help. I have been copied on several emails from Mr. Glass to personnel at Belkin, and none have received replies. It may well be that they have decided that it is not worth the cost of building, testing, and distributing a fix.

I have heard that some users are replacing their vulnerable systems with those by vendors who have greater responsiveness to their customers’ security concerns. However, this requires capital expenses, and not all customers are in a position to do this. Smaller users may prefer to continue to use their equipment despite the compromise (it doesn’t obviously endanger them — as yet), and naive users simply may not know about the problem (or believe it has been fixed).

At this point we have vulnerable systems, the vendor is not providing a fix, the vulnerability is being exploited and is widely known, and the system involved is in widespread use. Of what use is patching in such a circumstance? How is patching better than having properly designed and tested the product in the first place?

Of course, that isn’t the only question that comes to mind. For instance, who is responsible for fixing the situation — either by getting a patch out and installed, or replacing the vulnerable infrastructure? And who pays? Fixing problems is not free.

Ultimately, we all pay because we do not appropriately value security from the start. That conclusion can be drawn from incidents small (individual machines) to medium (e.g., the Target thefts) to very large (government-sponsored thefts). One wonders what it will take to change that. How do we patch people’s bad attitudes about security — or better yet, how do we build in a better attitude?

In Memoriam: Wyatt Starnes

William Wyatt Starnes passed away unexpectedly on May 10th, 2014 at the age of 59. Wyatt was a serial entrepreneur, known for his work in computing — and especially cyber protection — as well as for his mentorship and public service.

Wyatt graduated from Ygnacio Valley High School in Concord, CA, in 1972, and then obtained an associate degree from the Control Data Institute. His first full-time job was at Data General, and he went on to hold technical positions with Monolithic Memories, Maruman Integrated Circuits, and then Megatest Corporation. While at Megatest, Wyatt moved into management, where he showed significant expertise, and was eventually promoted to VP of Sales and Marketing. He subsequently moved to Tokyo for several years as the President of Megatest Japan. Although the remainder of his career was in management positions, he continued to work in technology, and was named as inventor or co-inventor of a number of patents in later years.

Upon leaving Megatest, Wyatt moved to Portland, Oregon, where he lived for the rest of his life. In Portland, he worked for several firms before founding his own company, Eclipse Technologies, Inc., and then Infinite Pictures. During that time, he met Gene Kim (one of my former students). Wyatt then founded Visual Computing, Inc., with Gene. They had originally planned on producing an immersive MMORPG named “Piggyland.” (I still have some of the marketing literature for this!) It used some novel technology and a great deal of humor, but before it had progressed very far, a series of coincidences led them to start Tripwire Security Services as a subsidiary, to produce software to secure MMORPGs and similar games. In short order, it became clear that Tripwire was the real path to success, and they transformed Infinite Pictures and TSS into Tripwire, Inc.

Wyatt was the CEO of Tripwire from 1997 to 2004 (Gene was CTO). In 2004, after a bout with cancer weakened him and forced him to step down from managing Tripwire, Wyatt founded the first version of the company SignaCert, and served as its CEO for the next six years. In 2010, SignaCert was acquired by Harris Corporation, and Wyatt served as the VP of Advanced Concepts and CTO for Cyber until 2012, when he retired. (NB. SignaCert has since begun a “second life” after being sold by Harris.) Over his career, Wyatt also served on the boards of Swan Island Networks of Portland, Oregon; Comprehensive Intelligence Technology Training Corporation of Annapolis, Maryland; and Symbium Software of Ottawa, Ontario.

During his 15-year career as a leading executive in cyber security, Wyatt was a driven and passionate advocate for better security and better design. He spoke at industry and community events, and was asked to join several high-level government and industry advisory boards, including TechAmerica Foundation’s CLOUD2 Commission, NIST’s Visiting Committee on Advanced Technology (VCAT), and the Oregon Executive Council of the American Electronics Association (AeA), among others. In Portland, he was cofounder of the innovative RAINS network (Regional Alliances for Infrastructure and Network Security), a nonprofit public/private alliance (now defunct) formed to accelerate the development, deployment, and adoption of innovative technology for homeland security.

Wyatt was known for business acumen with a human touch — he cared about the people who worked for him, his customers, and the world around him. He made time for others when they needed it, and that is a rare quality in someone serving as a CEO. Although highly focused on his business duties, Wyatt was seemingly always willing to lend a smile, and listen to what others had to say. He was also known for his fondness for good wine and good humor.

As the designer of the original Tripwire and SignaCert offerings, I have known and worked with Wyatt for nearly 20 years. When he was undergoing treatment for his life-threatening condition in the mid-2000s, we had many conversations about the nature of existence and the future. Then, and throughout the time I knew him, Wyatt expressed a strong commitment to living in the present — to not put off things (including people) that might then be forgotten…and regretted.

Some people believe that exiting life with the largest bank account is success. Wyatt believed that making the world a better place was true success. He wrote in his LinkedIn profile under “Awards and Honors”:

My reward comes from the special opportunity to do something important that (hopefully) leaves the world a better place.
And it is an honor to share what I have learned with others that aspire to create lasting contributions with their lives.

By those measures, he clearly was a huge success — his companies, his advocacy, his mentoring, and his friendship changed the lives of many, many people for the better. Wyatt Starnes will be greatly missed.

Some other media accounts of Wyatt’s passing:

A Special Opportunity to Support CERIAS

Purdue University is a land-grant university, founded in 1869. As a land-grant university, our focus has always been on service to the public good — providing excellent education and research results for the betterment of the world around us. While many universities take great pride in their faculty’s leverage of research to launch new companies or publish many academic papers, we’ve always been very focused on delivering a truly world-class education and performing “game changer” discovery.

Purdue Day of Giving

The Purdue community just celebrated a reunion of astronaut alumni — a visible symbol of the spirit of service and exploration inherent in our makeup. Purdue is the alma mater of more astronauts than any other university; the first and last men to walk on the moon were Purdue alumni. They did not do it for profit or fame — they did what they did to advance science, to push back boundaries of ignorance, and to give others something to dream about. Purdue’s story is full of people like that, from around the nation and around the world. Our students come from well over a hundred countries, and our graduates go out to improve the lives of people in at least that number.

Our history of exploration and being there “first” extends to many other areas, including the first degree-granting CS department (founded in 1962), the first dedicated freshman engineering program, the first television broadcast, and having the fastest campus supercomputer in the world. (A few other notable firsts are detailed here and here.)

But more to the point of this blog, Purdue is the location of CERIAS — the first multidisciplinary institute in cyber security and privacy research, and the home of the first defined degree in information security.

CERIAS is not a department within the university. We are a cross-cutting, multidisciplinary institute at the university, supported largely with soft funds: the vast majority of our funding has always come from small, outside donations by companies and foundations. Our finances depend on the generosity of others, but we are structured so as to not be beholden to the government or one or two big commercial entities that can dictate the direction of our efforts. Instead, we investigate those ideas that our faculty think will solve real problems and help others in what they want to do. Some of our organizational donors are partners in our program, providing advice and research assistance for our efforts, and they reap the rewards in new hires and new ideas (see the link for information on how your organization can join the program).

Historically, we have not done much to solicit others to support CERIAS, although it has always been possible for anyone to make a donation. But that will change, for one special day, April 30th. And we would like everyone who cares about our mission and our future to consider making a donation, even if it is only a small amount.

The first-ever Purdue Day of Giving, a 24-hour online event designed to boost Purdue visibility and support, will take place Wednesday, April 30. CERIAS, along with many other campus units, will be promoting Purdue efforts: granting opportunities, launching dreams, and achieving greatness while keeping Purdue affordable and accessible.

Plus, every donation to CERIAS (tax-deductible in the US, at least) will receive an additional percentage match from the University. Thus, your donation on April 30th will support CERIAS to an even greater extent than your donation alone! This is a special, one-day-only opportunity for your gift, large or small. Also, if your employer matches charitable donations, please be sure to let them know to match your gift, increasing your impact even further!

Your donation can be made through the website (click on “CERIAS” near the bottom of the page), by texting “PurdueCERIAS” (case-insensitive) to 41444 (you will receive a reply text with more details), or by telephone at 1-800-319-2199.

But the Purdue Day of Giving is much more than an opportunity to support CERIAS; it’s about helping spread the word about us, our great history, and our brighter future, along with Purdue's drive to redefine college education. If you’re associated with Purdue, whether you make a donation or not, you can help by posting your story — or sharing/retweeting one of ours — on social media; just add @cerias and #PurdueDayofGiving to your posts and tweets. The University has contests and incentives in place for CERIAS and other units whose friends and alumni post about #PurdueDayofGiving.

Track our progress and enjoy the day-long series of announcements and highlight videos (one of them featuring a certain bearded professor known for his fondness for bowties). Don’t wait until April 30 to join the fun; visit now, view videos of some of the exciting student success stories, and sign up for an email to remind you on the 30th to pay it forward. And please, pass along a link to this blog entry to others who you think might be interested in helping.

Thank you to all of our friends, alumni, and partners for their past support, and thank you in advance for helping to “spread the word.” We do hope that you will take this opportunity to make a donation that day — even if it’s a small one — to help us advance our work toward a safer and more secure future.