What is wrong with all of you? Reflections on nude pictures, victim shaming, and cyber security

[This blog post was co-authored by Professor Samuel Liles and Spaf.]

Over the last few days we have seen a considerable flow of news and social media coverage of the unintended exposure of celebrity photographs (e.g., here). Many (most?) of these photos were of attractive females in varying states of undress, and this undoubtedly added to the buzz.

We have seen commentary from some in the field of cybersecurity, as well as from more general-interest pundits, stating that the subjects of these photos “should have known better.” These commentators claim that it is generally known that passwords/cloud storage/phones have weak security, so the victims have only themselves to blame.

We find these kinds of comments ill-informed, disappointing, and in many cases, repugnant.

First, we note that the victims of these leaks were not trained in cyber security or computing. When someone falls ill, we don’t blame her for not having performed studies in advanced medicine. When someone’s car breaks down, we don’t blame him for failing to have a degree in mechanical engineering. Few people are deeply versed in fields well outside their profession.

The people involved in these unauthorized exposures apparently took the prudent measures they had been instructed to take on the systems as they were designed. As an example, the passwords used must have passed the checks in place, or they could not have been set. It doesn’t matter whether we approve of the passwords that passed those automated checks; they were appropriate to the technical controls in place. What the people stored, how they stored it, and any social bias against their state of dress have nothing to do with this particular issue.
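As a rough, hypothetical illustration (this is not any vendor's actual policy), an automated password check typically enforces only length and character-class rules. A password built from publicly known facts about its owner can sail through such a check, while a far stronger passphrase can be rejected:

```python
import re

def passes_policy(password: str) -> bool:
    """A typical (hypothetical) automated password check: minimum length
    plus character-class rules. Passing it says nothing about guessability."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[0-9]", password) is not None
    )

# A pet's name plus a birth year satisfies every rule, yet it is exactly
# what a determined, targeted attacker tries first.
print(passes_policy("Fluffy1987"))                    # True: accepted by the controls
# A long random passphrase fails the rules despite far more entropy.
print(passes_policy("correct horse battery staple"))  # False: rejected
```

The point is that "appropriate to the technical controls in place" is the most any untrained user can be expected to achieve.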

Quite simply, the protection mechanisms were not up to the level of the information being protected. That is not the users’ fault. They were using market standards, represented as being secure. For instance, it is admitted that Apple products were among those involved (and that is the example in some of our links). People have been told for almost a decade that the iOS and Apple ecosystem is much more secure than other environments. That may or may not be true, but it certainly doesn’t take into account the persistent, group effort that appears to have been involved in this episode, or some of the other criminal deviants working in groups online. We have met a few actresses and models, and many young people. They don’t think of risk in the same way security professionals do, and having them depend on standard technology alone is clearly insufficient against such a determined threat.

Consider: assume you live in a nice house. You’ve got windows, doors, and locks on those windows and doors. You likely have some kind of curtains or window coverings. Even in a house with no yard, if you close your curtains, we accept that as a request for privacy. If I walk up on the sidewalk and attempt to peer into your windows, I am being a “peeping tom.” Even though I might have every right to stand on the sidewalk, face the direction I’m looking, and stop or pause, I do not have the right to violate your privacy.

Consider: Your house has a nice solid door with a nice lock. That lock likely has orders of magnitude less entropy than a password. Every night you walk through your house, lock your doors, and go to sleep believing you are likely safe. Yet that lock and that door will not stop a group of determined, well-equipped attackers, or likely even slow them down. The police will not arrive for some time, and effective self-protection against unexpected provocation by a gang is uncertain at best. As a polite and law-abiding society, we respect the door and the lock, and expect others to do the same. We understand that the door and lock keep honest people honest. They set a barrier to entry for criminals. Burglaries still happen, and we use the power of law to enact punishment against criminals, although many crimes go unsolved.
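To put back-of-the-envelope numbers on that comparison (assuming, purely for illustration, a common five-pin tumbler lock with about eight usable depths per pin, and a modest eight-character alphanumeric password):

```python
import math

def entropy_bits(symbols: int, length: int) -> float:
    """Bits of entropy for `length` independent choices among `symbols` options."""
    return length * math.log2(symbols)

# A common pin-tumbler lock: roughly 5 pins, each with ~8 usable depths.
lock_bits = entropy_bits(8, 5)        # 15.0 bits (8**5 = 32,768 keyings)
# An 8-character password over upper/lower case letters and digits (62 symbols).
password_bits = entropy_bits(62, 8)   # ~47.6 bits (about 2.2e14 possibilities)

print(f"lock: {lock_bits:.1f} bits, password: {password_bits:.1f} bits")
```

Even though the password's key space is billions of times larger, nobody concludes that a burglary victim invited the break-in; what society relies on is respect for the barrier, not its raw strength.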

If an unauthorized entry to your house occurs, whether by picking the lock, climbing through the window, or discovering a loose set of boards in the wall, we would never blame you, the victim — it is clear that the person who entered, unbidden, was to blame. Some of our peers would try to blame the companies that made the locks and windows, rather than acknowledge the bad intent of the burglar. Too many people in information security tend to think we can always build better locks, or that having “white hats” continually picking locks somehow will lead to them being unpickable. Many are so enamored of the technology of those locks and the pastime of picking them that they will blame the victim instead of anything to do with the production or design of the locks themselves. (This is not a new phenomenon: Spafford wrote about this topic 22 years ago.)

One fundamental error here is forgetting that all locks are, by design, meant to be opened. Furthermore, this common thinking ignores the many failures (and subsequent losses) that would occur before any "super lock" might appear. We also observe that few people could afford such super-duper unpickable locks, or carry the 20-pound key necessary to operate them. Technological protections must be combined with social and legal controls to be effective.

This leads to our second major point.

Imagine if that burglary occurred at your house, and you suffered a major loss because the agent of the crime discovered your artwork or coin collection. We would not dream of blaming you for the loss, somehow implying that you were responsible by having your possessions stored in your house. If somebody were to say anything like that, we would reproach them for blaming/shaming the victim. Society in general would not try to castigate you for having possessions that others might want to steal.

Unfortunately, many computer professionals (and too many others outside the profession) have the mindset that crimes on computers are somehow the fault of the victim (and this has been the case for many years). We must stop blaming the victims in cases such as this, especially when what they were doing was not illegal. Criticizing the victims' activities instead of the privacy invasion is blaming/shaming no less atrocious than the way rape victims are treated — and that is also usually badly, unfortunately.

If we give users lousy technology and tell them it is safe, they use it according to directions, and they do not understand its limitations, they should not be blamed for the consequences. That is true of any technology. The fault lies with the providers and those who provide vague assurances about it. Too bad we let those providers get away with legally disclaiming all responsibility.

We are sympathetic to the argument that these exposures of images should perhaps be considered as a sex crime. They were acts of taking something without permission that violated the privacy and perceptions of safety of the victims, for the sexual gratification and sense of empowerment of the perpetrator (and possibly also other reasons). Revenge porn, stalking, assault, and rape are similar...and we should not blame the victims for those acts, either. The sexually-themed abuse of female journalists and bloggers is also in this category -- and if you aren't aware of it, then you should be: women who write things online that some disagree with will get threats of violence (including rape), receive deluges of abusive and pornographic messages and images, and be called incredibly offensive names...sometimes for years. It is beyond bullying and into something that should be more actively abhorred.

Some male members of the cyber security community are particularly bad in their treatment of women, too.

Between the two of us, Sam and Spaf, we have served as professors, counselors, and one of us (Liles) as a law enforcement officer; we have well over 50 years combined professional experience with both victims and the bad behavior of their abusers. We have counseled female students and colleagues who have been stalked and harassed online for years. They keep encountering law enforcement officials and technologists who ask "What are you doing to encourage this?" None of them encourage it, and some live in real terror 24x7 of what their stalkers will do next. Some have had pictures posted that are mortifying to them and their loved ones; they've lost jobs, had to move, withdrawn from online fora, lost relationships, and even changed their names and those of their children. This can last for years.

Sexual offenders blame the victim to absolve themselves of responsibility, and thus, guilt. "She was acting suggestively," "she was dressed that way," etc. If the people around them chime in and blame the victim, they are excusing the criminal -- they are reinforcing the idea that somehow the victim provoked it and the abuser "obviously couldn't help himself.” They thus add unwarranted guilt and shame to the victim while excusing the criminal. We generally reject this with offenses against children, realizing that the children are not responsible for being abused. We must stop blaming all adult victims (mostly female, but others also get abused this way), too.

Victim blaming (and the offensively named slut shaming -- these aren't "sluts," they are victimized women) must STOP. Do you side with privacy rights and protection of the public, or with the rapists and abusers? There is no defensible middle ground in these cases.

We are also horrified by the behavior of some of the media surrounding this case. The crimes have been labeled as leaks, which trivializes the purposeful invasion and degradation performed. Many outlets provided links to the pictures, as did at least one well-known blogger. That illustrates everything wrong about the paparazzi culture, expressed via computer. To present these acts as somehow accidental (“leak”) and blame the victims not only compounds the damage, but glosses over the underlying story — this appears to be the result of a long-term criminal conspiracy of peeping toms using technologies to gather information for the purpose of attacking the privacy of women. This has allegedly been going on for years and law enforcement has apparently had almost no previous interest in the cases — why isn’t that the lead story? The purposeful exploitation of computer systems and exposure of people's private information is criminal. Some pundits only began to indicate concern when it was discovered that some of the pictures were of children.

It is clear we have a long way to go as a society. We need to do a better job of building strong technology and then deploying it so that it can be used correctly. We need to come up with better social norms and legal controls to hold miscreants accountable. We need better tools and training for law enforcement to investigate cyber crimes without also creating openings for them to be the ones who are regularly violating privacy. We need to find better ways of informing the public how to make cyber risk-related decisions.

But most of all, we need to find our collective hearts. Instead of idealizing and idolizing the technology with which we labor, deflecting criticisms for faults onto victims and vendors, we need to do a much better job of understanding the humanity — including both strengths and weaknesses — of the people who depend on that technology. The privacy violations, credit card fraud, espionage, harassment, and identity thefts all have real people as victims. Cyber security is, at its core, protecting people, and the sooner we all take that to heart, the better.


It’s common sense that a woman should never go jogging by herself at 2AM in a rough neighborhood. Like it or not, as wrong as it is, she is asking for trouble.

I certainly don’t believe the movie stars deserve this anymore than the female jogger, but there needs to be some common sense on their part as well.

How many decades has it been with women jogging at night and being attacked? How many decades has it been that sex tapes have been found of famous people?

People should have the freedom to do both without any of these kinds of issues, but since these issues DO exist, use common sense.

You missed the first point of our post—vendors and software provide a sense of security that, within common sense, is not realistic.  They THOUGHT they were being prudent and cautious.

And we should all be working to change things so that it is equally safe for ANYONE to go jogging, or upload pictures.  That there is a differential is itself a problem.

Posted by Jeff on Thursday, September 4, 2014 at 11:35 PM

I disagree with you. You don’t have to be a physicist to understand that drying your cat in the microwave might be a bad idea. Putting your data into the cloud means it’s entrusted to a third party with no liability whatsoever in case it’s leaked. Liability would be the only incentive to make this more secure.


Spaf sez:  Few people really know what “in the cloud” means.  Even within computing, the term is fuzzy.  Sometimes, simply having backup switched on for your cellphone triggers “the cloud.”  It isn’t at all obvious.

Second of all, even if “the cloud” is understood, hazily, why would anyone automatically assume it wasn’t safe?  It is marketed as secure, and everyone uses it. 

This is why we wrote that it is a failure because the technologists’ ground truths are not at all obvious or known by the general public.  Actually, vendors don’t want to convey the complete truth as it would hurt their sales!

Liability alone isn’t the answer—liability is intended to provide some recompense after an incident.  That would be small solace in a case such as this.

And last of all, there is a reason why some microwaves have warnings not to put live things in them, and why lawnmowers have labels such as “Do not use as a hedge trimmer” —the bottom end of common sense in the general population is pretty darn low.

Posted by Jupp Müller on Thursday, September 4, 2014 at 11:50 PM

Thank you for writing a sane article regarding this disgraceful episode.  I have long felt that technologists appear to live in a parallel universe to “normal” people.  Stolen is stolen, abuse is abuse, and breaking in to someone’s property is exactly that, regardless of whether it’s digital or physical.  Using obscure technical terms to disguise the digital act is not acceptable.

Posted by Frank O'Kelly on Friday, September 5, 2014 at 04:15 AM

Thank you for writing this, Spaf, & Prof Liles. It’s great to be able to add the voice of someone of your stature in the industry to the chorus of people, inside & outside the industry, who’re tired of the victim-blaming that invariably occurs after each of these incidents.

Posted by Lionel on Friday, September 5, 2014 at 12:12 PM

It’s pretty amazing but not unexpected: the first comment on your posting begins the victim-blaming.

One of the things I think is exacerbating the problem is the police-state’s efforts to redefine privacy (an absolute right) to a “reasonable expectation of privacy” and then to game that in order to - not to put spin on it - violate citizens’ privacy. It does not help AT ALL that the FBI/NSA/CIA/GCHQ/CSE/etc are trying to push the idea that you have no expectation of privacy if you put your data on someone else’s server. It helps create a culture of “what isn’t nailed down, I can take. And if I can pry it up, it isn’t nailed down.”

Posted by Marcus J Ranum on Friday, September 5, 2014 at 02:45 PM

It is sad and unfortunate that the first comment to this post perpetuates the idea of blaming the victim, while naively assuming everyone knows the ins and outs of the technology the way the commenter does.  However, what struck me more were his insulting comments that understanding security should just be “common sense” to these women — because they were victimized, does that mean they don’t have common sense around security?  Sad. 

I think an important place for us all to begin is to remind ourselves who the “average” internet users are, because they are not security professionals, and many are totally unaware of the power that exists in the palm of their hands.  It is sad that it takes horrible, life-changing events to force some people to re-frame their behaviors with technology.  Victims of cyber crime are forever changed, something I hope law enforcement officials working with victims will embrace, because it’s not something someone will just get over. 

What can we do?  We need more disciplines in the conversation about how to make technology safer.  We need everyone around the table to figure out how we can raise the awareness level and usability of security so it can be more “common sense”-like for the average user.  Because, bear in mind, anyone who has spent any time in information security is no longer an “average user” - something often forgotten by us.  But we have a vital role to play, because for us this IS common sense.  Yet how can we, when within our own industry we don’t even treat each other with respect regardless of technical knowledge or gender? 

Thank you Spaf and Prof. Liles for writing such a thoughtful and illustrative piece.

Posted by Kelley on Friday, September 5, 2014 at 04:10 PM

Great article—thank you.

I largely agree with you, but there is more to most such situations. The bottom-line is that we have a lot of catching up to do with the consequences of newer/cooler.

Posted by Vic Winkler on Saturday, September 6, 2014 at 08:22 AM

I feel this article suffers from some myopia regarding the issue. It misses several important perspectives, while placing disproportionate emphasis on a select few.

I do not believe it is OK to “victim blame” in the sense that this article portrays it.

That said, I believe this article makes some ethical/moral errors:

#1: It claims or gives the impression that all of the victims in this case held zero responsibility for what happened to their data. This is untrue. The victims do not hold all of the blame, nor are they completely innocent. Those who point out that it was unwise of the victims to hand over private data to random, untrustworthy third parties have a valid point that should be respected and acknowledged in this article.

#2: It plays fast and loose with the pronoun “we” to refer to disparate groups while giving the impression that these different groups are the same groups.

To expand on that last point, here are the two groups being inappropriately conflated:

- The “we” of this sentence: “If we give users lousy technology and tell them it is safe, they use it according to directions, and they do not understand its limitations, they should not be blamed for the consequences.”

- And the “we” of this sentence: “Unfortunately, many computer professionals (and too many others, outside the profession) have the mindset that crimes on computers are somehow the fault of the victim (and this has been the case for many years). We must stop blaming the victims in cases such as this, especially when what they were doing was not illegal”

This distinction is blurred in the article through the use of imprecise language, and the author’s decision to ignore stronger, more nuanced perspectives.

Allow me to separate the two groups:

- The “we” who is “[giving] users lousy technology” *should* be blamed. In this case, it is a very specific “we”: Apple, and more generally, Apple and any company that fails to sufficiently invest in the security of users’ data.

- Then there is the “we” who criticize Apple for their shortcomings, and additionally try to educate and discourage users from falling prey to these types of situations by placing their actions in starker, more readily understandable terms. For example, I see nothing wrong with telling victims and would-be future victims to understand that when they upload their data to *any* cloud service that does not encrypt their data end-to-end, it is the same as handing their personal, private information to complete and total strangers. This is literally true, and it should be pointed out to the victims and those who can potentially learn from the victim’s mistakes.

These two groups are different, and they are by no means the only groups that the pronoun “we” can refer to.

The right thing to do is to acknowledge all perspectives, as almost all of them have something valuable to share with us (and yes, that “us” refers to *all* of us).

So in that spirit: thank you, professors, for sharing your perspectives. I agree with it: victim-blaming is not good, but victim-educating, I believe, is *right* and healthy.

Spaf sez:

Thanks for the feedback.  I agree the use of “we” was meant multiple ways, but Sam and I identify with different groups at times, so…

Almost EVERY consumer company has failed to invest properly in security and quality, and all (to some extent, passively or otherwise) present their wares as secure.  That was a major point of our essay — the average person cannot know if something is not secure “enough.”  You make the same error in saying that they understand about encryption — most computing users don’t understand encryption, and how can they be sure it is implemented properly and without leakage?  There are numerous instances in the literature where even experts have believed a system “good enough” only to be proven wrong.  So, your statement does get back to blaming the victim if they are not educated enough to investigate and thoroughly test their systems.  That is not a viable model, nor is it possible for most people.

Yes, people can learn from this incident, and from the Home Depot breach, and the JP Morgan breach, and the HealthCare.gov breach and…. (that’s only the last two weeks!).  But what is it they should learn? To not use the Internet?  When Apple, JP Morgan, Home Depot, and the US Government all suffer breaches with trained personnel who look at the systems, what is it the average user is supposed to learn?

There is a terrible myopia among those who work with high-tech, who believe that everyone is able to (or should be able to) understand how to use the technology correctly.  Although a crude illustration, consider that the average IQ is 100 — half of the population is at or below that level.  Think they can really verify ECB-mode AES, understand dynamic IP, test their biometric reader, and configure a firewall?  However, that is the attitude you (and many others) express. 

If WE (the community, society, technologists) are going to make these products widely available, we have a duty to understand the limitations of the potential users, and build the systems to be operable — and appropriately safe — for all of them. 

Posted by Greg Slepak on Saturday, September 6, 2014 at 06:06 PM

So may i start of by saying i do “victim shame” in this situation as they are not a victim. a celebrity is their own business and there product is them self. While celebrities are a human beings, to confuse them with non celebs is a little annoying. These people have established groups to prune and protect the business (of celebrity). We did not think of target as a victim because the 3rd parties they utilized had security vulnerabilities. Auditors are right now heavily going after businesses to make sure their 3rd parties meet rigid security standards. So if celebrity is a business, the persons image, likeness, and availability is their business then why assume safety to web services when everything else is strictly vetted as much as possible. Due diligence is need to protect the product.

First off we must appreciate the capitalist and the marketers of the world… Right don’t these web services talk about how they are the best, number 1, must secure, must trust worthy, always available, and etc, etc, etc… Then use pictures to reinforce this idea like locks, safes, vaults, security guard, and progress bars showing how much the data is being secured. Kinda like people / businesses have brainwashed to believe that “lousy technology” is super! And if something happens it not to them, or just meta data…

There is a quote from a article that i like and i think fits the topic:

The fantasy of cyberspace is “serious” because it is cognitively necessary. It relieves us of the burden of having to parse the seemingly infinite complexity of the systems that make such communication possible. This kind of fantasizing is a counterpart to what sociologist Anthony Giddens calls the “bargain with modernity.” Giddens believes that our modern lives are characterized by an endless series of risks (e.g., driving a car, stepping into an elevator, taking medicine, etc.) because it is not possible for any one individual to understand all the minute complexities of the myriad technological systems we depend on every day, we must place our trust (and our very lives) in the hands of experts. Trust is our basic conscious mechanism for dealing with such complexities; denial is its unconscious counterpart. Fantasy gives substance to this denial. Part of the seductiveness of the cyberspace fantasy is that, by denying the complex, mutually determining relationship between our society and the Web, it makes our lives and our everyday judgments simpler.  -

The infosec community and web services dont help to define what Trust in cyberspace is. Trust at this point is supposed to be implicit when it comes to technology in general. No thought process at all. blind acceptance through convenience! People / Business should know they failed. Doesn’t the saying go “learn from your mistakes”. Though instead of just shaming infosec pro’s maybe should help define the rules of the road and teaching better process on how to protect their (people/businesses) IP so an individual can make informed decision on who they give their trust to and how to reassess trust when the environment changes.

Frankly the worst thing i find about this is that this was a post on spaf blog… All that is going on with the government, violations of privacy, and the legal manipulation to let it all be possible. Yet spaf blog talks about celebrity nude and not to be mean. Also that like the many times before when celebrity IP is unjustly taken they will only be about me, me, me… Pleading to congress, LEO’s, etc to protect celebrities. Now they could use this as a platform to teach people, how not to fall pray like they did. get connected with privacy advocate and learn and petition change to make changes for the greater good (everyone), but no that will not happen. Yet lets just imagine the kinda pull say Jennifer Lawrence could have if she did a PSA for the FED’s / LEO’s that was intended for children and maybe why it’s not a good idea to fall pray to “sexting” and maybe some other good security heuristics that any child on the net should know these day.

The second half of this post has a whole other scope associated with it and will leave it.

Spaf sez:

Ashley, I had trouble parsing some of your comments because of your confusing grammar, so I’m not sure if I will address all of your points with this.  But thank you for commenting.

Those people ARE victims, and the logic you are using — that they are somehow “special” — is part of the problem of perception we described.  Every group is “special” in some way.  When any group is subject to faulty systems and criminal behavior, they are victims.  In this case you seem to try to argue that they are somehow business entities and that is an exception.  However, the material taken was not taken from business accounts, but from personal ones.  People in the public eye sacrifice some privacy (usually unwillingly) but that does not mean they forfeit personal space and personal rights.

Your quote from Giddens is a good one.  I don’t see how the material that follows it is related and germane, though.  There are many of us in the field who have been trying — for decades — to get the word out about trust, security, and the market failures in play.  We don’t get much traction.  The people who seem to get the most publicity (and support) are the ones with shiny new products, and who poke holes in the existing, lousy structure.  Those of us who have some understanding of the base of trust, risk-based decision-making, and security assurance are largely ignored.  There is a social decision that has been made to accept large-scale risk rather than spend what it takes to build secure systems, and this is part of the fall-out.

As far as your last comment, about this being in my (actually, the CERIAS) blog… why is that a problem?  This is about an incident in the news, and it illustrates some well-known failures.  We did not advocate for special treatment of the victims — only that victims, as a class, not be blamed for the failures of the products they depend on, any more than we would blame the passengers on a plane that crashed, or blame people who bought food from a supermarket then contracted food poisoning.  The issue of permeability of protections is cross-cutting — it enables too much exposure to governments, marketers, and criminals.  That we didn’t write a whole textbook on the problems (the post was probably too long as it was) doesn’t mean we don’t speak out about it in other venues at other times. 

Posted by Ashley on Saturday, September 6, 2014 at 10:56 PM

Dear Spaf,

Thanks for the reply.

Regarding this:

> “Think they can really verify ECB-mode AES, understand dynamic IP, test their biometric reader, and configure a firewall?  However, that is the attitude you (and many others) express.”

That is not my attitude, please do not ascribe it to me (or anyone else for that matter).

I do not know of a single person in the infosec world who holds that point of view. I don’t know of anyone who honestly believes that average users are supposed to readily understand and/or investigate the encryption details that a company uses.

We can throw up straw men all day and beat the schnitzel out of them, but that doesn’t seem like a productive use of anyone’s time.

> “But what is it they should learn? To not use the Internet?  When Apple, JP Morgan, Home Depot, and the US Government all suffer breaches with trained personnel who look at the systems, what is it the average user is supposed to learn?”

I stated clearly what they should learn in my response. Here it is again: “[..] it was unwise of the victims to hand over private data to random, untrustworthy third-parties [..]”

That is the basic lesson here.

We need to establish and promote metaphors that people can relate to. The one you used in the article, that of homes and private belongings, is a fine one.

It is our responsibility to explain to “average users” that in this situation, it would be like taking a stash of nude Polaroids and handing them to a random store manager in order to give them to your friend.

Instead of doing that, use software that is purpose-built to handle sensitive data. People place valuables in sealed containers, call up FedEx and UPS, and have them transfer them across the country. There is an infosec equivalent of this, and users should learn the terms to watch out for (like “encryption”, “end-to-end encrypted”, “even we can’t read your data”, etc.).

The alternative, is to throw all the blame at companies who may or may not change what they do, and leave the victims in the dark as to what they should have done instead.

That seems untenable and unwise.


Spaf sez:  In your original comment, you wrote “For example, I see nothing wrong with telling victims and would-be future victims to understand that when they upload their data to *any* cloud service that does not encrypt their data end-to-end, it is the same as handing their personal, private information to complete and total strangers.”

To make that work, how do they know it encrypts end-to-end, and that the encryption (and storage, etc.) are done correctly?  That is why I made the comment about analyzing AES, etc.  Very few users are able to make that analysis, so it requires trust in the provider.  The question is then what does the person need to know to place that trust?  Do they need to know that, for instance, DES is poor for encryption but AES-256 is not?  How do we teach that to auto mechanics, secretaries, actresses, and bus drivers (to name a few), along with IPv6 vs. IPv4 and IPSEC and….

Oh, and by the way, end-to-end encryption is hardly a guarantee of safety: storage at either end must be protected, key management must be sound, and so on.

If we don’t have a trusted infrastructure then no one should trust any of their personal, private information to “the cloud” — and that includes you, your email, your phone backups, and your credit card info when you purchase anything, all your medical records, etc.  What is left that can be stored remotely with confidence?

I short-circuited that train of thought to derive the conclusion from your statement.  Sorry I wasn’t clearer about that.

As to what you said:  “[..] it was unwise of the victims to hand over private data to random, untrustworthy third-parties [..]”

But there is the problem — they WERE trusted parties (although NOT random).  The services were presented as trusted, and clearly those people (and millions of others) trusted them.  How do you know NOT to trust a service or vendor unless you are able to perform all those checks I mentioned, yourself?  Who do you trust?

People trust Home Depot and Goodwill and Sony and JP Morgan and their healthcare providers and…. They hand over their private data regularly — including financial information — precisely because they need to trust those services to interact with them.  Your advice (if heeded) is effectively that people should no longer interact online.  I agree they might be safer, but that does seem a bit extreme. grin

We cannot possibly educate the average user about all these nuances and issues.  That was one of the two major points of our essay.  If you think you can educate everyone about all the aspects of security and privacy, and all the variations across all implementations, then have at it!  I teach advanced students with backgrounds in computing and many of them still don’t get it.  I have no hope of educating the general public in all of this, especially given the variation in education and simple common sense.  And I am certainly not going to blame any of them for becoming victims because they are given faulty products to use, if they use them as intended.

If you want to discuss it more, catch Sam or me in person.

Posted by Greg Slepak on Sunday, September 7, 2014 at 08:40 PM

Great article, although one argument in there is probably incorrect. That is, the claim that too many in the industry think locks can always be improved and that “white hats” will make them unbreakable.

It is not true that “too many people” in the industry think that “white hats” looking for flaws will make the “lock” unpickable. It is certainly not an argument that is common in the industry. In fact, the opposite is true.

It is also disingenuous to say that “too many” people think locks can be improved. Whatever “locks” means in that context, in security, making them better usually means “making them less flawed”, and that can’t possibly be that bad, can it?

What I think you were trying to say was that “some people” who don’t work in the infosec industry (tech bloggers) think our existing security controls are nearly perfect and blame the victims for misusing them.

But let’s make sure we don’t accidentally make it seem like the argument is that vulnerability research and product security hardening are useless grin

Spaf sez:

No, the statement was actually meant as intended.  Most vendors do NOT invest in writing secure code — instead, they issue it and expect to fix flaws afterwards as a way of making the software stronger.  And as to the community — look at the various conferences — how many of the presentations are on better construction?  Most of the presentations are flaw analysis, intrusion detection, or putting another layer of firewall in place.  Some are devoted only to demonstrations of the latest hacks.  The majority of papers I read are by people who are NOT doing any research into secure coding or design methods, but only into incremental improvement.

The analogy wasn’t perfect, but until we get away from glorifying breaking things and into using careful design and construction (for which methods already exist, if anyone would study them), we aren’t going to make progress.

And no, making systems less flawed is not bad, but it is by nature a poor approach compared to making the systems strong to begin with. 

Posted by Eduardo on Sunday, September 7, 2014 at 11:13 PM

As someone who has served as a law enforcement forensic examiner for a few years, I just wanted to express my appreciation for this perspective. Sharing photos on iCloud is a default setting on new iDevices. Most users are not aware of the feature or the potential for compromise. There is no warning (except perhaps in a 50+ page mountain of legalese). Blaming a user for being breached in this scenario is equivalent to shaming a fraud victim for handing their credit card over to the wait staff of a restaurant in order to pay for their meal (while unbeknownst to them the staff member made a fraudulent purchase).

The irony in this is that many of these same folks who are quick to blame these celebrity victims are just as quick to condemn law enforcement, the NSA, and other government agencies for alleged privacy intrusions. Apparently invading women’s privacy in search of porn is acceptable, while ensuring national security is not.

Posted by Dan O'Day on Monday, September 8, 2014 at 01:47 AM

There is an aspect of liberal philosophy that I’ve always found baffling and it’s their tendency to affix blame everywhere but where it belongs. It’s the same thing with gun control. Their first reaction is to blame the victim or the gun or the gun owner or the gun manufacturer, but never the perp who actually pulled the trigger. These are the same people who will let the perp plea bargain his way down to probation assuming he is ever apprehended. And then they wonder why nobody takes the law seriously any more.

A “polite” society is no accident. It depends on sure and certain consequences when you break the rules. Unfortunately, since the 1960s liberal philosophy and policies have all but destroyed any connection between crime and punishment.

Posted by GreenPus on Tuesday, September 9, 2014 at 10:52 AM

In the old days, it would have been a shoebox full of Polaroid pictures. Yes, someone had to break in to steal them, but if they didn’t exist, they couldn’t get stolen.

Just saying. If there aren’t any nekkid pics taken, none will turn up.

Spaf sez:

This is another form of blaming the victim: it is their fault because of the kinds of pictures they took.  But the same problem exists if they were love letters, or insurance bills, or inventories of cat toys.  They were using the technology they were given to do things that were not illegal or outside the technical capabilities of those devices.  It doesn’t matter what the bits represented — that they were taken from them without authorization, in part because of faults in the technology that should not have been there, is the issue.  We should NOT blame them for what they chose to represent in those bits.

To them, they were storing Polaroids in that shoebox.  There was no intent to share.  Their privacy rights were violated the same as someone breaking into their houses and stealing a box full of pictures.  The mechanism and medium changed, but the violation is identical.

Posted by Amy Parker on Tuesday, September 9, 2014 at 07:40 PM

Professor, Spaf and Mr. O’Day,

Thank you so much for the article and post respectively!  It is very difficult for someone who is not in the profession and has only basic knowledge of the technology (thanks to the network defenders I work with, who are always very helpful) to wade through the security software jungle to find trusted products and figure out how to use them properly.  I sometimes feel like I need my own personal IT professional!

I don’t believe that I should have to be an IT professional to protect my personal information on my computer, wherever it is being stored, but it really does feel that way in today’s environment.

Mr. O’Day, I completely agree with your post and I was glad to see the sentiment in the last paragraph.

Thanks to all of you for your defense of those of us who are having trouble finding our way through the technology jungle and not always getting it right!

Posted by Robin Scher on Tuesday, September 9, 2014 at 09:33 PM

Call me old school but I use my old crappy camera to take pictures. I have no faith in our technology anymore regarding security.

Posted by accedeholdings on Thursday, September 11, 2014 at 01:30 AM

I really applaud the original blog post by Spaf, but I also recognize—as stated by the post from “accede holdings”—that IT and consumer electronics are operating at a pace that is at odds with real quality and real security.  Consequently, my expectations for security are low.

When time to market and lower price points drive delivery, quality suffers. The old triangle applies: secure, fast, cheap; pick any two.

Posted by Vic Winkler on Saturday, September 13, 2014 at 10:17 AM

Thank you for this commentary. Indeed the PEBKAC exists. Yet that problem between keyboard and chair is often the person choosing the defaults or designing the trust model.

Thanks again-

Posted by Jean Camp on Wednesday, September 17, 2014 at 08:25 PM

Great post. I think we have landed in an era where smart phones and social sharing are meant to be a good thing, and in all seriousness it is a positive experience when utilized morally. But the media (OK, now we are getting dirty) uses this to attract more readers, revenue, and exposure. The media doesn’t care; it’s about the bottom line, the blue ink.

Posted by Jacky on Tuesday, November 25, 2014 at 02:46 AM

Great post. I hate nudity; it is destroying moral values. The media does not care about it. Thanks for sharing.


Spaf sez:  Thanks for the feedback.  I disagree that nudity is destroying moral values.  At most, it could be a symptom of some moral issues, if one believes that nudity is somehow wrong (and I, for one, do not).  We are all naked under our clothes, after all!

Posted by zoni on Thursday, November 27, 2014 at 08:28 AM
