I have continued to update my earlier post about women in cybersecurity. Recent additions include links to some scholarship opportunities offered by ACSA and the (ISC)2 Foundation. Both scholarship opportunities have deadlines in the coming weeks, so look at them soon if you are interested.
The 15th Annual Security Symposium is less than a month away! Registration is still open but filling quickly. If you register for the Symposium, or for the 9th ICCWS held immediately prior, you can get a discount on the other event. Thus, you should think about attending both and saving on the registration costs! See the link for more details.
I periodically post an item to better define my various social media presences. If you follow me (Spaf) and either wonder why I post in multiple venues, or want to read even more of my musings, then take a look at it.
I ran across one of my old entries in this blog — from October 2007 — that had predictions for the future of the field. In rereading them, I think I did pretty well, although some of the predictions were rather obvious. What do you think?
Sometime in the next week or so (assuming the polar vortex and ice giants don’t get me) I will post some of my reflections on the RSA 2014 conference. However, if you want a sneak peek at what I think about what I saw on the display floor and after listening to some of the talks, you can read another of my old blog entries — things haven’t changed much.
The Charles Babbage Institute at the University of Minnesota is devoted to research and preservation of the history of computing. They have amassed an interesting collection of literature and memorabilia that shows the history of the field.
One of the projects associated with the CBI is to gather oral histories of notable figures in the field of computing security. They have some fascinating oral histories of people including Willis Ware, Peter Neumann, Becky Bace, Roger Schell, Donn Parker and others, as well as lots of oral histories in other subfields of computing. You can find the full set online.
Late last year, Jeff Yost of the CBI visited Purdue to conduct an interview with me. He got a lot of material out of me, including some anecdotes that I don’t think I have ever related to anyone else before. We spent a good portion of a day going through this. It’s long.
I question how many people might really want to read through the whole thing, but if you’re interested in some of the history of the security program at Purdue, how I ended up at Purdue, my start in software engineering, my initial work in digital forensics, how I got involved in security, or any of a bunch of other topics likely to be of little or no interest to most people, then you can check out my oral history at CBI.
I’ve mentioned a lot of students, colleagues, and influences by name. If you’re one of them, I hope what I said doesn’t bother you! (Unless I intended it to bother you, in which case….)
I don’t think I said anything unduly embarrassing, and I’m actually happy to have documented some of the history of how CERIAS got started. So, if that kind of thing floats your boat (or balances your parity), then check it out.
Four days -- two major events!
We're living in a time of transition. Cyberthreats are increasing and becoming more sophisticated, victimized organizations are cooperating with competitors and fighting back, and the discussion of expected privacy has become front-page news. These topics, and more, will be explored at the 15th Annual CERIAS Security Symposium. Join the conversation amongst academic educators and researchers, commercial R&D engineers, government researchers, and industry practitioners as we examine the current state, possible solutions and emerging technologies addressing issues of information assurance, security, privacy and cybercrime.
CERIAS Symposium activities will include:
Featuring a selection of the 60+ projects currently in progress by CERIAS faculty and students. Meet the researchers while hearing about their work.
The event has a number of built-in opportunities for social and professional networking, and exploration of new opportunities. CERIAS partners will be provided an exclusive opportunity for recruiting CERIAS students for internships and employment; non-partners can find out more about joining the CERIAS consortium. Attendees may also schedule other visits and tours while on campus.
CERIAS Symposium attendees are invited to join the ICCWS conference being held the two days prior to the CERIAS Symposium. The ICCWS provides an opportunity for the cyber warfare and security community of interest and practice to gather and exchange views on the current state of security research, governance, and implementation. The conference is intended to draw an audience of practitioners, researchers, consultants, and regulators from academia, business, and government.
CERIAS Symposium attendees will receive a discount off ICCWS registration. For more information on ICCWS-2014 visit:
We hope to see you at Purdue the week of March 24!
I received news today that Yves Deswarte passed away on January 27th. Dr. Deswarte was a notable member of the computing community, with a career of 30+ years as an educator, researcher, and manager. His career as a computing research pioneer spanned issues ranging from fault-tolerant computing to microcomputer systems to networking to issues of identity and privacy to system safety, and more. His most recent affiliation was with the LAAS-CNRS, the Laboratory for Analysis and Architecture of Systems at the French National Center for Scientific Research in Toulouse. He also had been an engineer and manager at INRIA, and spent time with SRI and at Microsoft Labs in Cambridge (with the late Roger Needham). Some of his more recent work involved the security of cloud and embedded systems.
Yves was the deserving recipient of the 2012 IFIP TC-11 Kristian Beckman Award and an award for Outstanding Service to IFIP. His acceptance address for the Beckman was devoted to issues of identity and privacy — topics which had been central to some of his research in recent years. In addition to his research and his work with IFIP, Dr. Deswarte was also notable for his work with ESORICS, and for the Ph.D. students whose work he advised: his webpage lists 20 Ph.D. graduates advised, and 5 in progress.
A memorial page for Dr. Deswarte has been established at LAAS.
I only met Yves once or twice, and our work only occasionally brought us into contact. Interestingly, his path in computing had some parallels to mine — he was working in fault-tolerant computing (the SURF project) about the time I was (as a grad student), and then moved into security and privacy issues. I have known of him and his work for most of my career in computing, but unfortunately did not have the opportunity to get to know him well in person. I am undoubtedly not doing justice to his many contributions with the meager account above, and I would welcome comments from those who knew him better.
I have written memoriam pieces for many people in the field over the last few years, most recently Willis Ware. Yves is closer to my age than most of them, so that makes it a little more personal. It is a sign that the field is maturing as we begin to lose our colleagues, but that is hardly any solace.
R.I.P. Yves Deswarte, 1949-2014.
I’ve had several items cross my social media feeds, along with email, in the last few days that prompt me to write this. It’s gotten a bit longer than I intended, but there’s a lot to say on an important topic. As a first post to this blog in 2014, I think it is a good topic to address. It has to do with imbalance and bad behavior in the overall field of cybersecurity: the low percentage of women, and how they are sometimes treated.
Computing, as a field in the USA, has had a low and almost constantly decreasing percentage of women entering the field and staying. (The US is the primary focus of this blog entry; I believe the problem is similar in Canada, the UK, Australia, and elsewhere, but I don’t have the data. Also, there is a corresponding problem with other traditional minorities, but that’s not what prompted this post, and I hope to visit it later.) There are many reasons posited for this, many of which are likely somewhat to blame; apparently, there is no single, dominant reason. Many studies and reports have been conducted, experiments tried, and programs put into place, but few have made any measurable, long-term change. The problem is almost undoubtedly rooted in social behaviors and expectations, because there are other cultures where the ratio of women to men in computing is about 1:1, or where women are even in the majority.
Cybersecurity is little different, and may be worse. I regularly speak at conferences, companies, and agencies where the room will have 30 men and one (or no) women. At events with speakers or panels, all the speakers and panelists are men. The few women attending are often simply the ones processing registrations. And there are a nontrivial number of reports of women being groped and harassed at professional meetings (see, for instance, this). Worse, women are frequently abused online as well as offline, and not only in security and computing. Many are reluctant to publish email addresses or contact info online because of the unwanted, inappropriate content sent to them — no matter whether they’re 8 or 80.
(Right now, if you are thinking to yourself that there isn’t a real problem, that things are fine, and it is all a problem of some women who can’t take a joke, then you are part of the problem, and you need to shape up. Worse, if you think that women shouldn’t be upset about this status quo, instead they should get back to the kitchen, then you are so out of touch that I don’t know where to start. In either case, try telling that same thing to women doctors, pilots, police, firefighters, or better yet, to our many women in the military — especially when your safety is in their care. Then come back when you’ve healed up. If nothing else, at least keep in mind that there are legal reasons to treat people equally and with respect.)
Assuming you are actually living in the 21st century, let me assure you that the overall situation is a HUGE problem for us. As a field, and as a society, this is bad because we have a shortage of talent that is getting worse with time. We also have some rather skewed and limited ideas of how to approach problems that might benefit from a more inclusive pool of designers and practitioners. And as human beings we should be concerned — especially those of us who are sons, brothers, fathers, and husbands — because people who could be (and sometimes are) our mothers, sisters, daughters, and wives are being mistreated and demeaned. That simply isn’t right. Neither is it right that we are limiting the opportunities for individuals to learn, grow, and achieve.
Computing, security, privacy, creativity — those are all traits of the mind. Minds exist in all kinds of bodies, including those with other colors, more or fewer curves, different masses and volumes, varied ages, and differing physical abilities. But that doesn’t change what is possible in their minds! We should applaud ability, dedication, and imagination wherever we find them. Discouraging women (or anyone with ability) from pursuing a career in computing, abusing them online, and groping them at conferences are all counterproductive to our own futures — as if being rude and wrong weren’t enough. Cybersecurity and privacy are key areas where we need more insight and creativity — we should enhance them rather than diminish them.
No field is populated only with superstars and wild talents. That is especially true in IT. We hear about people with great accomplishments, and we like to think we’re special in our way, but the truth is that the field is too large for any individuals to master it. Success comes from teams, and the most successful teams are those that integrate many different viewpoints, backgrounds, and skill sets, and who respect their differences yet work with common goals. That includes bringing in people from different genders, ethnicities, ages, and more. Success is enhanced by diversity.
I’m not going to go through a longer litany of problems here, or try to analyze the situation further. I’ve been working with various women’s groups for over 20 years and I still don’t pretend to be able to understand all of what is happening. It is complex. However, I see the problem continuously when I look at our student body, when I visit professional meetings, and when I read reports. I know it is real.
What I can do is offer some advice to those who care.
Here are some general tips that should be common sense.
The basic idea here is really embodied in #8. Be thoughtful, and don’t treat anyone as substantially different. Instead, relate to every person as a professional. But most of all, speak up if you see someone getting picked on or treated badly, or not getting the encouragement they should. It’s like security and privacy itself — an attack on any link is an attack on the whole, and if a link falls we are all diminished.
There is debate within many minority communities about whether aligning with self-interest groups is helpful. On the plus side, the mentoring, the support resources, and the sense of community can all be a big help. However, that also runs the risk of not sufficiently engaging in the mixed environment where one has to work, of developing unrealistic expectations based on anecdotal stories, and of failing to help educate the majority in how to help. There seems to be enough positive “buzz” about some groups and their activities to warrant recommending them. Not all are likely to fit your own particular needs and interests, so check them out. If you know of some I have missed, please let me know so I can add them here.
The (ISC)2 is organizing a women’s special interest group. I have spoken with the organizers, but am unsure of its status at this time.
The Women in Cyber Security conference will be held in April in Nashville. I know nothing about it other than what is on their web page, but it looks like it could be a great experience.
Of course, please keep in mind that not all men are the same! Many want to do the right things but aren’t always sure what is appropriate. Help train a few.
From a professional point of view, being a member of ACM and ISSA is a good idea for anyone in the field, based simply on the value of the organizations. Both promote professionalism, community, and personal growth, and there are a variety of other benefits to membership. Both have steep discounts for student members. I am a long-standing member of both, and can recommend them.
Our society has a lot of problems with cybersecurity and privacy. New flaws show up, and old flaws don’t really get fixed. Parties ranging from individual criminals to nation-state organizations are all seeking ways to penetrate our systems and mess with our information. We need every good person we can get on board and working together if we hope to make progress. We should make every effort to enable that partnership.
Or think of it in these terms: if we can’t be trusted to protect and empower those within our own community, why should anyone trust us to protect anything else?
Updated 1/7: Added a few list items about mentoring and language, listed ISACA, small grammatical corrections.
Updated 1/8: Corrected several typos
Updated 1/10: Added ISSA group link. Added comment from Anita Jones; this is the memo she mentions in that comment.
Updated 1/14: Small grammatical corrections.
Updated 1/22: Added ACM-W page link
Updated 1/24: Added the Systers link
Updated 2/16: Added link to subscribe to the ACM-W list. Minor grammatical cleanup.
Updated 3/2: Added links to ACSA and (ISC)2 scholarship information.
If you have any additions or corrections to the above lists, please send me private email. Also note that, as usual, anonymous, spammy, or abusive feedback to the blog may not be published as is, if at all.
Willis H. Ware, a highly respected and admired pioneer in the fields of computing security and privacy, passed away on November 22nd, 2013, aged 93. Born August 31, 1920, Mr. Ware received a BSEE from the University of Pennsylvania (1941), and an SM in EE from MIT (1942). He worked on classified radar and IFF (identify friend or foe) electronic systems during WWII. After the war he received his Ph.D. in EE from Princeton University (1951) while working at the Institute for Advanced Study for John von Neumann, building an early computer system.
Upon receiving his Ph.D., Dr. Ware took a position with North American Aviation (now part of Boeing Corporation). After a year, he joined the RAND Corporation (in 1952) where he stayed for the remainder of his career -- 40 more years — and thereafter as an emeritus computer scientist. His first task at RAND was helping to build the "Johnniac," an early computer system. During his career at RAND he advanced to senior leadership positions, eventually becoming the chairman of the Computer Science Department.
Willis was influential in many aspects of computing. As an educator, he initiated and taught one of the first computing courses, at UCLA, and wrote some of the field's first textbooks. In professional activities, he was involved in early activities of the ACM, and was the founding president of AFIPS (American Federation of Information Processing Societies). From 1958-1959 he served as chairman of the IRE Group on computers, a forerunner of the current Computer Society of the IEEE. He served as the Vice Chair of IFIP TC 11 from 1985-1994. At the time of his death he was still serving as a member of the EPIC Advisory Board.
Dr. Ware chaired several influential studies, including one in 1967 that produced a groundbreaking and transformational report to the Defense Science Board for ARPA (now DARPA) that was known thereafter as "The Ware Report." To this day, some of the material in that report could be applied to better understand and protect the security of computing systems. The follow-on work to that study eventually led, albeit somewhat indirectly, to the development of the NCSC "Rainbow Series" of publications. (The NCSC, National Computer Security Center, was a public-facing portion of the NSA, serving as an office for improving security in commercial products.)
In 1972, Dr. Ware was tapped to chair the Advisory Committee on Automated Personal Data Systems for the HEW (now HHS) Secretary. That report, and Willis's subsequent paper, "Records, Computers, and the Rights of Citizens," established the first version of the Code of Fair Information Practices. That, in turn, significantly influenced the Privacy Act of 1974, and many subsequent versions of fair information practices. The Privacy Act mandated the creation of the Privacy Protection Study Commission, of which Dr. Ware was vice chair.
Willis was the first chairman of the Information System and Privacy Advisory Board, created by the Computer Security Act of 1987. He remained chairman of that board for 11 years following its establishment. Over the years, Dr. Ware served on many other advisory boards, including the US Air Force Scientific Advisory Board, the NSA Scientific Advisory Board, and over 30 National Research Council boards and committees.
Willis Ware was one of the most honored professionals in computing. He was a Member of the National Academy of Engineering, and was a Fellow of the AAAS, of the IEEE, and of the ACM — perhaps the first person to accrue all four honors. He was a recipient of the IEEE Centennial Medal in 1984, the IEEE Computer Pioneer Award in 1993, and a USAF Exceptional Civilian Service Medal in 1979. He was the recipient of the NIST/NSA National Computer System Security Award in 1989, the IFIP Kristian Beckman Award in 1999, a lifetime achievement award from the Electronic Privacy Information Center (2012), and was inducted into the Cyber Security Hall of Fame in 2013.
Dr. Willis H. Ware was truly a pioneer computer scientist, an early innovator in computing education, one of the founders of the field of computer security, and an early proponent of the need to understand appropriate use of computing and the importance of privacy. His dedication to the field and the public interest was both exceptional and seminal.
The RAND Corporation posted an in memoriam piece on its website.
(Any updates or corrections will be posted here as they become available.)
Update 10/26: included acronym expansions of IFF and NCSC, along with links for NCSC and HHS. Added small grammatical corrections.
Update 10/29: added the note and link to the RAND Corporation in memoriam piece.
Update 12/9: added the mention of the DSB
On October 9th, 2013, I delivered one of the keynote addresses at the ISSA International Conference. I included a number of observations on computing, security, education, hacking, malware, women in computing, and the future of cyber security.
You can see a recording of my talk on YouTube or view it here. You might find it somewhat amusing. See the old guy with the bow tie ramble on.
(If you work in cyber security, you should think about joining the ISSA.)
(Also, if you didn't know, I have two other blogs. One blog is a Tumblr blog feed of various media stories about security, privacy and cybercrime. The other blog is about various personal items that aren't really related to CERIAS, or even necessarily to cyber security — some serious, some not so much.)
Over the last month or two I have received several invitations to speak about cyber security. Perhaps the uptick in invitations is because of the allegations by Edward Snowden and their implications for cyber security. Or maybe it is because news of my recent awards has caught people's attention. Or it could simply be that people want to hear about something other than the (latest) puerile behavior by too many of our representatives in Congress, and I'm an alternative chosen at random. Whatever the cause, I am tempted to accept many of these invitations on the theory that if I refuse too many, people will stop asking, and then I wouldn't get to meet as many interesting people.
As I've been thinking about what topics I might speak about, I've been looking back through the archive of talks I've given over the last few decades. It's a reminder of how many things we, as a field, knew about a long time ago that have been ignored by the vendors and authorities. It's also depressing to realize how little impact I, personally, have had on the practice of information security during my career. But it has also led me to reflect on some anniversaries this year (that happens to us old folk). I'll mention three in particular here, and may use others in some future blogs.
In early November of 1988 the world awoke to news of the first major, large-scale Internet incident. Some self-propagating software had spread around the nascent Internet, causing system crashes, slow-downs, and massive uncertainty. It was really big news. Dubbed the "Internet Worm," it served as an inspiration for many malware authors and vandals, and a wake-up call for security professionals. I recall very well giving talks on the topic for the next few years to many diverse audiences about how we must begin to think about structuring systems to be resistant to such attacks.
Flash forward to today. We don't see the flashy, widespread damage of worm programs any more, such as what Nimda and Code Red caused. Instead, we have stealthier botnets that infiltrate millions of machines and use them for spam, DDoS, and harassment. The problem has gotten larger and worse, although in a manner that hides some of its magnitude from the casual observer. However, the damage is there; don't try to tell the folks at Saudi Aramco or Qatar's RasGas that network malware isn't a concern any more! Worrisomely, experts working with SCADA systems around the world are increasingly warning how vulnerable they might be to similar attacks in the future.
Computer viruses and malware of all sorts first notably appeared "in the wild" in 1982. By 1988 there were about a dozen in circulation. Those of us advocating for more care in design, programming and use of computers were not heeded in the head-long rush to get computing available on every desktop (and more) at the lowest possible cost. Thus, we now have (literally) tens of millions of distinct versions of malware known to security companies, with millions more appearing every year. And unsafe practices are still commonplace -- 25 years after that Internet Worm.
For the second anniversary, consider 10 years ago. The Computing Research Association, with support from the NSF, convened a workshop of experts in security to consider some Grand Challenges in information security. It took a full 3 days, but we came up with four solid Grand Challenges (it is worth reading the full report and (possibly) watching the video):
I would argue -- without much opposition from anyone knowledgeable, I daresay -- that we have not made any measurable progress against any of these goals, and have probably lost ground in at least two.
Why is that? Largely economics, and a bad understanding of what good security involves. The economics aspect is that no one really cares about security -- enough. If security were important, companies would really invest in it. However, they don't want to part with all the legacy software and systems they have, so instead they keep stumbling forward and hope someone comes up with magic fairy dust they can buy to make everything better.
The government doesn't really care about good security, either. We've seen that the government is allegedly spending quite a bit on intercepting communications and implanting backdoors into systems, which is certainly not making our systems safer. And the DOD has a history of huge investment into information warfare resources, including buying and building weapons based on unpatched, undisclosed vulnerabilities. That's offense, not defense. Funding for education and advanced research is probably two orders of magnitude below what it really should be if there was a national intent to develop a secure infrastructure.
As far as understanding security goes, too many people still think that the ability to patch systems quickly is somehow the approach to security nirvana, and that constructing layers and layers of add-on security measures is the path to enlightenment. I no longer cringe when I hear someone who is adept at crafting system exploits referred to as a "cyber security expert," but so long as that is accepted as what the field is all about there is little hope of real progress. As J.R.R. Tolkien once wrote, "He that breaks a thing to find out what it is has left the path of wisdom." So long as people think that system penetration is a necessary skill for cyber security, we will stay on that wrong path.
And that is a great segue into the last of my three anniversary recognitions. Consider this quote (one of my favorites) from 1973 -- 40 years ago -- from a USAF report, Preliminary Notes on the Design of Secure Military Computer Systems, by a then-young Roger Schell:
…From a practical standpoint the security problem will remain as long as manufacturers remain committed to current system architectures, produced without a firm requirement for security. As long as there is support for ad hoc fixes and security packages for these inadequate designs and as long as the illusory results of penetration teams are accepted as demonstrations of a computer system security, proper security will not be a reality.
That was something we knew 40 years ago. To read it today is to realize that the field of practice hasn't progressed in any appreciable way in four decades, except we are now also stressing the wrong skills in developing the next generation of expertise.
Maybe I'll rethink that whole idea of going to give talks on security and simply send them each a video loop of me banging my head against a wall.
PS -- happy 10th annual National Cyber Security Awareness Month -- a freebie fourth anniversary! But consider: if cyber security were really important, wouldn't we be aware of that every month? The fact that we need to promote awareness of it is proof it isn't taken seriously. Thanks, DHS!
Now, where can I find a good wall that doesn't already have dents from my forehead....?
In the June 17, 2013 online interview with Edward Snowden, there was this exchange:
I simply thought I'd point out a statement of mine that first appeared in print in 1997 on page 9 of Web Security & Commerce (1st edition, O'Reilly, 1997, S. Garfinkel & G. Spafford):
Secure web servers are the equivalent of heavy armored cars. The problem is, they are being used to transfer rolls of coins and checks written in crayon by people on park benches to merchants doing business in cardboard boxes from beneath highway bridges. Further, the roads are subject to random detours, anyone with a screwdriver can control the traffic lights, and there are no police.
I originally came up with an abbreviated version of this quote during an invited presentation at SuperComputing 95 (December of 1995) in San Diego. The quote at that time was everything up to the "Further...." and was in reference to using encryption, not secure WWW servers.
A great deal of what people are surprised about now should not be a surprise -- some of us have been lecturing about elements of it for decades. I think Cassandra was a cyber security professor....
[Added 9/10: This also reminded me of a post from a couple of years ago. The more things change....]
Last post, we wrote about the NSA’s secret program to obtain and then analyze the telephone metadata relating to foreign espionage and terrorism by obtaining the telephone metadata relating to everyone. In this post, we will discuss a darker, but somewhat less troubling program called PRISM. As described in public media via leaked PowerPoint slides, PRISM and its progeny constitute a program to permit the NSA, with approval of the super-secret Foreign Intelligence Surveillance Court (FISC), to obtain “direct access” to the servers of internet companies (e.g., AOL, Google, Microsoft, Skype, and Dropbox) to search for information related to foreign terrorism – or more accurately, terrorism and espionage by “non-US persons.”
Whether you believe that PRISM is a wonderful program narrowly designed to protect Americans from terrorist attacks, or a massive government conspiracy to gather intimate information to thwart Americans’ political views, or even a conspiracy to run a false-flag operation to start a space war against alien invaders, what the program actually is, and how it is regulated, depends on how the program operates. When Sir Isaac Newton published his work Opticks in 1704, he described how a PRISM could be used to – well, shed some light on the nature of electromagnetic radiation. Whether you believe that the Booz Allen leaker was a hero, or whether you believe that he should be given the full Theon Greyjoy for treason, there is little doubt that he has sparked a necessary conversation about the nature of privacy and data mining. President Obama is right when he says that, to achieve the proper balance, we need to have a conversation. To have a conversation, we have to have some knowledge of the programs we are discussing.
Unlike the telephony metadata, the PRISM programs involve a different character of information, obtained in a potentially different manner. As reported, the PRISM programs involve not only metadata (header, source, location, destination, etc.) but also content information (e-mails, chats, messages, stored files, photographs, videos, audio recordings, and even interception of voice and video Skype calls).
Courts (including the FISA Court) treat content information differently from “header” information. For example, when the government investigated the ricin-laced letters sent to President Obama and NYC Mayor Michael Bloomberg, they reportedly used the U.S. Postal Service’s Mail Isolation Control and Tracking (MICT) system, which photographs the outside of every letter or parcel sent through the mails – metadata. When Congress passed the Communications Assistance for Law Enforcement Act (CALEA), which among other things established procedures for law enforcement agencies to get access to both “traffic” (non-content) and content information, the FBI took the position that it could, without a wiretap order, engage in what it called “post-cut-through dialed digit extraction” -- that is, when you call your bank and it prompts you to enter your bank account number and password, the FBI wanted to “extract” that information as “traffic,” not “content.” So the lines between “content” and “non-content” may be blurry. Moreover, with enough context, we can infer content. As Justice Sotomayor observed in the 2012 GPS privacy case:
… it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. E.g., Smith, 442 U.S., at 742, 99 S.Ct. 2577; United States v. Miller, 425 U.S. 435, 443, 96 S.Ct. 1619, 48 L.Ed.2d 71 (1976). This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks. People disclose the phone numbers that they dial or text to their cellular providers; the URLs that they visit and the e-mail addresses with which they correspond to their Internet service providers; and the books, groceries, and medications they purchase to online retailers.
But the PRISM program is clearly designed to focus on content. Thus, the Supreme Court’s holding in Smith v. Maryland that people have no expectation of privacy in the numbers they dial does not apply to the PRISM-type information. Right?
Again, not so fast.
Simple question. Do you have a reasonable expectation of privacy in the contents of your e-mail?
Short answer: Yes.
Longer answer: No.
Better answer: Vis-à-vis whom, and for what purposes? You see, privacy is not black and white. It is multispectral – you know, like light through a triangular piece of glass.
When the government was conducting a criminal investigation of the manufacturer of Enzyte (smiling Bob and his gigantic – um – putter), they subpoenaed his e-mails from, among others, Yahoo! The key word here is subpoena – not search warrant. Now that’s the thing about data and databases – if information exists, it can be subpoenaed. In fact, a Florida man has now demanded production of cell location data from – you guessed it – the NSA.
But content information is different from other information. And cloud information is different. The telephone records are the records of the phone company about how you used their service. The contents of e-mails and documents stored in the cloud are your records, of which the provider has incidental custody. It would be like the government subpoenaing your landlord for the contents of your apartment (they could, of course, subpoena you for this, but then you would know), or subpoenaing the U-stor-it for the contents of your storage locker (sparking a real storage war). They could, with probable cause and a warrant, search the locker (if you have a warrant, I guess you’re going to come in), but a subpoena to a third party is dicey.
So the Enzyte guy had his records subpoenaed. This was done pursuant to the Stored Communications Act, which permits it. The government argued that they didn’t need a search warrant to read Enzyte guy’s e-mail because – you guessed it – he had no expectation of privacy in the contents of his mail. Hell, he stored it unencrypted with a third party. Remember Smith v. Maryland? The phone company case? You trust a third party with your records, you risk exposure. Or as Senator Blutarsky (I. NH?) might opine, “you ()*^#)( up, you trusted us…” (actually Otter said that, with apologies to Animal House fans).
Besides, cloud provider contracts, and e-mail and Internet provider privacy policies, frequently limit the privacy rights of users. In the Enzyte case, the government argued that terms of service that permitted scanning of the contents of e-mail for viruses or spam (or, in the case of Gmail and others, embedding context-based ads) meant that the user of the e-mail service “consented” to have his or her mail read, and therefore had no privacy rights in the content. (“Yahoo! reserves the right in their sole discretion to pre-screen, refuse, or move any Content that is available via the Service.”) Terms of service which provided that the ISP would respond to lawful subpoenas made the provider a “joint custodian” of your e-mail and other records (like your roommate), who could consent to the production of your communications or files. Those policies that your employer has that say “employees have no expectation of privacy in their e-mails or files”? While you thought that meant that your boss (and the IT guy) can read your e-mails, the FBI or NSA may take the position that “no expectation of privacy” means exactly that.
Fortunately, most courts don’t go so far. In general, courts have held that the contents of communications and information stored privately online (not on publicly accessible Facebook or Twitter feeds) are entitled to legal protection even if they are in the hands of potentially untrustworthy third parties. But this is by no means assured.
But clearly the data in the PRISM case is more sensitive, and entitled to a greater level of legal protection, than that in the telephony metadata case. That doesn’t mean that the government, with a court order, can’t search or obtain it. It means that companies like Google and Facebook probably can’t just “give it” to the government. It’s not their data.
So the NSA wants to have access to information in a massive database. They may want to read the contents of an email, a file stored on Dropbox, whatever. They may want to track a credit card through the credit card clearing process, or a banking transaction through the interbank funds transfer network. They may want to track travel records – planes, trains or automobiles. All of this information is contained in massive databases or storage facilities held by third parties – usually commercial entities. Banks. VISA/MasterCard. Airlines. Google.
The information can be tremendously useful. The NSA may have lawful authority (a court order) to obtain it. But there is a practical problem. How does the NSA quickly and efficiently seek and obtain this information from a variety of sources without tipping those sources off about the individual searches it is conducting – information which is itself classified? That appears to be the problem the PRISM programs attempt to solve.
In the telephony program, the NSA “solved” the problem by simply taking custody of the database.
In PRISM, they apparently did not. And that is a good thing. The databases remain in the custody of those who created them.
Here’s where it gets dicey – factually.
The reports about PRISM indicate that the NSA had “direct access” to the servers of all of these Internet companies. Reports have been circulating that the NSA had similar “direct access” to financial and credit card databases as well. The Internet companies have all issued emphatic denials. So what gives?
Speculation time. First, the NSA and the Internet companies could be outright lying. But David Drummond, Google’s Chief Legal Officer, isn’t going to jail for this. Second, they could be reinterpreting the term “direct” access. When General Alexander testified under oath that the NSA did not “collect any type of data on millions of Americans,” he took the term “collect” to mean “read” rather than “obtain.”
Most likely, however, is that the NSA PRISM program is a protocol for the NSA, with FISC approval, to task the computers at these Internet companies to perform a search. This tasking is most likely indirect. How it works is, at this point, rank speculation. What is likely is that an NSA analyst, say in Honolulu, wants to get the communications (postings, YouTube videos, stored communications, whatever) of Abu Nazir, a non-U.S. person, which are stored on a server in the U.S., or stored on a server in the cloud operated by a U.S. company. The analyst gets “approval” for the “search,” by which I mean that a flock of lawyers from the NSA, FBI and DOJ descend (what is the plural of lawyers? [a "plague"? --spaf]) and review the request to ensure that it asks for info about a non-U.S. person, that it meets the other FISA requirements, that there is minimization, etc. Then the request is transmitted to the FISC for a warrant. Maybe. Or maybe the FISC has approved the searches in bulk (raising the Writ of Assistance issue we described in the previous post). We don’t know. But assuming that the FISC approves the “search,” the request has to be transmitted to, say, Google, for their lawyers to review, and then the data transmitted back to the NSA. To the analyst in Honolulu, it may look like “direct access.” I type in a search, and voilà! Results show up on the screen. It is this process that appears to be within the purview of PRISM. It may be a protocol for effectuating court-approved access to information in a database, not direct access to the database.
Or maybe not. Maybe it is a direct pipe into the servers, which the NSA can task, and from which the NSA can simply suck out the entire database and perform its own data analytics. Doubtful, but who knows? That’s the problem with rank speculation. Aliens, anyone?
But we are basing this analysis on what we believe is reasonable to assume.
So, is it legal? Situation murky. Ask again later.
If the FISC approves the search, with a warrant, within the scope of the NSA’s authority, on a non-U.S. person, with minimization, then it is legal in the U.S., while probably violating the hell out of most EU and other data privacy laws. But that is the nature of the FISA law and the USA PATRIOT Act, which amended it. Like the PowerPoint slides said, most Internet traffic travels through the U.S., which means we have the ability (and, under USA PATRIOT, the authority) to search it.
While the PRISM programs are targeted at much more sensitive content information, if conducted as described above, they actually present fewer domestic legal issues than the telephony metadata case. If they are a dragnet, or if the NSA is actually conducting data mining on these databases to identify potential targets, then there is a bigger issue.
The government has indicated that they may release an unclassified version of at least one FISC opinion related to this subject. That’s a good thing. Other redacted legal opinions should also be released so we can have the debate President Obama has called for. And let some light pass through this PRISM.
† Mark Rasch is the former head of the United States Department of Justice Computer Crime Unit, where he helped develop the department’s guidelines for computer crimes related to investigations, forensics and evidence gathering. Mr. Rasch is currently a principal with Rasch Technology and Cyberlaw and specializes in computer security and privacy.
‡ Sophia Hannah has a BS degree in Physics with a minor in Computer Science and has worked in scientific research, information technology, and as a computer programmer. She currently manages projects with Rasch Technology and Cyberlaw and researches a variety of topics in cyberlaw.
Rasch Cyberlaw (301) 547-6925 www.raschcyber.com