CERIAS - Center for Education and Research in Information Assurance and Security


CERIAS Blog


Spaf videos, blasts from the past, future thoughts


I created a YouTube channel a while back, and began uploading my videos and linking in videos of me that were online. Yes, it’s a dedicated Spaf channel! However, I’m not on camera eating Tide pods, or doing odd skateboard stunts. This is a set of videos with my research and views over the years on information (cyber) security, research, education, and policies.

There are two playlists under the channel — one for interviews that people have conducted with me over the years, and the other being various conference and seminar talks.

One of the seminar talks is one I gave at Bellcore on the Internet Worm — about six weeks after it occurred (yes, that’s 1988)! Many of my observations and recommendations in that talk seem remarkably current — which I don’t think says anything good about how current practice has (not) evolved.

My most recent talk/video is a redo of my keynote address at the 2017 CISSE conference, held that June in Las Vegas. The talk specifically addresses what I see as the needs in current information security education. CISSE was unable to record it at the time, so I redid it for posterity based on my speaker notes. It runs only about 35 minutes (there were no introductions or Q&A to field), so it is a quicker watch than being at the conference!

I think there are some other goodies in all of those videos, including views of my bow ties over the years, plus some of my predictions (most of which seem to have been pretty good). However, I am putting these out without having carefully reviewed them — there may be some embarrassing goofs among the (few) pearls of wisdom. It is almost certain that many things changed away from the operational environment that existed at the time I gave some of these talks, so I’m sure some comments will appear “quaint” in retrospect. However, I decided that I would share what I could because someone, somewhere, might find these of value.

If you know of a recording I don’t have linked in to one of the lists, please let me know.

Comments appreciated. Give it a look!

How far do warrants reach in “The Cloud”?


There is a case currently (early 2018) pending before the Supreme Court of the United States (SCOTUS) addressing if/how a US warrant applies to data held in a cloud service outside the US but run by a US entity.

The case is United States vs. Microsoft, and is related to interpretation of 18 U.S.C. § 2703 — part of the Stored Communications Act.

The case originated when US authorities attempted to serve a warrant on Microsoft to retrieve email of a user whose email was serviced by MS cloud servers in Ireland. Microsoft asserted the data resided in Ireland and the US warrant did not extend outside the US. The US contends that the warrant can be fully served inside the US by Microsoft and no foreign location is involved. Microsoft sued to vacate. The district court upheld the government, and found Microsoft in contempt for not complying. On appeal, the 2nd Circuit Court of Appeals overturned that decision (and the contempt citation), and remanded the case for reconsideration. The US government sought and obtained a writ of certiorari (basically, sought a hearing before SCOTUS to consider that appellate ruling). The oral arguments will be heard the last week in February.

The decision in the case has some far-reaching consequences, not least of which is that if the warrant is allowed, it is likely to drive business away from US providers of cloud services — clients outside the US will be concerned that the US could compel production of their data. At the same time, if the warrant is not allowed, it could mean that service providers could spring up serving data out of one or more locations that routinely ignore US attempts to cooperate on computer crime/terrorism investigations. (Think of the cloud equivalent of banking havens such as the Cayman Islands, Vanuatu, and the Seychelles.) Neither result is particularly appealing, but it seems (to me) that under current law the warrant cannot be enforced.

I signed on to an amicus (friend of the court) brief, along with 50 other computing faculty. Our brief is not explicitly in favor of either side in the dispute, but is intended to help educate the court about how cloud services operate, and that data does actually have a physical location.

If you are interested in reading the other briefs — including several from other amici (“friends of the court”) — there are links from the SCOTUS blog about the case. It is interesting to note the perspectives of the EU and Irish governments, trade associations, former law enforcement and government officials, and more. The general consensus of the ones I read seems to me to favor Microsoft in this case. We shall have to see if the SCOTUS agrees, and, if so, whether Congress then acts to set new law in the area.

This case is an example of the difficulties that arise when we have few barriers in network communications and data flows across political borders. It is, in some sense, analogous to the “going dark” concerns of the FBI. How do we maintain privacy in an arena where bad actors use the technology to “hide” what they do, potentially forever beyond reach of law enforcement? Furthermore, how do we enforce the rules of law in an environment where some of the legal authorities are ideologically opposed to privacy rights or rule of law as envisioned by other authorities? It is also related to searches of computing devices carried across borders (including cell phones), and similar instances where the attempt has been made to treat the presence of end points or corporate operators as somehow including the data accessible via those end points. All of these are problems that the technology aggravates but that are unlikely — if not impossible — to be solved by technology alone.

Interesting times, no matter which side of these matters one is normally likely to support.

This is the 3rd amicus brief before the SCOTUS to which I have been a signatory, and one of 10 overall. (This is very different from publishing academic papers!)

Another good one gone too soon


Today, I attended the funeral in Illinois of another good friend in infosec: Ken Olthoff. Ken was my friend for over 25 years, and his death was a surprise to me and to everyone who knew him. It was also a significant loss to the field, and another sad reminder that each of us needs to live our goals sooner rather than later. The funeral included a great set of remembrances of some aspects of Ken’s life and contributions, with the service conducted by his cousin, Pastor Diane Maodush-Pitzer.

Kenneth George Olthoff was born November 18, 1959. He grew up outside Chicago in Thornton, and received a degree from Purdue Calumet. His family remembers him exhibiting, at a young age, great curiosity about how things worked and clear engineering aptitude. Around three decades ago, he joined the NSA, where he worked until his untimely passing.

[Photo: Ken Olthoff]

Ken was on leave to visit family in Illinois in early October, as he did twice each year. Along with visiting his relatives, he engaged in some repairs to his childhood home — where he planned to retire in a few years, using it as a base for travel. On this most recent visit, he worked his way through his “to do” list, with the last item being his annual long-distance bike ride of 60+ miles (Ken did a lot of recumbent bicycling all year round). He then had dinner with his brother, Jack, and sister-in-law, Sue. Jack tried to reach him by phone on Sunday, October 15, and when he did not get an answer, he went to check on him. Jack found Ken sitting in a recliner, in front of the TV. He had died, peacefully, during the night. The medical examiner listed the cause of death as cardiovascular-related. Ken would have been 58 next month.

Ken had many “families” in which he was connected. I think Vonnegut’s concept of the “karass” may be a more accurate characterization. Ken had a wide-ranging curiosity and set of interests that created bridges to all sorts of people. Notably, Ken was a hardworking, creative, and valued contributor to national information security solutions. He wasn’t always acknowledged (or even known outside where he worked) for what he did, but many of the people who worked with him treasured his positive contributions. Ken’s commitment to “speak truth to power” sometimes grated on a few, but more often was valued within a community that sometimes has been too quick to buy into the “emperor’s new wardrobe.” I know a little of what Ken did at the Agency, and I have heard from others who knew his work better than I do (because some of it was classified and on a need-to-know basis); more than one of these people has commented that there were many in military service who made it home — alive, to their families — because of things Ken designed or built.

Ken was notable in the broader cybersecurity community, too, although not as well-known as many others. Whether it was as the first person ever identified in the “Spot-the-Fed” at DefCon, or writing outrageous plays about security foibles for performance at NSPW, or any of a number of other activities, Ken also had many admirers and friends outside of where he worked.

Ken was also, in the words of a friend, “… an avid disc golfer and recumbent bike rider, collector of Japanese prints and wood turnings, fan of authentic ethnic cuisines, aficionado of the Chicago music scene (particularly loyal to Pezband), fan and supporter of dirt track racing and youth hockey, and patron and production crew member for Charm City Roller Girls, and the AXIS Theatre and Rapid Lemon Productions companies in Baltimore.” He ran several mailing lists on these topics (and more), with eclectic and interesting memberships that evidenced a broad set of interests beyond even these. I learned today that he held at least five patents, on topics ranging from cyber security mechanisms to accessories for musical instruments!

Anyone who knew Ken also remembers his amazing sense of humor (and/or puns), his humility, his generosity, and his (frequent) lack of awareness of pop culture items. Ken was too busy living life to be a regular on social media!

Ken had posted at some point on his LinkedIn profile: “Goals: make positive use of the skills I have, save lives, leave the world a slightly better place than I found it, be a loyal friend, be honest, live my life in a way that gives others something to be thankful for.” I think those of us who knew him will agree that he lived those goals, achieved all of them, and often exceeded them. I am sad that I didn’t have an opportunity to tell him roughly that — I had resolved to do so after our mutual close friend, Becky Bace (who introduced Ken to me), died suddenly earlier this year, but our schedules did not align soon enough.

You are invited to visit Ken's (brief) online obituary and guest book. The family has indicated that memorial contributions may be given to the American Heart Association.

(I hope the rest of the infosec community remains hale and hearty for a while — we’ve had too many losses recently.)




[I will add to this post if I get other information. In particular, I hope to be able to provide links to his NSPW plays.]

A Blast from the Past


In December of 1988, I was invited to speak at Bell Communications Research (Bellcore) about the Morris Internet Worm that had been released about six weeks before. The invitation was to speak on computer security in general, malicious software more specifically, and particularly “The Worm.”

At the time, I was a new assistant professor — I had joined the faculty at Purdue in August of 1987. This was only my second-ever presentation on computing security issues, although I had been working in the area for years. Note that this was well before I had coauthored either the Computer Virus book or Practical Unix Security.

The title of the talk was Worms, Viruses, and Other Programmed Pests. I went on to give a variant of this presentation about 2 dozen times in the year following this talk.

I had forgotten that I had a copy of this video stored away. I’m sharing it now for historical purposes (and for some of my friends, hysterical purposes).

I think that this talk has aged very well, considering it was given nearly 30 years ago. Most of what I talk about here (but not all) is still relevant. Clearly, a number of the examples and numbers have changed drastically since then, but some of the most significant aspects have remained unchanged. Much of the advice I gave then could be given today because it still applies… and still is largely ignored. In particular, check out the Q&A at the end.

You can tell this video is really old, not only because of the video artifacts, but because:

  1. I am wearing a normal tie (I switched to bow ties exclusively a few years later)
  2. I am making the presentation using acetates instead of from a computer
  3. I have almost a full head of hair
  4. I only had a waistline in double digits.

You'll also note that I had the odd sense of humor even then. Oh, and I used the Oxford comma in the title.

Enjoy.


(Direct link to YouTube page here.)

Purdue CERIAS Researchers Find Vulnerability in Google Protocol


[This is posted on behalf of the three students listed below. This is yet another example of bad results when speed takes precedence over doing things safely. Good work by the students! --spaf]




As part of an INSuRE project at Purdue University, PhD Information Security student Robert Morton and Computer Science seniors Austin Klasa and Daniel Sokoler conducted an observational study of Google’s QUIC protocol (Quick UDP Internet Connections, pronounced “quick”). The team found that QUIC leaks the length of a password, potentially allowing eavesdroppers to bypass authentication in popular services such as Google’s Gmail. The team named the vulnerability Ring-Road and is currently trying to quantify the potential damage.

During the initial stages of the research, the Purdue team found that the Internet has been transformed over the last five years by a new suite of performance-improving communication protocols such as SPDY, HTTP/2, and QUIC. These new protocols are being rapidly adopted to increase the speed and performance of applications on the Internet. More than 10% of the top 1 million websites are already using some of these technologies, including many of the 10 highest-traffic sites.

While these new protocols have improved speed, the Purdue team focused on determining if any major security issues arose from using QUIC. The team was astonished to find that Google's QUIC protocol leaks the exact length of sensitive information when transmitted over the Internet. This could allow an eavesdropper to learn the exact length of someone's password when signing into a website. In part, this negates the purpose of the underlying encryption, which is designed to keep data confidential -- including its length.

In practice, the Purdue team found that QUIC leaks the exact length of passwords sent to commonly used services such as Google’s Gmail. The Purdue team then created a proof-of-concept exploit to demonstrate the potential damage:

Step 1 - The team sniffed a target network to identify the password length from QUIC.

Step 2 - The team optimized a password dictionary to the identified password length (a minimal sketch of this step appears below).

Step 3 - The team automated an online attack to bypass authentication into G-mail.
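As a rough illustration of Step 2 (and not the team’s actual tooling), the minimal Python sketch below narrows a candidate wordlist to entries matching an observed password length. The file name and the observed length are hypothetical placeholders.

    # Minimal sketch of Step 2: keep only dictionary entries whose length matches
    # the plaintext length inferred from captured QUIC traffic.
    # Illustrative only; "wordlist.txt" and observed_length are hypothetical.

    observed_length = 9  # plaintext length inferred from the ciphertext size

    with open("wordlist.txt", encoding="utf-8", errors="ignore") as f:
        candidates = [line.rstrip("\n") for line in f]

    filtered = [pw for pw in candidates if len(pw) == observed_length]
    print(f"{len(filtered)} of {len(candidates)} candidates match the leaked length")

Shrinking the dictionary this way makes the online guessing in Step 3 far cheaper, which is why leaking even "just" the length matters.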

The Purdue team believes the root cause of this problem is Google’s choice of a particular encryption method in QUIC: the Advanced Encryption Standard Galois/Counter Mode (AES-GCM). AES-GCM is a mode of encryption often adopted for its speed and performance. By default, AES-GCM ciphertext is the same length as the original plaintext. For short communications such as passwords, exposing the length can be damaging when combined with other contextual clues to bypass authentication, and therein lies the problem.
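To make that property concrete, here is a minimal Python sketch (using the third-party cryptography package, which is my own choice for illustration and not something the team describes using) showing that an AES-GCM ciphertext tracks the plaintext length exactly, differing only by the fixed 16-byte authentication tag:

    # Demonstration that AES-GCM output length tracks input length exactly.
    # Requires the third-party "cryptography" package (pip install cryptography);
    # this is an illustrative sketch, not the code used in the QUIC study.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)
    aesgcm = AESGCM(key)

    for password in [b"cat", b"hunter2", b"correcthorsebatterystaple"]:
        nonce = os.urandom(12)                      # fresh 96-bit nonce per message
        ct = aesgcm.encrypt(nonce, password, None)  # ciphertext = plaintext + 16-byte tag
        print(len(password), len(ct))               # the difference is always 16 bytes

Because the overhead is constant, anyone who can observe the encrypted record length can recover the plaintext length by simple subtraction, once any framing overhead is accounted for.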

Conclusion

In summary, there seems to be an inherent trade-off between speed and security. As new protocols emerge on the Internet, these new technologies should be thoroughly tested for security vulnerabilities in a real-world environment. Google has been informed of this vulnerability and is currently working on a patch to protect its users. Until a fix is available, we recommend that users and system administrators disable QUIC in Chrome and on their servers by visiting this link. We also recommend -- independent of this issue -- that users consider enabling two-step verification on their Gmail accounts, for added protection, as described here. The Purdue team will be presenting their talk and proof-of-concept exploit against Gmail at the upcoming CERIAS Symposium on 18 April 2017.

Additional Information

To learn more, please visit ringroadbug.com and check out the video of our talk, "Making the Internet Fast Again... At the Cost of Security," at the CERIAS Symposium on 18 April 2017.

Acknowledgements

This research is a part of the Information Security Research and Education (INSuRE) project. The project was under the direction of Dr. Melissa Dark and Dr. John Springer, and was assisted by technical directors from the Information Assurance Directorate of the National Security Agency.

INSuRE is a partnership between successful and mature Centers of Academic Excellence in Information Assurance Research (CAE-R) and the National Security Agency (NSA), the Department of Homeland Security, and other federal and state agencies and laboratories to design, develop, and test a cybersecurity research network. INSuRE is a self-organizing, cooperative, multi-disciplinary, multi-institutional, and multi-level collaborative research project that can include both unclassified and classified research problems in cybersecurity.

This work was funded under NSF grant award No. 1344369. Robert Morton, the PhD Information Security student, is supported under the Scholarship For Service (SFS) Fellowship NSF grant award No. 1027493.

Disclaimers

Any opinions, findings, or conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation, CERIAS, Purdue University, or the National Security Agency.