Proposed Changes in Export Control
The U.S. limits the export of certain high-tech items that might be used inappropriately (from the government’s point of view). This is intended to prevent (or slow) the spread of technologies that could be used in weapons, used in hostile intelligence operations, or used against a population in violation of their rights. Some are obvious, such as nuclear weapons technology and armor-piercing shells. Others are clear after some thought, such as missile guidance software and hardware, and stealth coatings. Some are not immediately clear at all, and may have some benign civilian uses too, such as supercomputers, some lasers, and certain kinds of metal alloys.
Recently, there have been some proposed changes to the export regulations for some computing-related items. In what follows, I will provide my best understanding of both the regulations and the proposed changes. This was produced with the help of one of the professional staff at Purdue who works in this area, and also a few people in USACM who provided comments (I haven’t gotten permission to use their names, so they’re anonymous for now). I am not an expert in this area so please do not use this to make important decisions about what is covered or what you can send outside the country! If you see something in what follows that is in error, please let me know so I can correct it. If you think you might have an export issue under this, consult with an appropriate subject matter expert.
Export Control
Some export restrictions are determined, in a general way, as part of treaties (e.g., nuclear non-proliferation). A large number stem from the Wassenaar Arrangement, a multinational effort by 41 countries generally considered to be allies of the U.S., including most of NATO; a few major countries such as China are not members, nor are nominal allies such as Pakistan and Saudi Arabia (to name a few). The Wassenaar group meets regularly to review technology and determine restrictions, and it is up to the member states to pass rules or legislation for themselves. The intent is to help promote international stability and safety, although countries outside Wassenaar might not view it that way.
In the U.S., there are two primary regulatory regimes for exports: ITAR and EAR. ITAR is the International Traffic in Arms Regulations, administered by the Directorate of Defense Trade Controls at the Department of State. ITAR restricts the sale and export of items of primary (or sole) use in military and intelligence operations. The EAR is the Export Administration Regulations, administered by the Bureau of Industry and Security at the Department of Commerce. EAR rules generally cover “dual use” items, which have both military and civilian uses.
These are extremely large, dense, and difficult to understand sets of rules. I had one friend label these as “clear as mud.” After going through them for many hours, I am convinced that mud is clearer!
Items designed explicitly for civilian applications without consideration of military use, or with known dual-use characteristics, are not subject to the ITAR, because dual-use and commodity items are explicitly exempted from ITAR rules (see sections 121.1(d) and 120.41(b) of the ITAR). However, being exempt from ITAR does not make an item exempt from the EAR!
If any entity in the US — company, university, or individual — wishes to export an item that is covered under one of these two regimes, that entity must obtain an export license from the appropriate office. The license will specify what can be exported, to what countries, and when. Any export of a controlled item without a license is a violation of Federal law, with potentially severe consequences. What constitutes an export is broader than some people may realize, including:
- Shipping something outside the U.S. as a sale or gift is an export, even if only a single instance is sent.
- Sending something outside the U.S. under license, knowing (or suspecting) it will then be sent to a third location, is a violation.
- Providing a controlled item to a foreign-controlled company or organization even if in the U.S. may be an export.
- Providing keys or passwords that would allow transfer of controlled information or materials to a foreign national is an export.
- Designing or including a controlled item in something that is not controlled, or which separately has a license, and exporting that may be a violation.
- Giving a non-U.S. person (someone not a citizen or permanent resident) access to an item to examine or use may be an export.
- Providing software, drawings, pictures, or data on the Internet, on a DVD, on a USB stick, etc., to a non-U.S. person may be an export.
Those last two items are what is known as a deemed export, because the item didn’t leave the U.S., but information about it is given to a non-U.S. person. There are many other special cases of export, and nuances (giving control of a spacecraft to a foreign national is prohibited, for example, as are certain forms of reexport). This is all separate from disclosure of classified materials, although if you really want trouble, you can do both at the same time!
This whole export thing may seem a bit extreme or silly, especially when you look at the items involved, but it isn’t: economic and military espionage to get this kind of material and information is a big problem, even at research labs and universities. Countries that don’t have the latest and greatest factories, labs, and know-how are at a disadvantage both militarily and economically. For instance, a country (e.g., Iran) that doesn’t have advanced metallurgy and machining may not be able to make specialized items (e.g., the centrifuges to separate fissionable uranium), so it will attempt to steal or smuggle the technology it needs. The next best approach is to get whatever knowledge is needed to recreate the expertise locally. You only need to look at the news over a period of a few months to see many stories of economic theft and espionage, as well as state-sponsored incidents.
This brings us to the computing side of things. High-speed computers, advanced software packages, cryptography, and other items all have benign commercial uses. However, they all also have military and intelligence uses. High-speed computers can be used in weapons guidance and design systems, advanced software packages can be used to model and refine nuclear weapons and stealth vehicles, and cryptography can be used to hide communications and data. As such, there are EAR restrictions on many of these items. However, because the technology is so ubiquitous and the economic benefit to the U.S. is so significant, the restrictions have been fairly reasonable to date for most items.
Exemptions
Software is a particularly unusual item to regulate. The norm in the community (for much of the world) is to share algorithms and software. By its nature, huge amounts of software can be copied onto small artifacts and taken across the border, or published on an Internet archive. In universities we regularly give students from around the world access to advanced software, and we teach software engineering and cryptography in our classes. Restrictions on these kinds of items would be difficult to enforce and, in some cases, simply silly.
Thus, the BIS export rules contain a number of exemptions that remove some items from control entirely. (In the following, designations such as 5D002 refer to classes of items as specified in the EAR, and 734.8 refers to section 734 paragraph 8.)
- EAR 734.3(b.3) exempts technology, except software classified under 5D002 (primarily cryptography), if it
  - arises from fundamental research (described in 734.8), or
  - is already published or will be published (described in 734.7), or
  - is educational information (described in 734.9).
- EAR 734.7 defines publication as appearing in books, journals, or any media that is available for free or at a price not to exceed the cost of distribution; or freely available in libraries; or released at an open gathering (conference, meeting, or seminar) open to the qualified public; or otherwise made available.
- EAR 734.8 defines fundamental research as research whose results are ordinarily published and shared broadly in the course of that research.
- EAR 734.9 defines educational information (which is excluded from the EAR) as information that is released by instruction in catalog courses and associated teaching laboratories of academic institutions. This provision applies to all information, software or technology except certain encryption software, but if the source code of encryption software is publicly available as described in 740.13(e), it can also be considered educational information.
- We still have some deemed export issues if we are doing research that doesn’t meet the definition of fundamental research (e.g., because it requires signing an NDA or there is a publication restriction in the agreement) and a researcher or recipient involved is not a US person (citizen or permanent resident) employed full time by the university, and with a permanent residence in the US, and is not a national of a D:5 country (Afghanistan, Belarus, Burma, the CAR, China (PRC), the DRC, Côte d’Ivoire, Cuba, Cyprus, Eritrea, Fiji (!), Haiti, Iran, Iraq, DPRK, Lebanon, Liberia, Libya, Somalia, Sri Lanka, Sudan, Syria, Venezuela, Vietnam, Zimbabwe). However, that is easily flagged by contracts officers and should be the norm at most universities or large institutions with a contracts office.
- EAR 740.13(d) exempts certain mass-market software that is sold retail and installed by end-users so long as it does not contain cryptography with keys longer than 64 bits.
The exemption for publication is interesting. Anyone doing research on controlled items appears to have an exemption under EAR 740.13(e): they can publish (including posting on the Internet) the source code from research that falls under ECCN 5D002 (typically, cryptography) without restriction, but must notify BIS and NSA of digital publication (email is fine; see 740.13(e.3)); there is no restriction or notification requirement for non-digital print. What is not included is any publication or export (including deemed export) of cryptographic devices or object code not otherwise exempt (object code whose corresponding source code is exempt is itself exempt), or knowing export to one of the prohibited countries (E:1 from supplement 1 of section 740: Cuba, Iran, DPRK, Sudan, and Syria, although Cuba may have just been removed).
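To make my reading of that exemption logic easier to follow, here is a minimal sketch written as code. It is purely illustrative and is NOT legal advice; the class, field names, and decision order are my own invention, and real determinations depend on many details this sketch omits.

```python
from dataclasses import dataclass

# Hypothetical illustration of the exemptions described above; this is NOT legal
# advice. Field names and the decision order are my own reading of EAR 734.3(b.3),
# 734.7, 734.8, 734.9, and 740.13(e). Consult an export-control officer for real cases.

@dataclass
class Item:
    is_5d002_source_code: bool = False       # cryptographic source code (ECCN 5D002)
    publicly_available: bool = False         # published, or will be published, per 734.7
    bis_nsa_notified: bool = False           # 740.13(e.3) notification sent for digital publication
    from_fundamental_research: bool = False  # 734.8: results ordinarily published
    is_educational: bool = False             # 734.9: catalog courses and teaching labs

def appears_exempt(item: Item) -> bool:
    """Rough approximation of the exemption logic discussed above."""
    if item.is_5d002_source_code:
        # 740.13(e): publicly available crypto source code may be published,
        # but digital publication requires notifying BIS and NSA (email suffices).
        return item.publicly_available and item.bis_nsa_notified
    # 734.3(b.3): other technology is exempt if any one of these applies.
    return (item.from_fundamental_research
            or item.publicly_available
            or item.is_educational)

# Example: crypto source code from published research, with the notification sent.
print(appears_exempt(Item(is_5d002_source_code=True,
                          publicly_available=True,
                          bis_nsa_notified=True)))   # True
```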
As part of an effort to harmonize the EAR and ITAR, a proposed revision to both was published on June 3, 2015 (80 FR 31505); it has a nice side-by-side chart of some of these exemptions, along with some small suggested changes.
Changes
The Wassenaar group agreed to some changes in December 2013 to include intrusion software and network monitoring items of certain kinds on their export control lists. The E.U. adopted new rules in support of this in October of 2014. On May 20, 2015, the Department of Commerce published — in the Federal Register (80 FR 28853) — a request for comments on its proposed rule to amend the EAR. Specifically, the notice stated:
The Bureau of Industry and Security (BIS) proposes to implement the agreements by the Wassenaar Arrangement (WA) at the Plenary meeting in December 2013 with regard to systems, equipment or components specially designed for the generation, operation or delivery of, or communication with, intrusion software; software specially designed or modified for the development or production of such systems, equipment or components; software specially designed for the generation, operation or delivery of, or communication with, intrusion software; technology required for the development of intrusion software; Internet Protocol (IP) network communications surveillance systems or equipment and test, inspection, production equipment, specially designed components therefor, and development and production software and technology therefor. BIS proposes a license requirement for the export, reexport, or transfer (in-country) of these cybersecurity items to all destinations, except Canada. Although these cybersecurity capabilities were not previously designated for export control, many of these items have been controlled for their "information security" functionality, including encryption and cryptanalysis. This rule thus continues applicable Encryption Items (EI) registration and review requirements, while setting forth proposed license review policies and special submission requirements to address the new cybersecurity controls, including submission of a letter of explanation with regard to the technical capabilities of the cybersecurity items. BIS also proposes to add the definition of "intrusion software" to the definition section of the EAR pursuant to the WA 2013 agreements. The comments are due Monday, July 20, 2015.
The actual modifications are considerably more involved than the above paragraph, and you should read the Federal Register notice to see the details.
This proposed change has caused some concern in the computing community, perhaps because the EAR and ITAR are so difficult to understand, and because of the recent pronouncements by the FBI seeking to mandate “back doors” into communications and computing.
The stated genesis of the proposed changes is to match the Wassenaar additions of (basically) methods of building, controlling, and inserting intrusion software; technologies related to the production of intrusion software; and technology for IP network analysis and surveillance, or for the development and testing of same. These are changes intended to support national security, regional stability, and counterterrorism.
According to the notice, intrusion software includes items that are intended to avoid detection or defeat countermeasures (such as address randomization and sandboxing), and to exfiltrate data or change execution paths to provide for execution of externally provided instructions. Debuggers, hypervisors, reverse-engineering tools, and other software tools are exempted. Software and technology designed or specially modified for the development, generation, operation, or delivery of, or communication with, intrusion software is controlled, not the intrusion software itself. It is explicitly stated that rootkits and zero-day exploits will presumptively be denied licenses for export.
The proposed changes for networking equipment/systems would require that the equipment have all five of the following characteristics to become a controlled item (a rough sketch of this test, as code, follows the list):
- Operates on a carrier-class IP network (e.g., a national-grade backbone)
- Performs analysis at OSI layer 7
- Extracts metadata and content and indexes what it extracts
- Executes searches based on hard selectors (e.g., name, address)
- Performs mapping of relational networks among people or groups
Equipment specially designed for QoS, QoE, or marketing is exempt from this classification.
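As a purely illustrative sketch (my reading of the proposed rule, not its text), the all-five test plus the QoS/QoE/marketing carve-out could be expressed as follows; the type and field names are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch of the proposed "all five characteristics" test; it is not
# the regulatory text. Field names are my own; see 80 FR 28853 for the details.

@dataclass
class NetworkSystem:
    carrier_class_ip: bool          # operates on a carrier-class IP network (national-grade backbone)
    layer7_analysis: bool           # performs analysis at OSI layer 7
    indexes_extracted_data: bool    # extracts and indexes metadata and content
    hard_selector_search: bool      # searches on hard selectors (e.g., name, address)
    maps_relational_networks: bool  # maps relational networks among people or groups
    qos_qoe_or_marketing_only: bool = False  # specially designed for QoS, QoE, or marketing

def would_be_controlled(system: NetworkSystem) -> bool:
    """Controlled only if it has all five characteristics and is not in the carve-out."""
    if system.qos_qoe_or_marketing_only:
        return False
    return all([system.carrier_class_ip,
                system.layer7_analysis,
                system.indexes_extracted_data,
                system.hard_selector_search,
                system.maps_relational_networks])

# Example: a layer-7 QoS appliance that indexes nothing would not be controlled.
print(would_be_controlled(NetworkSystem(True, True, False, False, False)))  # False
```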
Two other proposed changes would remove the 740.13(d) exemption for mass-market products, and would require that software controlled by one of these new sections and containing encryption be dual-listed in two categories. There are other changes for wording, cleaning up typos, and so on.
I don’t believe there are corresponding changes to ITAR because these all naturally fall under the EAR.
Discussion
Although social media has had a number of people posting vitriol and warnings of the impending Apocalypse, I can’t see it in this change. If anything, this could be a good thing — people who are distributing tools to build viruses, botnets, rootkits and the like may now be prosecuted. The firms selling network monitoring equipment that is being used to oppress political and religious protesters in various countries may now be restrained. The changes won’t harm legitimate research and teaching, because the exemptions I listed above will still apply in those cases. There are no new restrictions on defensive tools. There are no new restrictions on cryptography.
Companies and individuals making software or hardware that will fall under these new rules will now have to go through the process of applying for export licenses. Those entities may also find their markets reduced. I suspect that it is a small population that will be subject to such a restriction, and for some of them, given their histories, I’m not at all bothered by the idea.
I have seen some analyses that claim that software to jailbreak cellphones might now be controlled. However, if published online without fee (as is often the case), it would be exempt under 734.7. It arguably is a debugging tool, which is also exempt.
I have also seen claims that any technology for patching would fall under these new rules. Legitimate patching doesn’t seek to avoid detection or defeat countermeasures, which are specifically defined as “techniques designed to ensure the safe execution of code.” Thus, legitimate patching won’t fall within the scope of control.
Jennifer Granick wrote a nice post about the changes. She rhetorically asked at the end whether data loss prevention tools would fall under this new rule. I don’t see how — those tools don’t operate on national grade backbones or index the data they extract. She also posed a question about whether this might hinder research into security vulnerabilities. Given that fundamental research is still exempt under 734.8 as are published results under 734.7, I don’t see the same worry.
The EFF also posted about the proposed rule changes, with some strong statements against them. Again, the concern they stated is about researchers and the tools they use. As I read the EAR and the proposed rule, this is not an issue if the source code for any tools that are exported is published, as per 734.7. The only concern would be if the tools were exported and the source code was not publicly available, i.e., private tools exported. I have no idea how often this happens; in my experience, either the tools are published or else they aren’t shared at all, and neither case runs afoul of the rule. The EFF post also tosses up fuzzing, vulnerability reporting, and jailbreaking as possible problems. Fuzzing tools might possibly be a problem under a strict interpretation of the rules, but the research and publication exemptions would seem to come to the rescue. Jailbreaking I addressed above. I don’t see how reporting vulnerabilities would be export of technology or software for building or controlling intrusion software, so maybe I don’t understand the point.
At first I was concerned about how this rule might affect research at the university, or the work at companies I know about. As I have gotten further into it, I am less and less worried. It seems that there are very reasonable exceptions in place, and I have yet to see a good example of something we might legitimately want to do that would now be prohibited under these rules.
However, your own reading of the proposed rule changes may differ from mine. If so, note the difference in a comment to this essay and I’ll either respond privately or post your comment. Of course, commenting here won’t affect the rule! If you want to do that, you should use the formal comment mechanism listed in the Federal Register notice, on or before July 20, 2015.
Update July 17: The BIS has an (evolving) FAQ on the changes posted online. It makes clear the exemptions I described above. The regulations only cover tools specially designed to design, install, or communicate with intrusion software as they define it. Sharing of vulnerabilities and proof of exploits is not regulated. Disclosing vulnerabilities is not regulated so long as the sharing does not include tools or technologies to install or operate the exploits.
As per the two blog posts I cite above:
- research into security vulnerabilities is explicitly exempt so long as it is simply the research
- export of vulnerability toolkits and intrusion software would be regulated if those tools are not public domain
- fuzzing is explicitly listed as exempt because it is not specifically for building intrusion software
- jailbreaking is exempt, as are publicly available tools for jailbreaking. Tools to make jailbreaks would likely be regulated.
Look at the FAQ for more detail.