The security industry files formal objections to Wassenaar proposal

Export restrictions classifying certain cybersecurity technology as arms caused industry outrage.


July 22, 2015

The comment period is over

Companies, security pros, and human rights advocates had until midnight Tuesday to weigh in on a controversial proposal meant to limit US exports of surveillance technology.  

The intention behind the federal government’s proposed implementation of the Wassenaar Arrangement, a 41-country arms control pact, was to limit the overseas sale of spyware – especially to oppressive regimes.

But many experts objected. A slew of American companies – from tech giants such as Cisco to cybersecurity firm Symantec – voiced loud objections. The overly broad export bans described in the proposal, they said, could hinder legitimate analysis of computer security weaknesses. Human and digital rights organizations such as the Electronic Frontier Foundation worried the rules could have a chilling effect on security research. 


Stakeholders on both sides of the debate had 60 days to submit formal comments to the Department of Commerce’s Bureau of Industry and Security. Passcode compiled excerpts of the comments submitted by key players. 

Cisco and Symantec

Eric Wenger, director of global government affairs, Cisco Systems:

“We understand the importance of the government’s concerns regarding the unregulated export of weaponized software. However, many of the same techniques used by the attackers are important to developers testing their defenses and developing new effective responses. Cisco needs access to the very tools and techniques that attackers use if we have any hope of maintaining the security of our products and services throughout their anticipated lifecycles. The development of new export control requirements must, therefore, be done carefully and based upon the needs of legitimate security researchers. Otherwise, we will leave network operators blind to the attacks that may be circulating in the criminal underground — and ultimately blind to the very weaponized software that the proposed rule intends to constrain.”

Cheri McGuire, vice president of global government affairs and cybersecurity policy, Symantec:

“While we recognize that the rule was intended to protect national security interests and preserve human rights, it misses the mark. It presupposes that systems, equipment, components and software that are specially designed or modified for the generation, operation, or delivery of, or communication with, ‘intrusion software’ are ‘hacking’ tools that are all used for nefarious purposes. This is not the case. In fact, Symantec – and virtually every other legitimate security company – uses such tools to ensure the security of our networks and commercial products.”


Members of Congress

Rep. James Langevin (D) of Rhode Island, Rep. David Schweikert (R) of Arizona, Rep. Michael McCaul (R) of Texas, and Rep. Ted Lieu (D) of California:

"We are ... troubled by the implications of applying the 'deemed export' regime to intrusion software, a rule that has not been adopted by European Union members in implementing the Wassenaar Arrangement. Many American companies have multinational footprints, and even those solely operating within the United States often employ foreign nationals, particularly in fields like cybersecurity that suffer from an acute talent deficit. Similarly, academic institutions around the country have a significant minority of foreign graduate students, many of whom are at the front lines of information security research.

We see two significant challenges in applying the deemed export rules to these technologies. Third parties often disclose vulnerabilities to anonymous email addresses established specifically for this purpose. A security researcher thus has no way of knowing who precisely will see the disclosure. Requiring a careful chain of custody for researchers to ensure they don’t inadvertently ‘export’ a vulnerability by sharing it with a foreign national employed by a developer could easily disrupt the entire reporting ecosystem.

Furthermore, if companies or researchers are required to segregate data based on nationality or to apply for a license to share information with their own students or employees, research will suffer. Companies may be unable to share threat data with their own international affiliates, at least not in a timely manner. Because hackers can attack overseas just as easily as domestically, any weak system with access to a business’s internal network represents a serious vulnerability.

As you are no doubt aware, Congress is very interested in expanding information sharing of cyber threat indicators. Two bills have already passed the House this session attempting to incentivize information sharing, and the Senate Select Committee on Intelligence has favorably reported a similar measure. We hope that the final BIS rule will further these efforts, or at the very least not hinder them.”

Human and digital rights groups

Access, the Center for Democracy and Technology, the Electronic Frontier Foundation, Human Rights Watch, New America’s Open Technology Institute, and security researcher Collin Anderson explain why they would prefer the US regulate exports that are actually intended to be used as surveillance software: 

“The challenges of establishing a strictly technical definition of problematic transfers were made more clear by the product information and communications between [spyware vendor] Hacking Team and exploit brokers, such as VUPEN and Netragard. These emails clearly demonstrate a private market for the sale of exploits to the highest bidder; however, the information shared with Hacking Team for marketing and sales of vulnerabilities by these vendors is not substantially different from the information disclosed in a critical vulnerability (CVE) report [issued to consumers on vulnerabilities that need to be patched].  

Beyond the inherent risks of overbreadth, the pursuit of a strictly technical line of difference between security research and exports of concern will limit the ultimate effectiveness of such control. The primary value provided by exploit brokers is information on the nature of a vulnerability – the proof of concept that BIS has repeatedly asserted is not controlled. In very few circumstances is more required to understand and replicate an attack than access to a proof of concept or working exploit. A proof of concept “shellcode” can be replaced by functional “shellcode” for the compromise of the device. Permitting release of proof of concepts while controlling technical data on exploit techniques becomes a futile endeavor, as it will be easy to discern mechanisms from source code or decompiled binaries. In fact, much of the learning process occurs from reverse engineering of exploits found in the wild, even in the case of sophisticated Intrusion Software built by state actors.”

The Electronic Frontier Foundation

“As it revises the Proposed Rule, we urge BIS to also revise the Preamble to present a clear statement on the intended scope of the regulation....

From the perspective of academics, security researchers, and open-source developers, it would be better to be faced with a clearly-worded, clearly-defined rule that the community did not necessarily agree with, than a difficult-to-understand rule that seemed to implement policies that the community would support, if only it could understand what the rule meant....

The vagueness of the WA control lists has real world chilling effects on fundamental academic research. Take for example the dissertation of a student at the University of Northumbria named Grant Wilcox. EFF does not believe that censorship of Mr. Wilcox’s paper is required by the WA control lists. However, the fact of the matter is that Mr. Wilcox’s university ethics board did censor the dissertation, believing it to be possibly within the WA definitions. This instance is only one recent and particularly clear example among many of the unintended chilling effects of vaguely worded regulation. From an EFF perspective, examples such as the needless censorship of Mr. Wilcox’s dissertation strongly caution against proceeding with an implementation of the WA in the United States without first clarifying the scope of what exactly the rules are intended to control.”

Immunity Inc. and Rapid7

Dave Aitel, CEO of Immunity Inc., which makes penetration testing products used to test networks for vulnerabilities that would be subject to licensing rules:

“Penetration testing, at its root and heart, is about discovering a ground truth. You say you have Anti-Virus, but is it really working? Your intrusion detection system is set up to scan for bad emails, but I just sent some, so did it alert you? I am exfiltrating your database from your network - is your IT team able to see that? 

This ability to discover a ground truth is a vital part of building a company’s security defenses. This is why it is required by PCI [Payment Card Industry Data Security Standard], HIPAA [Health Insurance Portability and Accountability Act], various NIST [National Institute of Standards and Technology] standards, and of course, any competent security team’s daily efforts. They require their toolsets and consultants to emulate all manner of threats, in order to determine if their defenses are adequate, from low-level hackers, to [advanced persistent threat]-grade teams.”

Rapid7, which makes the penetration testing tool Metasploit:

“The proposed rule would place significant restrictions on exports, reexports, and transfers of penetration test platforms, and would not distinguish between products that possess characteristics and features that deter misuse, and those that do not. For example, as noted above, Metasploit products only incorporate attack methods that are publicly available. In addition, the proprietary editions of Metasploit have a number of built-in safety features intended to ensure that the product is used only for security enhancement purposes....

Safeguards make [Metasploit] less attractive to those who would use the product for malicious purposes. By not distinguishing between products that deter misuse and those that do not, BIS is missing an opportunity to encourage developers of controlled products to incorporate features to maximize accountability and to ensure that their products are used only in legitimate security contexts. We believe that BIS should create a list of features (like those in Metasploit and other products) that, if present, would exempt a product from the proposed controls on intrusion platforms altogether, render the product eligible for favorable license exceptions, or subject the product to less stringent export licensing requirements.”

Drawbridge Networks

Tom Cross, former engineering adviser to the export compliance program at Internet Security Systems (now a part of IBM), and current chief technology officer at Drawbridge Networks:

“BIS has responded to several questions regarding the disclosure of information about vulnerabilities to software vendors and security software companies by explaining that [only] information which is being prepared for public disclosure is not controlled....

It is important for BIS to understand that often, detailed technical information that is provided as a part of a vulnerability disclosure is never shared with the public, that this detailed technical information often includes specific categories of information that BIS says will be controlled under the proposed rule, and that premature public disclosure of this information can and does fuel criminal activity.”

Chris Soghoian

Chris Soghoian, the principal technologist of the American Civil Liberties Union, filed his own opinion – not on behalf of the ACLU: 

“Although I have reservations about the specific draft text that BIS has proposed, I generally support the goal of regulating the export of surveillance technologies. Indeed, I encourage BIS to regulate the export of intrusion and remote monitoring software to governments, such as products similar to those sold by FinFisher and Hacking Team.

I understand that under the current BIS proposal, export controls ‘would not apply to intrusion software itself (e.g., exploits, rootkits, backdoors, viruses, other malicious code).’ I urge BIS to expand the list of controlled technologies beyond the categories required by Wassenaar. Specifically, I urge BIS to also regulate the export to governments of security exploits that are explicitly marketed for surveillance purposes, such as products similar to those sold by VUPEN.

I urge BIS to focus its regulations on companies that sell surveillance technologies to governments. BIS should consider both how these surveillance products are marketed, and the end-users to whom they are sold. BIS should adopt a policy of presumptive denial for ‘lawful interception’ products and other surveillance technologies marketed and sold to governments.

I also urge BIS to impose ‘Know Your Customer’ obligations on exporters of so-called ‘lawful surveillance’ technologies, similar to those that already exist for nuclear, chemical, and biological weapon end uses. Given the ease with which products like these can be resold through brokers, export control regulations will only be effective if the manufacturers and exporters of this technology are required to identify the ultimate end-user.”