At long last, and well after the E.U. and many other members of the Wassenaar Arrangement, BIS has released proposed (but not final) rules implementing the December 2013 changes adopted by the Arrangement, which imposed export controls on “intrusion software” and “IP network communications surveillance” systems and equipment. After the E.U. adopted the 2013 changes in October 2014, we speculated that BIS had slipped past its announced September 2014 date for releasing a proposed rule because it was struggling with the impact of Wassenaar’s overbroad definition of “intrusion software.” But we were wrong.
The proposed rule adopts the Wassenaar changes without clarifying the scope of coverage of intrusion software. Instead, the delay seems to have been wholly occasioned by housekeeping matters: specifying the reasons for control, deciding that no license exceptions would apply, and so forth. The proposed BIS rules also grapple with a rather esoteric problem: what to do with intrusion software that has encryption functionality. The answer is that such software is classified under, and must comply with, both ECCNs, which, at last, concedes something BIS long said was impossible: that an item could have two ECCNs. Finally, and I’m not joking, so I’ll quote the agency itself to prove that I’m not:
[a] reference to §772.1 is proposed to be added to ECCNs 4A005, 4D001 and 4E001 to point to the location of the ‘‘intrusion software’’ definition, as this rule may be of interest to many new exporters that would not otherwise know that double quoted terms in the EAR are defined in §772.1.
Seriously? Now BIS starts to worry about the indecipherability of the EAR and the secret rules of interpretation that must be applied? What next? Will proposed rules start spelling out “n.e.s.”?
But, all joking aside, the problems with the definition of “intrusion software” remain:
‘‘Software’’ ‘‘specially designed’’ or modified to avoid detection by ‘monitoring tools,’ or to defeat ‘protective countermeasures,’ of a computer or network-capable device, and performing any of the following: (a) The extraction of data or information, from a computer or network-capable device, or the modification of system or user data; or (b) The modification of the standard execution path of a program or process in order to allow the execution of externally provided instructions.
The notes indicate that ‘protective countermeasures’ include “Data Execution Prevention (DEP), Address Space Layout Randomization (ASLR) or sandboxing.”
Many have pointed out that this definition would cover programs that permit auto-updating without user intervention, such as, for example, the Chrome browser, which updates itself in the background and circumvents protections normally imposed by the operating system to prevent installation or modification of programs without user intercession. Address Space Layout Randomization (ASLR) loads program components into random memory addresses as a security measure against buffer-overflow attacks. Yet legitimate programs that must “hot-patch” running servers or systems must scan memory to locate those components, thereby both extracting data and defeating ASLR. And the designation of sandboxing as a protective measure will subject programs that permit rooting or jailbreaking of mobile telephones to export controls.
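To make the ASLR point concrete, here is a minimal sketch, assuming a POSIX system with Python available, that shows the kind of information a hot-patcher has to rediscover at runtime. It looks up the in-memory address of libc’s printf; with ASLR enabled, that address differs from one run to the next, which is precisely why a legitimate patching tool must scan process memory rather than hard-code locations (the function name and structure here are my own, purely for illustration):

```python
import ctypes
import ctypes.util

def libc_symbol_address() -> int:
    """Return the address at which libc's printf is loaded in this process."""
    # find_library may return None on minimal systems; CDLL(None) then
    # falls back to the running process's own symbol table (POSIX dlopen).
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    return ctypes.cast(libc.printf, ctypes.c_void_p).value

if __name__ == "__main__":
    # With ASLR enabled, this address changes on each invocation of the
    # interpreter, so no tool can rely on a fixed value.
    print(hex(libc_symbol_address()))
```

Run the script twice on a system with ASLR enabled and the printed addresses will (almost always) differ, even though the program itself is identical. That is the behavior the Wassenaar definition sweeps in: locating the symbol defeats ASLR, and reading its address is arguably an “extraction of data or information” from the device.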
I don’t normally try to look into a crystal ball and make predictions about the future, but I clearly see a flood of classification requests from software developers.