The recent release of the Joint Committee on Privacy and Injunctions’ report gives an interesting indication of the future of enforcement of court orders relating to online activity. The document covers a broad range of topics, but its section on ‘Online Enforcement’ is of particular relevance to the technology sector.

Incompatible I’s, Injunctions and Internet

The Committee noted that, at present, individuals seeking to have content removed from online services must actively request its removal from the service provider. As many content platforms require a separate notice for each offending item, individuals may have to make a fresh request each time an infringement occurs, and may even need to make multiple visits to court to suppress the offending behaviour.

This can be the case even where the offending content is identical in each instance (for example, ‘sexting’ photos released by ex-partners, which are often widely shared, commented on and re-posted via services such as Tumblr and Pinterest). As a result, individuals may have to retain expensive PR and/or legal representatives to continuously police their online reputation and presence.

The problem is exacerbated by the fact that, as content proliferates across the internet, it may come to be hosted on servers in multiple jurisdictions.

The Committee considered this process unfair to claimants and found it unacceptable that they might be forced to return to court repeatedly in order to secure the removal of the same material from various parts of the internet.

Where individual ‘gagging orders’ fail, is blindfolding really the answer?

The Committee proposed to address the issue by requiring search engines to filter out specific content from their users’ results, which effectively amounts to requiring search engines to act as censors of offending material.

In its response to the Committee, Google expressed reluctance to act as a censor in this fashion, but acknowledged that it was technically possible to develop technology to monitor websites proactively for offending material and to filter them from search results. The Committee recommended that, if legislation were necessary to require search providers to develop and implement such technology, it should be introduced.
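Purely for illustration, the sketch below shows roughly what result-level filtering of this kind could involve in practice: each search result is checked against a hypothetical, court-ordered blocklist of domains and terms before it is shown to the user. The blocklist contents, function names and result format are all assumptions made for the purpose of the example, not any real search provider’s implementation; the sketch also hints at how much turns on defining the ‘offending material’ precisely enough to match it reliably.

```python
import re
from urllib.parse import urlparse

# Hypothetical, court-ordered blocklist: the domains and terms below are
# illustrative placeholders, not taken from any real order or provider.
BLOCKED_DOMAINS = {"example-gossip-site.com"}
BLOCKED_PATTERNS = [re.compile(r"claimant-name", re.IGNORECASE)]


def is_suppressed(url: str, snippet: str) -> bool:
    """Return True if a search result appears to fall within the order's scope."""
    domain = urlparse(url).netloc.lower()
    if domain in BLOCKED_DOMAINS:
        return True
    return any(p.search(url) or p.search(snippet) for p in BLOCKED_PATTERNS)


def filter_results(results: list[dict]) -> list[dict]:
    """Drop any result matching the blocklist before it reaches the user."""
    return [r for r in results if not is_suppressed(r["url"], r.get("snippet", ""))]


# Example: only the second, unrelated result survives the filter.
results = [
    {"url": "https://example-gossip-site.com/story", "snippet": "photos of claimant-name"},
    {"url": "https://example-news-site.com/report", "snippet": "unrelated coverage"},
]
print(filter_results(results))
```

Even in this toy form, the approach depends on someone maintaining the blocklist as the material spreads to new domains and is re-worded, which is precisely the whack-a-mole problem the Committee was trying to solve.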

The suggestion appears flawed for several reasons. The most obvious is that making search engines responsible for censorship would almost certainly contravene Regulation 17 of the Electronic Commerce (EC Directive) Regulations 2002, which protects ‘mere conduits’ of information (including search engines) from liability for the content of the data they transfer (see also our recent blog on this point).

Further, introducing legislation to make it harder for individuals to find defamatory material online would not remedy the underlying problem, as the offending content would still exist (and proliferate) online. Forcing search engines to censor their results would only cut off a single route of access to the offending data, increasing the regulatory burden on search providers without effectively restricting access to the material itself.