Is a deep learning AI system which recommends similar songs to its users patentable? Or is it a computer program ‘as such’ and therefore excluded under s.1(2)(c) Patents Act 1977? The High Court has recently provided an answer (overturning the UKIPO’s original decision): such systems are patentable. In a decision which will delight AI innovators, Mann J held that the invention in question did not even constitute a computer program, let alone a computer program ‘as such’. Even if the invention were a computer program, Mann J considered that it had a technical effect, taking into account the Aerotel stages, the AT&T signposts and the Protecting Kids and Halliburton case law.
What is the invention for?
The purpose of Emotional Perception’s invention is to identify and provide similar songs* to its users (based on the song provided by the user). It does so by training the system on both the semantic properties (happy, sad, relaxing) and musical properties (tone, timbre, beats per minute) of songs. The training data for the model consists of numerous pairs of music files, one labelled with semantic properties and the other labelled with musical properties. Each file is fed into a respective semantic or musical artificial neural network (“ANN”) and, through the wonders of deep learning, each ANN learns to cluster similar songs closer together. Then comes the additional clever component: the similarity scores of the musical ANN are adjusted to incorporate the results of the semantic ANN – if two songs are incredibly similar semantically, they are moved closer together in the musical ANN (and vice versa). This process eventually produces a single ANN incorporating both semantic and musical similarity. Once training is complete, the ANN is frozen and ready for use.
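The pairwise adjustment step described above can be sketched in code. The following is a minimal, illustrative sketch in Python only: the function name `adjust_pair`, the Euclidean distance measure, the `threshold` and `step` parameters, and the linear update rule are all assumptions made for illustration, not the training process actually claimed in the patent.

```python
import math

def adjust_pair(prop_a, prop_b, sem_a, sem_b, threshold=1.0, step=0.1):
    """Nudge a pair of musical-property embeddings together or apart,
    depending on how close the pair sits in semantic space.
    Illustrative only: the update rule is an assumption, not the
    patent's actual method."""
    sem_dist = math.dist(sem_a, sem_b)            # distance in semantic space
    sign = 1.0 if sem_dist < threshold else -1.0  # pull together if semantically close
    new_a = [a + sign * step * (b - a) for a, b in zip(prop_a, prop_b)]
    new_b = [b - sign * step * (b - a) for a, b in zip(prop_a, prop_b)]
    return new_a, new_b

# Example: two songs close in semantic space are pulled closer in property space.
a, b = [0.0, 0.0], [1.0, 0.0]
closer_a, closer_b = adjust_pair(a, b, sem_a=[0.0], sem_b=[0.1])
print(round(math.dist(closer_a, closer_b), 6))  # 0.8, down from 1.0
```

Repeated over many training pairs, updates of this kind would leave the musical embedding space reflecting semantic similarity as well, which is the effect the claimed training process is described as achieving.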
How was the original case decided and what was the basis of the appeal?
In the original decision, the Hearing Officer held that the invention was not patentable. This was on the basis that it was a computer program ‘as such’, applying the four Aerotel stages and the AT&T signposts. The Hearing Officer concluded that the invention’s contribution was “no more than a computer programming activity” and that the recommendation of semantically similar files to users had no “technical effect over and above the running of a program on a computer”.
Emotional Perception appealed this decision on the basis that:
- The computer program exclusion is not engaged at all as there is no relevant computer program; and
- If there is a computer program, the invention makes a technical contribution and is therefore not a computer program ‘as such’.
Was the invention a computer program?
In short, Mann J held that the invention was not a computer program. Note that the invention is wide enough to cover both a hardware ANN system and a computer emulation ANN equivalent (i.e. a virtual recreation of the hardware ANN run on software). The Comptroller conceded that, if the patent had covered only hardware ANNs, it would not be excluded under s.1(2)(c). However, was the emulation ANN a computer program?
The parties accepted that the initial training involved a computer program, with a programmer providing instructions on how the ANN should handle the training data. However, once trained, the ANN would simply be applying its own weights and biases to produce outputs: the ANN provided outputs independently and would not be relying on any instructions. The Comptroller sought to argue that the use of emulation software meant the invention was a computer program. This was rejected: Mann J held that the ANN was distinct and separate from the underlying software on the computer which facilitated the emulation.
Did it matter that the invention made use of a computer program for training? No. Mann J held that this program was just one of many aspects of the invention, with the claimed invention going well beyond the mere training program.
On this basis, Mann J concluded that the invention was not a computer program at all and therefore the exclusion under s.1(2)(c) of the Patents Act 1977 did not apply.
What if the invention was a computer program? Was it a computer program ‘as such’?
Perhaps sensing that an appeal might arise, Mann J went on to consider (obiter) whether the invention achieved a technical effect assuming it was a computer program. He held that it did.
Emotional Perception aligned its suggested technical contribution, namely the provision of improved song recommendations, closely with the facts in Protecting Kids. The patent in Protecting Kids covered an invention which could (1) identify inappropriate content being viewed on a computer (by a child) and (2) send a notification to an adult’s mobile device so that an appropriate response could be sent back to the computer. The improved monitoring of content on electronic communications was held to be a technical effect.
Mann J agreed with Emotional Perception that, as in Protecting Kids, there was an external (outside world) effect through the provision of a music file to the end user. The Hearing Officer had previously held that any effect was not technical because any beneficial effect in receiving similar songs was subjective and in the mind of the end user. Mann J disagreed, concluding that the end user’s subjective view was irrelevant: the ANN still produced a semantically similar song for the user which it would not otherwise have done.
This decision will no doubt be welcomed by inventors in the AI space. The application of the Aerotel stages to specific facts, even with the AT&T signposts and a developed body of case law, can still be unpredictable and at times confusing. The decision is notable for the fact that it does not mention any European Patent Office case law despite s.130(7) Patents Act 1977 stating that the provisions in the Patents Act 1977 “are so framed as to have, as nearly as practicable, the same effects in the United Kingdom as the corresponding provisions of the European Patent Convention”. In particular, the Enlarged Board of Appeal decision of the EPO in G1/19 (pedestrian simulation) could be argued to diverge from the Halliburton case law (but this was not discussed).
Paragraphs 60 and 61 of Mann J’s decision are particularly interesting. The patent claim included the wording “training the ANN by …” and “in a backpropagation process in the ANN…”. It was accepted by both parties that a computer program is needed to train an artificial neural network. Even so, it was found that the claim was not to a computer program because “the actual program is a subsidiary part of the claim and is not what is claimed” (paragraph 61).
The decision also avoids a potential void in IP protection for ANNs. It should be recognised that a primary reason for the s.1(2)(c) exclusion is to avoid an overlap with copyright protection. While the point remains undecided in copyright law, a developer may struggle to convince the courts that an ANN (including its weights and biases) is the expression of their own intellectual creation (as required for copyright to subsist). On the other hand, given the potentially wide-reaching implications of the decision, it would not be surprising if an appeal were filed. We will wait and see.
* The invention includes other media forms but, for simplicity, we refer to music only here.
Emotional Perception AI Ltd v Comptroller-General of Patents, Designs and Trade Marks [2023] EWHC 2948 (Ch)
Aerotel Ltd v Telco Holdings Ltd [2007] RPC 7
AT&T Knowledge Ventures LP v Comptroller General of Patents [2009] FSR 19
Re Protecting Kids The World Over (PKTWO) Ltd [2011] EWHC 2710 (Pat)
Re Halliburton Energy Services Inc [2011] EWHC 2508 (Pat)