In 2019, it was big news in the eSports world that well-known player, "Ninja", had his Twitch platform used to promote third parties' pornography streams without his knowledge or consent. The situation serves as a reminder that associating a person with images can carry a defamatory meaning and cause serious reputational harm. Defamation isn't confined to damaging words. In this article we explore key defamation judgments that publishers should keep in mind when assessing their risk.
- The potential for publishers to be found liable for defamation as a result of defamatory photographs and imagery (without words) has been confirmed by the courts on numerous occasions.
- In Ninja's case the publication may have been inadvertent, but the potential reputational harm is, nonetheless, significant.
- Online personalities including eSports gamers, YouTube commentators and Instagram influencers should consider what rights and controls they have over their persona and presence on online platforms.
- Digital platforms and other online hosts should keep in mind that defamation is much broader than words. Further, platforms are regularly being found liable for third parties' defamatory content.
The story of Ninja, Twitch and the pornography slip
Ninja is a world-famous eSports gamer.
For the non-gamers out there, eSports refers to organised competitive computer and console games. In the past, eSports were largely played between amateurs. But in recent years popularity has surged, and high-profile players of games like Fortnite and DOTA get paid anywhere up to $20,000 per hour by online platforms to set up a channel and stream their matches. Ninja is one of those high-profile players, with over 14 million followers of his channel on the Twitch platform, which is owned by Amazon.
Earlier in 2019, after Ninja announced his move to rival platform Mixer, which is owned by Microsoft, his Twitch channel sat inactive, showing only his old matches. Twitch then decided to take advantage of Ninja's huge following by experimenting with using it to promote other streaming channels -- the first of which happened to be pornography.
Ninja rushed to Twitter and apologised to fans, many of whom are as young as 13, saying he no longer had control of the channel. Twitch also apologised, shut down the pornography stream (which was in breach of its user Terms & Conditions) and returned Ninja's channel to its dormant state. However, the damage was already done, with about one million Twitch users having viewed the channel by that time.
Ninja clearly wasn't happy with Twitch associating his account with pornography, which raises the question -- was Ninja defamed?
Would defamation law apply?
In the USA, it's settled law that public figures can't sue for defamation unless they can prove the conduct was motivated by "actual malice". Ninja's notoriety in the eSports world, and Twitch's strong claim of inadvertence, would probably be enough to shut down any claim there.
But Ninja has followers all over the world including a huge number in Australia, where no such defamation threshold exists. And it's established law that plaintiffs can sue for defamation over internet material accessed in Australia, irrespective of where the material was uploaded or hosted.1
Can images be defamatory?
Defamation isn't just about harmful or false words. There are no limits to the kinds of publication that might invoke defamation law; defamation can include associating someone with a concept via photos, videos or other imagery. Reputation is a complex concept and there are a lot of different ways it can be damaged.
In Ninja's case, Twitch linking Ninja with a pornography stream (whether inadvertent or not) may well be defamatory. The natural and ordinary meaning of the publication of pornography on his Twitch page could readily imply that he was involved in, or endorsed, the publication of the pornography. There is a range of imputations that arguably arise, including that Ninja:
- knowingly allowed his Twitch platform to be used to disseminate pornography
- endorses the public display of pornography, or
- more seriously, condones the display of pornography to children
Many viewers might have mistakenly assumed that Ninja had consented to his channel page being used to promote other streamers and may even have assumed Ninja had selected the channel to stream on his page.
There are a number of cases establishing the potentially defamatory nature of images, particularly those involving nudity or sexual content, of which publishers should be aware.
Ettingshausen v Australian Consolidated Press Ltd2 (Ettingshausen) involved Australian rugby league footballer, Andrew Ettingshausen, and two HQ magazine journalists who were given permission to accompany the Kangaroos on a tour with unrestricted access. After the tour, HQ published an article titled "Hunks" that included a shadowy black-and-white image of three Kangaroos players in the shower, including Ettingshausen. While the other two players were photographed side-on and not fully exposed, Ettingshausen was pictured standing with his arms folded and his back against the wall, seemingly facing towards the camera, though he was unaware of the photographers' presence. Ettingshausen commenced defamation proceedings and successfully established that defamatory imputations arose that he:
- deliberately permitted a photograph to be taken of him with his genitals exposed for the purposes of reproduction in a publication with a widespread readership, or
- is a person whose genitals have been exposed to the readers of the HQ magazine, a publication with a widespread readership
The court considered whether the second imputation, in particular, was capable of being defamatory given that it did not assert any moral blame on Ettingshausen for his exposure in the magazine. Ultimately, both imputations were found to have the propensity to cause him to be ridiculed, shunned and avoided (whether Ettingshausen was blameless or not). The jury initially awarded Ettingshausen damages of $350,000; on appeal, the award was reduced to $100,000 by a fresh jury.3 Ettingshausen said the case demonstrated emphatically that he was not the kind of person who would ever pose nude for a magazine.
In a New Zealand case, Taylor v Beere4 (Beere), Beere published a photograph of Taylor and her grand-daughter (sitting in a loungeroom) without Taylor's consent in a book called "Down Under the Plum Trees". The court described the book as a "manual of sex instruction" aimed at school children, which was ultimately classified as "indecent" by the New Zealand Indecent Publications Tribunal. Taylor sued for defamation and the court found an imputation arose that she was the kind of person willing to approve of, and be associated with, an indecent document. Taylor was awarded NZ$12,500 in damages, even though only a small number of the books had been printed.
In a UK case, Charleston v News Group Newspapers,5 the plaintiffs were the actor and actress who played Harold and Madge Bishop in the popular Australian television series "Neighbours". Their complaint related to a photograph of a nearly naked man and woman engaged in a sexual act, with the faces of the actors superimposed on the bodies. However, unlike in Ettingshausen or Beere, the text underneath the photograph made it clear that the images had been produced by the makers of a pornographic computer game without the knowledge or consent of the plaintiffs. The House of Lords found that a defamatory imputation, to the effect that the plaintiffs had willingly participated in the production of pornographic photographs, was not capable of arising, and the defamation claim failed. In that case, the potentially defamatory imputation arising from the images alone was offset by the language accompanying them. In all cases the court will look at the overall impression of the publication, rather than the impression of its individual elements.
The law recognises a broad range of publication types that don't necessarily involve harmful or false words but may nonetheless give rise to defamation. Ninja might have a strong claim for defamation in Australia as a result of his online persona being associated with pornography.
Online personalities including eSports gamers, YouTube commentators and Instagram influencers should consider what rights and controls they have over their persona and presence on online platforms, so that they are prepared when something goes wrong.
Platforms and other online hosts like YouTube and Facebook should keep in mind that defamation is much broader than words. This is particularly relevant because the filters used by Facebook page operators and other online forums typically target particular words and phrases and cannot easily detect defamatory images.
Increasingly, platforms are being found to be involved in the publication of defamatory content in such a way as to render them liable as primary publishers.6 In Voller v Nationwide News Pty Ltd; Voller v Fairfax Media Publications Pty Ltd; Voller v Australian News Channel Pty Ltd7 (Voller), Facebook page owners were held responsible for defamatory comments even before they had actual knowledge of them, on account of their apparent ability to vet those comments. This was the case notwithstanding the complicated process page owners had to undertake in order to vet most or all comments: the only way to do so was to craft filters designed to capture every comment, by targeting common words like "a", "the" and "is". Filtered comments could then be reviewed before being released for public view. Notably, this process would not allow page owners to vet images or videos.
The implication of the Voller decision is that notice-and-takedown regimes may not be sufficient as a means of risk mitigation. In Ninja's example, where Twitch was involved in allowing his page to be used to promote other users' content, it could well be deemed responsible as a primary publisher from the first moment of the pornography's publication.