LIABILITY FOR USER-GENERATED COMMENTS

Decision by the European Court of Human Rights

The ECHR ruled today that holding Delfi (a commercially run Estonian Internet news portal) liable for the offensive online comments of its readers did not violate its freedom of expression. Some consider this a serious blow to free speech online.

Press release

In today’s Grand Chamber judgment in the case of Delfi AS v. Estonia (application no. 64569/09) the European Court of Human Rights (ECHR) held, by 15 votes to two, that there had been no violation of Article 10 (freedom of expression) of the European Convention on Human Rights.

This was the first case in which the Court had been called upon to examine a complaint about liability for user-generated comments on an Internet news portal. The applicant company, Delfi AS, which runs a news portal on a commercial basis, complained that it had been held liable by the national courts for the offensive comments posted by its readers below one of its online news articles about a ferry company. At the request of the lawyers of the owner of the ferry company, Delfi removed the offensive comments about six weeks after their publication.

The case therefore concerned the duties and responsibilities of Internet news portals which provided, on a commercial basis, a platform for user-generated comments on previously published content, where some users – whether identified or anonymous – engaged in clearly unlawful hate speech which infringed the personality rights of others. The Delfi case did not concern other fora on the Internet where third-party comments could be disseminated, for example an Internet discussion forum, a bulletin board or a social media platform.

The question before the Grand Chamber was not whether the freedom of expression of the authors of the comments had been breached but whether holding Delfi liable for comments posted by third parties had been in breach of its freedom to impart information.

The Grand Chamber found that the Estonian courts’ finding of liability against Delfi had been a justified and proportionate restriction on the portal’s freedom of expression, in particular, because: the comments in question had been extreme and had been posted in reaction to an article published by Delfi on its professionally managed news portal run on a commercial basis; the steps taken by Delfi to remove the offensive comments without delay after their publication had been insufficient; and the 320-euro fine had by no means been excessive for Delfi, one of the largest Internet portals in Estonia.

In today’s judgment the ECHR referred to the case-law of the Court of Justice of the European Union (CJEU):

52. In a judgment of 23 March 2010 (Joined Cases C-236/08 to C-238/08 Google France and Google) the CJEU considered that in order to establish whether the liability of a referencing service provider could be limited under Article 14 of Directive 2000/31/EC, it was necessary to examine whether the role played by that service provider was neutral, in the sense that its conduct was merely technical, automatic and passive, pointing to a lack of knowledge of or control over the data which it stored. Article 14 of the Directive on Electronic Commerce had to be interpreted as meaning that the rule laid down therein applied to an Internet referencing service provider in the event that that service provider had not played an active role of such a kind as to give it knowledge of or control over the data stored. If it had not played such a role, that service provider could not be held liable for the data which it had stored at the request of an advertiser, unless, having obtained knowledge of the unlawful nature of those data or of that advertiser’s activities, it had failed to act expeditiously to remove or to disable access to the data concerned.

53. In a judgment of 12 July 2011 (Case C-324/09 L’Oréal and Others) the CJEU ruled that Article 14(1) of Directive 2000/31/EC was to be interpreted as applying to the operator of an online marketplace where that operator had not played an active role allowing it to have knowledge of or control over the data stored. The operator played such a role when it provided assistance which entailed, in particular, optimising the presentation of the offers for sale in question or promoting them. Where the operator of the online marketplace had not played such an active role and the service provided fell, as a consequence, within the scope of Article 14(1) of Directive 2000/31/EC, the operator none the less could not, in a case which could result in an order to pay damages, rely on the exemption from liability provided for under that Article if it had been aware of facts or circumstances on the basis of which a diligent economic operator should have realised that the offers for sale in question had been unlawful and, in the event of it being so aware, had failed to act expeditiously in accordance with Article 14(1)(b) of Directive 2000/31/EC.

54. In a judgment of 24 November 2011 (Case C-70/10 Scarlet Extended) the CJEU ruled that an injunction could not be made against an Internet service provider requiring it to install a system for filtering all electronic communications passing via its services, in particular those involving the use of peer-to-peer software, which applied indiscriminately to all its customers, as a preventive measure, exclusively at its expense and for an unlimited period, and which was capable of identifying on that provider’s network the movement of electronic files containing a musical, cinematographic or audiovisual work in respect of which the applicant claimed to hold intellectual property rights, with a view to blocking the transfer of files the sharing of which would infringe copyright.

55. In a judgment of 16 February 2012 (Case C-360/10 SABAM) the CJEU held that Directives 2000/31/EC, 2001/29/EC and 2004/48/EC precluded a national court from issuing an injunction against a hosting service provider requiring it to install a system for filtering information stored on its servers by its service users, which applied indiscriminately to all those users, as a preventive measure, exclusively at its expense and for an unlimited period, and which was capable of identifying electronic files containing musical, cinematographic or audiovisual work in respect of which the applicant for the injunction claimed to hold intellectual property rights, with a view to preventing those works from being made available to the public in breach of copyright.

56. In a judgment of 13 May 2014 (Case C-131/12 Google Spain and Google) the CJEU was called on to interpret Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. It found that the activity of an Internet search engine was to be classified as “processing of personal data” within the meaning of Directive 95/46/EC and held that such processing of personal data, carried out by the operator of a search engine, was liable to affect significantly the fundamental rights to privacy and to the protection of personal data (guaranteed under Articles 7 and 8 of the Charter of Fundamental Rights of the European Union) when the search by means of that engine was carried out on the basis of an individual’s name, since that processing enabled any Internet user to obtain through the list of results a structured overview of the information relating to that individual that could be found on the Internet and thereby to establish a more or less detailed profile of him or her. Furthermore, the effect of the interference with the rights of the data subject was heightened on account of the important role played by the Internet and search engines in modern society, which rendered the information contained in such a list of results ubiquitous. In the light of the potential seriousness of that interference, it could not be justified merely by the economic interest of the operator. The CJEU considered that a fair balance should be sought between the legitimate interest of Internet users in having access to the information and the data subject’s fundamental rights. The data subject’s fundamental rights, as a general rule, overrode the interest of Internet users, but that balance might, however, depend on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information. The CJEU held that in certain cases the operator of a search engine was obliged to remove from the list of results displayed following a search made on the basis of a person’s name links to web pages, published by third parties and containing information relating to that person, even when its publication in itself on the web pages in question was lawful. That was so in particular where the data appeared to be inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they had been processed and in the light of the time that had elapsed.

57. In a judgment of 11 September 2014 (Case C-291/13 Papasavvas) the CJEU found that since a newspaper publishing company which posted an online version of a newspaper on its website had, in principle, knowledge about the information which it posted and exercised control over that information, it could not be considered to be an “intermediary service provider” within the meaning of Articles 12 to 14 of Directive 2000/31/EC, whether or not access to that website was free of charge. Thus, it held that the limitations of civil liability specified in Articles 12 to 14 of Directive 2000/31/EC did not apply to the case of a newspaper publishing company which operated a website on which the online version of a newspaper was posted, that company being, moreover, remunerated by income generated by commercial advertisements posted on the website, since it had knowledge of the information posted and exercised control over that information, whether or not access to the website was free of charge.

With respect to the case at hand the ECHR considered:

153. The Court observes that the Supreme Court stated in its judgment that “[o]n account of the obligation arising from law to avoid causing harm, the [applicant company] should have prevented the publication of comments with clearly unlawful contents”. However, it also held that “[a]fter the disclosure, the [applicant company had] failed to remove the comments – the unlawful content of which it should have been aware of – from the portal on its own initiative” (see § 16 of the judgment as set out in paragraph 31 above). Therefore, the Supreme Court did not explicitly determine whether the applicant company was under an obligation to prevent the uploading of the comments on the website or whether it would have sufficed under domestic law for the applicant company to have removed the offending comments without delay after publication, to escape liability under the Obligations Act. The Court considers that when assessing the grounds upon which the Supreme Court relied in its judgment entailing an interference with the applicant’s Convention rights, there is nothing to suggest that the national court intended to restrict the applicant’s rights to a greater extent than that required to achieve the aim pursued. On this basis, and having regard to the freedom to impart information as enshrined in Article 10, the Court will thus proceed on the assumption that the Supreme Court’s judgment must be understood to mean that the subsequent removal of the comments by the applicant company, without delay after publication, would have sufficed for it to escape liability under domestic law. Consequently, and taking account of the above findings (see paragraph 145) to the effect that the applicant company must be considered to have exercised a substantial degree of control over the comments published on its portal, the Court does not consider that the imposition on the applicant company of an obligation to remove from its website, without delay after publication, comments that amounted to hate speech and incitements to violence, and were thus clearly unlawful on their face, amounted, in principle, to a disproportionate interference with its freedom of expression.

154. The pertinent issue in the present case is whether the national court’s findings that liability was justified, as the applicant company had not removed the comments without delay after publication, were based on relevant and sufficient grounds. With this in mind, account must, firstly, be taken of whether the applicant company had instituted mechanisms that were capable of filtering comments amounting to hate speech or speech entailing an incitement to violence.

155. The Court notes that the applicant company took certain measures in this regard. There was a disclaimer on the Delfi news portal stating that the writers of the comments – and not the applicant company – were accountable for them, and that the posting of comments that were contrary to good practice or contained threats, insults, obscene expressions or vulgarities, or incited hostility, violence or illegal activities, was prohibited. Furthermore, the portal had an automatic system of deletion of comments based on stems of certain vulgar words and it had a notice-and-take-down system in place, whereby anyone could notify it of an inappropriate comment by simply clicking on a button designated for that purpose, to bring it to the attention of the portal administrators. In addition, on some occasions the administrators removed inappropriate comments on their own initiative.

156. Thus, the Court notes that the applicant company cannot be said to have wholly neglected its duty to avoid causing harm to third parties. Nevertheless, and more importantly, the automatic word-based filter used by the applicant company failed to filter out odious hate speech and speech inciting violence posted by readers and thus limited its ability to expeditiously remove the offending comments. The Court reiterates that the majority of the words and expressions in question did not include sophisticated metaphors or contain hidden meanings or subtle threats. They were manifest expressions of hatred and blatant threats to the physical integrity of L. Thus, even if the automatic word-based filter may have been useful in some instances, the facts of the present case demonstrate that it was insufficient for detecting comments whose content did not constitute protected speech under Article 10 of the Convention (see paragraph 136 above). The Court notes that as a consequence of this failure of the filtering mechanism, such clearly unlawful comments remained online for six weeks (see paragraph 18 above).

157. The Court observes in this connection that on some occasions the portal administrators did remove inappropriate comments on their own initiative and that, apparently some time after the events of the present case, the applicant company set up a dedicated team of moderators. Having regard to the fact that there are ample possibilities for anyone to make his or her voice heard on the Internet, the Court considers that a large news portal’s obligation to take effective measures to limit the dissemination of hate speech and speech inciting violence – the issue in the present case – can by no means be equated to “private censorship”. While acknowledging the “important role” played by the Internet “in enhancing the public’s access to news and facilitating the dissemination of information in general” (see Ahmet Yıldırım, cited above, § 48, and Times Newspapers Ltd, cited above, § 27), the Court reiterates that it is also mindful of the risk of harm posed by content and communications on the Internet (see Editorial Board of Pravoye Delo and Shtekel, cited above, § 63; see also Mosley, cited above, § 130).

(…)

162. Based on the concrete assessment of the above aspects, taking into account the reasoning of the Supreme Court in the present case, in particular the extreme nature of the comments in question, the fact that the comments were posted in reaction to an article published by the applicant company on its professionally managed news portal run on a commercial basis, the insufficiency of the measures taken by the applicant company to remove without delay after publication comments amounting to hate speech and speech inciting violence and to ensure a realistic prospect of the authors of such comments being held liable, and the moderate sanction imposed on the applicant company, the Court finds that the domestic courts’ imposition of liability on the applicant company was based on relevant and sufficient grounds, having regard to the margin of appreciation afforded to the respondent State. Therefore, the measure did not constitute a disproportionate restriction on the applicant company’s right to freedom of expression. Accordingly, there has been no violation of Article 10 of the Convention.

In summary, the ECHR ruled that there had been no violation of Article 10 of the European Convention on Human Rights when the State imposed liability on an online news portal for highly offensive comments left by readers of a news item the portal published. In essence, the ECHR takes the view that the news portal could have done more, in terms of filtering and moderating, where various clues to the odious nature of the comments were present.

However, one should also read the joint dissenting opinion of Judges Sajó and Tsotsoria. According to them, the imposition of liability on intermediaries was for centuries a major obstacle to freedom of expression. It was the printer Harding and his wife who were arrested for the printing of the Drapier’s Letters and not the anonymous author (Jonathan Swift), who continued to preach undisturbed. Furthermore, they state that the duty to remove offensive comments without actual knowledge of their existence and immediately after they are published means that the active intermediary has to provide supervision 24/7: “For all practical purposes, this is absolute and strict liability, which is in no sense different from blanket prior restraint. No reasons are given as to why only this level of liability satisfies the protection of the relevant interests.” Their final comment:

We trust that this is not the beginning (or the reinforcement and speeding up) of another chapter of silencing and that it will not restrict the democracy-enhancing potential of the new media. New technologies often overcome the most astute and stubborn politically or judicially imposed barriers. But history offers discouraging examples of censorial regulation of intermediaries with lasting effects. As a reminder, here we provide a short summary of a censorial attempt that targeted intermediaries.

In Reformation England, the licensing system of the Catholic Church was taken over by the State and it became a State tool for control of all printed publications. Licensing provided the Crown with “censorship prior to publication and easy conviction of offenders”.

These laws cut seditious material off at the place of mass production – the printer. Initially involving prosecution by the Star Chamber, the licensing scheme punished any printer who failed to receive a licence for the material he intended to print (a licence conditional on royal approval). With the abolition of the Star Chamber there was a brief end to the licensing laws during the English Civil War. Parliament, however, did not like the spreading of radical religious and political ideas. It decided to replace Crown censorship with its own, also in order to protect the vested business interests of the printers’ guild. The result was the Licensing Order of 14 June 1643, which reintroduced for Parliament’s benefit the previously despised order of the Star Chamber Decree (pre-publication licensing; registration of all printed material with the names of author, printer and publisher; search, seizure and destruction of any books offensive to the government; and punishment of printers and publishers).

Post-revolution, people tend to reinvent the same tools of oppression that the revolutionaries stood up against. (See also the Alien and Sedition Acts enacted in the United States.) The Stationers’ Company was given the responsibility of acting as censor, in return for a monopoly on the printing trade. The licensing system, with the financial interest of the publishers’/printers’ guild, was a more effective censor than seditious libel laws.

This restored licensing system became John Milton’s target in Areopagitica: A Speech for the Liberty of Unlicensed Printing in November 1644. It was resistance to the self-censorship imposed on intermediaries (the printers) that produced Areopagitica, the first and most important manifesto of freedom of expression. Areopagitica attempted to persuade Parliament that licensing had no place in the free pursuit of truth. It argued that an unlicensed press would lead to a marketplace of ideas in which truth might prevail. It could not undo the bigotry of Parliament. We hope that it will have more success today.

Karel Frielink
(Attorney/Lawyer, Partner)

(16 June 2015)

In his concurring opinion Judge Zupančič states: “Also, in my opinion, it is completely unacceptable that an Internet portal or any other kind of mass media should be permitted to publish any kind of anonymous comments.”

