Liability of online platforms for content moderation from the perspective of the European Court of Human Rights – challenges and recent developments

Authors: Marijana Mladenov, Tamara Staparski

Journal: Pravo - teorija i praksa

Section: Articles

Issue: 4, vol. 41, 2024.

While not a novel phenomenon, online platforms have gained significant economic and societal importance over the past decade, and the public discourse around their responsibilities and liabilities has reached an exceptional level. Online platforms significantly contribute to facilitating the exchange and access to information, enabling the widespread distribution of all types of content, regardless of their legality. The regulation of content on online platforms undoubtedly impacts the protection of human rights, particularly freedom of expression, which has led the European Court of Human Rights (ECtHR) to establish important criteria through its jurisprudence. To understand the implications of the ECtHR’s case law, it is important to briefly present the concept of platform liability within the European legal framework, which is outlined in the opening section of the paper. In the subsequent part, the authors analyze the relevant ECtHR jurisprudence. The aim of the paper is to clarify the main standards of the ECtHR’s approach to the human rights implications of online platforms’ liability for content moderation, while also potentially highlighting their limitations.

Keywords: Online platforms’ liability, content moderation, Digital Services Act, European Court of Human Rights

Short address: https://sciup.org/170206451

IDR: 170206451   |   DOI: 10.5937/ptp2404152M

1.    Introduction

While not a novel phenomenon, online platforms have attained considerable economic and societal significance over the past decade, and the public discourse around their responsibilities and liabilities has reached an exceptional level (European Parliament, 2021). Online platforms greatly contribute to facilitating information exchange and access, allowing for the widespread distribution of all types of content, regardless of legality. Furthermore, the functions of platforms in the digital domain have transformed from mere hosts to active participants that monitor the distribution and presentation of online content, assuming responsibilities such as moderation, recommendation, and curation (Enarsson, 2024, pp. 2–3). Since content moderation requires a framework that allows for a careful balancing of multiple interests, such as the “financial and marketing interests of corporations, the societal responsibilities of social media platforms, and the balancing of individual users’ fundamental rights”, the regulation of this issue has always been complicated (Enarsson, 2024, p. 1). The establishment of a comprehensive and effective regulatory framework that addresses the liability of online platforms for content moderation requires the consideration of all these factors (Schlag, 2023, pp. 168–169).

Within this framework, states have various obligations, both positive and negative. It is essential to build well-developed regulatory frameworks for content moderation that safeguard internet users’ ability to exercise and enjoy their human rights, including the rights of those victimized by illegal content (Council of Europe, 2021). On the other hand, contractual agreements (also known as “terms of service”, “community guidelines”, etc.) specify for online platforms the bounds of what is and is not allowed; these documents are typically applied within the framework of self- and co-regulatory rules (Council of Europe, 2021).

Nonetheless, online platforms retain significant flexibility to design and modify their moderation systems within the limits of the existing rules. Platforms can determine the degree to which, and the contexts in which, their systems are automated, as well as whether moderation decisions are made exclusively by human content moderators or are supplemented by automated content moderation systems (Enarsson, 2024, p. 3).
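
To illustrate this discretion in concrete terms, the following minimal sketch (in Python, with hypothetical names and thresholds that are not drawn from the article or from any platform’s actual system) shows one possible hybrid arrangement in which an automated classifier resolves clear-cut cases and routes uncertain ones to human moderators.

```python
# Illustrative sketch only: a hybrid moderation pipeline in which an automated
# classifier decides clear-cut cases and escalates uncertain ones to human review.
# All names and thresholds are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class ModerationDecision:
    action: str       # "remove", "keep", or "escalate"
    decided_by: str   # "automated" or "human review queue"
    rationale: str

def moderate(text: str,
             classify: Callable[[str], float],
             remove_threshold: float = 0.95,
             keep_threshold: float = 0.05) -> ModerationDecision:
    """Decide automatically only when the classifier is confident; otherwise escalate."""
    score = classify(text)  # estimated probability that the content is unlawful
    if score >= remove_threshold:
        return ModerationDecision("remove", "automated", f"high confidence ({score:.2f})")
    if score <= keep_threshold:
        return ModerationDecision("keep", "automated", f"low risk ({score:.2f})")
    return ModerationDecision("escalate", "human review queue", f"uncertain ({score:.2f})")
```

Whether such thresholds are set aggressively or cautiously is precisely the kind of design choice that, under the current rules, remains largely in the hands of the platform.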

The regulation of content on online platforms undoubtedly affects the protection of human rights, especially freedom of expression, which has led the ECtHR to establish important criteria in this domain through its jurisprudence. To understand the implications of the ECtHR’s case law, it is important to briefly review the concept of platform liability within the European legal framework, which is presented in the initial section of the paper. In the following part, the authors analyze the relevant jurisprudence of the ECtHR. The aim of the paper is to clarify the main standards of the ECtHR’s approach to the human rights implications of online platforms’ liability for content moderation and, where possible, to highlight their limitations.

2.    Liability of Online Platforms for Content Moderation within the European Legal Framework

The Digital Services Act (DSA) establishes a uniform framework of rules regarding the responsibilities and accountability of providers of intermediary services and online platforms, including social media and marketplaces. The objective is to achieve effective harmonization of the legal framework across EU Member States and to ensure a high level of protection for all Internet service users (Korpisaari, 2022, p. 353).

According to Article 6 of the DSA, a hosting service provider is exempt from liability for content uploaded by third parties as long as it “upon obtaining such knowledge or awareness acts expeditiously to remove or to disable access to the information” (Regulation (EU) 2022/2065). This process is known as “notice-and-take-down” because, although a hosting service provider is generally exempt from liability, it is required to act upon receiving notification of illegal activity in order to maintain its immunity. Furthermore, Article 8 of the DSA stipulates that providers are not subject to a general obligation to monitor the information they transmit or store (Tuchtfeld, 2023a).

The DSA adopts an extremely broad legal definition of illegal content. Examples of this category include depictions of child sexual abuse, stalking, unlawful non-consensual sharing of private images, hate speech, and terrorist propaganda. Any content that violates EU law or the law of any Member State qualifies as illegal. The DSA states that this type of content should correspond to behavior that is also illegal offline. However, this still leaves open the question of what exactly qualifies as illegal content. One way the DSA addresses this issue is the notice and action mechanism in Article 16, which requires all hosting service providers to put in place procedures through which individuals or organizations can report content that they consider to be illegal (Enarsson, 2024, p. 5).
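
As a rough illustration of how such a notice and action mechanism might be organized in practice, the sketch below (a simplification with hypothetical names and structures, not the DSA’s own specification) queues user notices, has them assessed, and logs the resulting action.

```python
# Illustrative sketch only: a simplified "notice and action" workflow in which
# notices of allegedly illegal content are queued, assessed, and acted upon,
# with each decision logged. Names and data structures are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable, Dict, List

@dataclass
class Notice:
    content_id: str
    reporter: str
    reason: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class NoticeAndActionQueue:
    def __init__(self) -> None:
        self.pending: List[Notice] = []
        self.log: List[Dict[str, str]] = []

    def submit_notice(self, notice: Notice) -> None:
        """Entry point for individuals or organizations reporting allegedly illegal content."""
        self.pending.append(notice)

    def process_all(self, assess: Callable[[Notice], bool]) -> None:
        """Act on each pending notice; 'assess' stands in for the provider's legal review."""
        while self.pending:
            notice = self.pending.pop(0)
            action = "removed" if assess(notice) else "kept"
            self.log.append({
                "content_id": notice.content_id,
                "action": action,
                "reason": notice.reason,
                "processed_at": datetime.now(timezone.utc).isoformat(),
            })
```

The crucial legal question of how the provider actually assesses a notice is deliberately left as a placeholder here, since that is exactly the point on which the definition of illegal content remains open.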

It seems that the main component of the European regulatory framework on online platform liability for content moderation is the delicate balancing act between restricting, reducing, or eliminating the dissemination of unlawful or undesirable content, on the one hand, and protecting human rights by preserving a forum for information sharing, on the other. How this balance is struck will be further examined through the case law of the ECtHR.

3.    Jurisprudence of the ECtHR

The European Convention for the Protection of Human Rights and Fundamental Freedoms (hereinafter: the ECHR), as the principal human rights instrument in Europe and one of the most sophisticated systems of human rights protection globally, offers guidance on how liability of online platforms for content moderation can be implemented in accordance with its standards.

The development of the relevant standards began with Delfi AS v. Estonia, the first case in which the ECtHR addressed the liability of online platforms for user comments, alongside Magyar Tartalomszolgáltatók Egyesülete (“MTE”) and Index.hu Zrt v. Hungary, which concerned a comparable scenario. Despite the different outcomes of the two cases, both concern liability for offensive or illegal statements posted on the platforms.

In Delfi, the ECtHR determined that Estonia did not violate Article 10 of the ECHR, the right to freedom of expression, by holding an online news platform liable for remarks posted by its readers that included illegal content. The case related to the comment section of a national online newspaper; the liability of social media sites was not considered by the ECtHR in this case. The ECtHR found that the award of damages was prescribed by Estonian law and pursued the legitimate aim of protecting the rights and reputation of others. In addition, the ECtHR concluded that Delfi could be regarded as a “publisher” or “discloser” of the remarks, since it had created the electronic framework that allowed the defamatory statements and should have been aware that the specific article could provoke numerous hostile, threatening comments. The award of damages of EUR 320 was held to be “necessary in a democratic society” and did not constitute a breach of the ECHR. The ECtHR emphasized that statements disseminated through conventional print or broadcast media may be less harmful than defamatory content published online, where it might persist indefinitely (Tuchtfeld, 2023b).

In 2023, the ECtHR established novel standards in this field in Sanchez v. France. In Sanchez, the ECtHR determined that the conviction of a politician for failing to promptly remove remarks made by others on his personally managed public Facebook page did not constitute a breach of his right to freedom of expression. The applicant argued that his right to freedom of expression had been infringed when the French courts found him guilty of failing to remove offensive remarks from his own Facebook wall. The remarks in question were discriminatory and hateful, inciting violence or hatred toward the Muslim community.

The ECtHR emphasized that the remarks were unequivocally illegal and directed against a particular group. Moreover, the ECtHR asserted that an essential component of a democratic and pluralistic society is rooted in tolerance and the recognition of the equal dignity of every individual. In the realm of electoral discourse the latitude for expression is considerable; however, politicians are expected to take a proactive stance against hate speech. The ECtHR concluded that the applicant had assumed a responsibility to oversee the material shared on his Facebook wall when he decided to make it public and permit his friends to contribute comments. Moreover, his political standing required an even higher degree of vigilance on his part. Notwithstanding this, certain remarks had remained visible for six weeks. The ECtHR determined that, having regard to the margin of appreciation, there were relevant and sufficient reasons for the applicant’s conviction (Korpisaari, 2022, pp. 369–370).

In this case, the ECtHR introduced the concept of shared liability among all actors. In addition, the ECtHR emphasized that social media service providers must exercise some form of moderation, whether automated or not, in the following terms:

“The Court first observes that there can be little doubt that a minimum degree of subsequent moderation or automatic filtering would be desirable in order to identify clearly unlawful comments as quickly as possible and to ensure their deletion within a reasonable time, even where there has been no notification by an injured party, whether this is done by the host itself (in this case Facebook), acting as a professional entity which creates and provides a social network for its users, or by the account holder, who uses the platform to post his or her own articles or views while allowing other users to add their comments” (Sanchez v. France, 2023, par. 190).
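
To make the practical meaning of “subsequent moderation or automatic filtering” more tangible, the following sketch (purely illustrative; the flagging function and the time window are hypothetical placeholders rather than a standard derived from the judgment) re-scans already published comments and removes those flagged as clearly unlawful, warning when a comment has stayed visible longer than a target period.

```python
# Illustrative sketch only: periodic "subsequent moderation" of published comments.
# Clearly unlawful comments are removed even without a notification, and a warning
# is printed if one stayed visible beyond a target window (hypothetical value).
from datetime import datetime, timedelta, timezone
from typing import Callable, Dict, List

def sweep_comments(comments: List[Dict],
                   is_clearly_unlawful: Callable[[str], bool],
                   max_visible: timedelta = timedelta(hours=24)) -> List[Dict]:
    """Return the comments that may stay online; drop those flagged as clearly unlawful."""
    now = datetime.now(timezone.utc)
    kept = []
    for comment in comments:
        if is_clearly_unlawful(comment["text"]):
            visible_for = now - comment["posted_at"]
            if visible_for > max_visible:
                print(f"comment {comment['id']} was visible for {visible_for}, "
                      f"exceeding the {max_visible} target")
            continue  # the unlawful comment is not kept
        kept.append(comment)
    return kept
```

How aggressive such a sweep should be, and who bears the cost of its errors, is precisely what the subsequent criticism of the judgment turns on.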

The ECtHR’s decision has received some criticism for not aligning with EU rules. If the ECtHR were suggesting that the ECHR requires liability to be imposed on those who store third-party remarks and fail to anticipate them, this would contradict the expectations arising from EU law. In Delfi, the ECtHR merely stated that States may experiment with various liability models in certain contexts involving sensitive criminal content, such as hate speech. For those Council of Europe members that are also subject to EU legislation, however, which imposes a set of liability exemptions through the e-Commerce Directive (ECD) and the DSA, this leeway does not exist. It seems that this ruling is ambiguous enough to leave room for a variety of interpretations (Husovec et al., 2024, pp. 24–25).

In the same year, the ECtHR was asked to determine, in Zöchling v. Austria, whether the publisher of an online news portal may be held liable for hate speech in a news article’s comment section. An article concerning the applicant, a well-known Austrian journalist, was published on a right-wing news portal in Austria. The comments posted in response by logged-in users included serious insults and death threats. The news portal removed the comments within hours of the applicant’s request, blocked the users, and provided the applicant with their e-mail addresses a few days later. However, because the e-mail providers would not disclose the users’ names or postal addresses, the attempt to identify them was unsuccessful. The Vienna Court of Appeal determined that, by promptly removing the contested comments upon the applicant’s request, the platform had complied with its duty of due diligence (Tuchtfeld, 2023a).

In this case, the ECtHR expressly restated the conclusions from Delfi and Sanchez, holding that at least some automatic screening or subsequent moderation would be preferable. This could have wider ramifications than the DSA, because it would require platforms or their users to monitor material and take down obviously illegal remarks even in the absence of any notification from an injured party. Taken together, this indicates that both the DSA’s framework and the ECtHR’s case law call for the moderation, or even automatic filtering, of certain content, primarily hate speech or calls for violence (Enarsson, 2024, p. 11).

The ECtHR highlighted that, because previous articles concerning the applicant on the platform had already generated threatening remarks, the platform could have anticipated further violations. The ECtHR therefore concluded that the absence of any balancing of the competing interests at stake violated the State’s procedural obligations under Article 8 of the ECHR.

It seems that the ECtHR’s demand for automatic filtering has the potential to transform Europe’s long-standing platform liability framework. However, attention should be drawn to the fact that this was not the central element of the ECtHR’s reasoning. Furthermore, the demand was made only in cases concerning news portals. It could therefore be argued that the ECtHR merely established sector-specific standards for news portals rather than a comprehensive new framework for platform liability. On the other hand, this would be diametrically opposed to the approach of the EU’s DSA, under which a news portal’s comment section is an example of an ancillary feature that should not, by itself, subject the service to the legal obligations imposed on online platforms. The future stance of the ECtHR on platform liability, and whether it will adopt a more skeptical view of automated filtering technologies, remains uncertain. In the meantime, the present approach hinders the development of alternatives to the dominant social media platforms, relies on private surveillance, and opens the door to methods of censorship (Tuchtfeld, 2023a).

4.    Conclusion

Content moderation serves as a tool for tackling various challenges, such as the fight against cybercrime and other online offenses, as well as content that may be offensive to specific audiences. A closer look at these issues shows that, although a universal solution is often possible, it is rarely the preferred one. Removing an online post restricts a user’s human rights, irrespective of the problem that content moderation aims to address.

Consequently, content moderation must be carried out in a predictable, suitable, necessary, and reasonable manner. Furthermore, states should not assume that online platforms are neutral or adequately equipped to determine the legality of content. It is essential to recognize that no content filtering system is infallible and that such limitations may arise from governmental regulation, private decisions by internet intermediaries, or a combination of the two.

The most evident conclusion of this article is the uncertainty of the ECtHR’s demands regarding the liability of online platforms, which are expected to protect users’ fundamental rights while carrying out appropriate content moderation. The ECtHR has recently made statements regarding content moderation in the digital domain that give cause for concern. The analysis of the ECtHR’s rulings conducted here may be interpreted as establishing a positive obligation for states to require platforms to monitor their systems for unlawful content posted by third parties, which would essentially contradict the legal framework developed by the European Union. On the other hand, as previously indicated, the analyzed rulings appear sufficiently ambiguous to allow for multiple interpretations. In the meantime, we must await the ECtHR’s future position on platform liability and a possibly more precise approach toward automated filtering technologies.

After all, it appears that the issue of liability regarding content moderation on online platforms reflects an ongoing struggle for optimal democratic governance within a society increasingly dominated by these platforms.

ACKNOWLEDGEMENT

The paper is the result of the research project “Progressive Development of Law in the Modern Digital Society” [“Progresivni razvoj prava u savremenom digitalnom društvu”], funded by the Provincial Secretariat for Higher Education and Scientific Research (decision no. 142-451-3484/202302, dated November 21, 2023).

Mladenov Marijana

Univerzitet Privredna akademija u Novom Sadu, Pravni fakultet za privredu i pravosuđe u Novom Sadu, Novi Sad, Srbija

Staparski Tamara

Univerzitet Privredna akademija u Novom Sadu, Pravni fakultet za privredu i pravosuđe u Novom Sadu, Novi Sad, Srbija

ODGOVORNOST ONLAJN PLATFORMI ZA MODERIRANJE SADRŽAJA IZ UGLA EVROPSKOG SUDA ZA LJUDSKA PRAVA – IZAZOVI I NEDAVNA DOSTIGNUĆA

APSTRAKT: Iako ne predstavljaju novi fenomen, onlajn platforme su postigle značajan ekonomski i društveni značaj tokom protekle decenije, a javni diskurs o njihovim odgovornostima i obavezama dostiže izuzetan nivo. Onlajn platforme u velikoj meri doprinose olakšavanju razmene informacija i pristupa istim, omogućavajući široku distribuciju svih vrsta sadržaja, bez obzira na njihovu legalnost. Regulisanje sadržaja na onlajn platformama nesumnjivo utiče na zaštitu ljudskih prava, posebno na slobodu izražavanja, što je dovelo do toga da Evropski sud za ljudska prava (ESLJP) kroz svoju jurisprudenciju utvrdi važne kriterijume u ovoj oblasti. Da bismo razumeli implikacije sudske prakse ESLJP, važno je sažeto prikazati koncepte odgovornosti platforme u okviru evropskog pravnog okvira, koji je razrađen u početnom delu rada. U narednom delu autori analiziraju relevantnu jurisprudenciju ESLJP. Cilj rada jeste da razjasni glavne standarde pristupa ESLJP prema implikacijama odgovornosti onlajn platformi za moderiranje sadržaja na ljudska prava i potencijalno istakne njihova ograničenja.

Ključne reči: odgovornost onlajn platformi, moderiranje sadržaja, Akt o digitalnim uslugama, Evropski sud za ljudska prava.

References

  • Council of Europe. (2021). Freedom of Expression: New Guidance Note on Content Moderation. Downloaded 2024, July 15 from https://www.coe.int/en/web/freedom-expression/-/guidance-note-on-content-moderation.
  • Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ('Directive on electronic commerce'), OJ L 178 of 17.07.2000.
  • Enarsson, T. (2024). Navigating hate speech and content moderation under the DSA: insights from ECtHR case law. Information & Communications Technology Law, pp. 1–18. https://doi.org/10.1080/13600834.2024.2395579.
  • European Parliament. (2021). Liability of online platforms. Downloaded 2024, July 15 from https://www.europarl.europa.eu/RegData/etudes/STUD/2021/656318/EPRS_STU(2021)656318_EN.pdf.
  • Frosio, G., & Geiger, C. (2023). Taking fundamental rights seriously in the Digital Services Act's platform liability regime. European Law Journal, 29(1-2), pp. 31-77. https://doi.org/10.1111/eulj.12475.
  • Husovec, M., Grote, T., Mazhar, Y., Mikhaeil, C., Escalona, H. M., Kumar, P. S., & Sreenath, S. (2024). Grand confusion after Sanchez v. France: Seven reasons for concern about Strasbourg jurisprudence on intermediaries. Maastricht Journal of European and Comparative Law. https://doi.org/10.1177/1023263X241268436.
  • Korpisaari, P. (2022). From Delfi to Sanchez–when can an online communication platform be responsible for third-party comments? An analysis of the practice of the ECtHR and some reflections on the Digital Services Act. Journal of Media Law, 14(2), pp. 352-377. https://doi.org/10.1080/17577632.2022.2148335.
  • Lučić, S. (2022). The Operator’s Liability for Copyright Violations Committed by Users of its Platform. Pravo teorija i praksa, 39(4), pp. 109-123. https://doi.org/10.5937/ptp2204109L.
  • Mladenov, M., & Staparski, T. (2022). Human Rights Approach to Internet Access with a Special Emphasis on the Case-Law of the European Court of Human Rights. Revija za evropsko pravo, 24(1), pp. 23-38. Downloaded 2024, July 15 from http://revija.pravoeu.org/index.php/REP/article/view/54.
  • Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277 of 27.10.2022.
  • Schlag, G. (2023). European Union’s regulating of social media: A discourse analysis of the digital services act. Politics and Governance, 11(3), pp. 168-177. https://doi.org/10.17645/pag.v11i3.6735.
  • Tuchtfeld, E. (2023a). Be Careful What You Wish For: The Problematic Desires of the European Court of Human Rights for Upload Filters in Content Moderation, VerfBlog, Downloaded 2024, August 17 from https://verfassungsblog.de/be-careful-what-you-wish-for/.
  • Tuchtfeld, E. (2023b). Case law on content moderation and freedom of expression. Columbia Global Freedom of Expression. Downloaded 2024, August 19 from https://globalfreedomofexpression.columbia.edu/wp-content/uploads/2023/06/GFoE_Content-Moderation.pdf.
  • Turillazzi, A., Taddeo, M., Floridi, L., & Casolari, F. (2023). The digital services act: an analysis of its ethical, legal, and social implications. Law, Innovation and Technology, 15(1), pp. 83–106. https://doi.org/10.1080/17579961.2023.2184136.