The operator’s liability for copyright violations committed by users of its platform

Author: Sonja Lučić

Journal: Pravo - teorija i praksa

Section: Review paper

Issue: vol. 39, no. 4, 2022.


The Court of Justice of the European Union has recently delivered a judgment in the joined cases C-682/18 (YouTube) and C-683/18 (Cyando) concerning the operator’s liability for copyright infringements committed by users of its platform within the meaning of Art. 3, paragraph 1 of Directive 2001/29 on the information society. The cases before the Luxembourg Court concerned two specific platforms: the popular video-sharing platform YouTube and the file hosting and sharing platform Uploaded. The judgment was delivered almost a year after the Advocate General’s opinion had been published. In the meantime, as of June 7th 2021, a new liability regime for copyright infringement came into effect for certain internet platforms (Article 17 of Directive 2019/790 on copyright in the digital single market). Although the judgment was delivered two weeks after Art. 17 of Directive 2019/790 had become applicable, it is of great importance, especially considering that, on the one hand, not all EU member states have implemented Art. 17 of Directive 2019/790 and, on the other hand, the EU is trying to modernize the European rules on platform regulation through the Digital Services Act. In the paper, the author, after discussing Art. 17 of Directive 2019/790, analyzes the judgment in the joined cases YouTube and Cyando, as well as the judgment of the Court of Justice of the EU on Poland’s action for annulment of Art. 17 of Directive 2019/790. The analysis of the judgments shows that copyright regulations should strike a balance between the protection of rights holders on the one hand and the exercise of fundamental rights, such as freedom of expression, on the other.


Keywords: video-sharing platforms, the operator’s liability, copyright, Directive 2019/790, filtering

Short address: https://sciup.org/170202166

IDR: 170202166   |   DOI: 10.5937/ptp2204109L

1.    Introduction

Digital transformation is one of the central challenges of the 21st century. An essential part of this digital transformation is the emergence of the “platform economy” (Busch, 2019, p. 788). Digital platforms exist in many areas: online commerce platforms, travel and mobility platforms, and social networks. Today’s global market is dominated by powerful platforms such as Amazon, Facebook, Google, and Twitter. Even smaller, nationally oriented platforms have a dominant role in certain countries. As intermediaries, all these platforms provide access to information, allow transactions, and enable far-reaching interactions. As a rule, a distinction is drawn between platforms aimed at end consumers (business-to-consumer, B2C) and platforms for cooperation between companies (business-to-business, B2B). Platforms have their pros and cons (Ohly, 2015, p. 308). On the one hand, they open access to a large amount of diverse content and create various opportunities for user participation. On the other hand, platforms, as intermediaries in the digital world, create the preconditions for non-compliance with legal rules, i.e. for the mass infringement of rights by third parties.

In addition to the existing online platforms, new ones are constantly appearing. The spectrum of services offered by platforms can be divided into several categories: online marketplaces (offering goods from the host provider or from users), rating portals, multimedia portals (especially for streaming and file sharing), social networks, and platforms with user-generated content, for example blogs (Aras, 2020, p. 12). Where user-generated content violates the law, the question of liability arises in a specific way depending on the type of platform. This concerns infringements of trademark and competition law, as well as infringements of copyright and personality rights.

The European legislator and judicial practice have had a special role in balancing the interests of the opposing parties. At the beginning of the development of the commercial internet, the legislator intended to enable new business models that would not be undermined by strict liability rules (Ohly, 2015, pp. 310, 313). The development of electronic commerce in the information society was meant to increase employment, especially in small and medium-sized enterprises. In this sense, Directive 2000/31 on electronic commerce aimed to stimulate economic growth, the establishment of companies with new business models, and investment in European companies. Hence, in Arts. 12–14, the Directive provided the conditions under which an intermediary is not liable for information society services. In this way, the European legislator privileged mere conduit, caching, and hosting services with different contents. In Art. 15, the Directive also provided that information society service providers are under no general obligation to monitor information. However, this general legal framework created certain problems of interpretation.

Changes came with Directive 2001/29 on the information society and Directive 2004/48 on the enforcement of intellectual property rights. Both directives provided rules on the responsibility of platform operators, so that a person whose intellectual property rights have been infringed through a platform enjoys effective legal protection. Apart from these two directives, the proposal for a Regulation on preventing the dissemination of terrorist content online and the reform of Directive 2018/1808 on audiovisual media services are also important at the European level. This phase of legislative activity is characterized by the distinction between deletion and filtering obligations.

In the third phase of European legislative activity, the rights of users came to the fore. The fact that the legal obligations of platform operators result in an “outflow” of internet users has led to protests by “internet communities” against the legal framework of platform operators’ responsibility. Behind the terms “over-enforcement”, “over-blocking”, and “chilling effect” stands the view that in many cases even legitimate content falls victim to deletion (Becker, 2019, p. 636). If content on the platform is suspected of being illegal (including false positives), the platform operator will tend to delete it as a precaution in order to reduce the risk of its own liability. Such actions affect not only the freedom of communication of active users, but also the freedom of information of passive users. Accordingly, the question of how this over-enforcement and over-blocking could be remedied attracted growing attention (Specht, 2017, p. 114).

The proposals ranged, for example, from put-back claims (Raue, 2018, p. 961) to the establishment of user rights (Specht-Riemenschneider, 2020, p. 88), and further to state monitoring of platforms’ internal deletion and blocking procedures (Wagner, 2020, p. 447). The pinnacle of the regulation of platform operators’ responsibility is Art. 17 of Directive 2019/790, which guarantees internet users extensive rights and procedural protection. This provision concerns copyright and related rights, but it could also serve to guarantee the rights of internet users more generally (Wagner, 2020, p. 451).

2.    Article 17 of Directive 2019/790 on copyright in the digital single market

Directive 2019/790 on copyright and related rights in the digital single market was adopted in April 2019. EU member states had to implement it by June 7, 2021. Behind the reform and the related introduction of upload filters stood the aim of reconciling property rights with diversity of opinion. With this directive, the EU tried to preserve copyright protection while also adapting it to the development of the digital age. The use of technology affects the interpretation, application, and enforcement of copyright in various ways. Technologies such as upload filters, smart contracts, and technical protection measures currently represent the most important technical tools with an indirect impact on copyright.

Art. 17, paragraph 4 of Directive 2019/790 exempts service providers from liability only if three cumulative conditions are fulfilled. The first condition (Art. 17, para. 4a) is that the provider has made best efforts to obtain an authorization from the rights holders. The second condition (Art. 17, para. 4b) does not oblige platform operators to check whether all posted content violates other people’s copyrights; they only need to examine more closely the content about which they have been actively informed. In other words, if rights holders want to prevent the distribution of their works through online platforms, they must take action themselves and provide operators with information about the copyrighted content. In practice, most rights holders will not do this preventively, since such activity involves high economic costs.

The third cumulative condition (Art. 17, para. 4c) is the obligation of the service provider, after receiving a sufficiently substantiated notice from the rights holder, to act expeditiously to block access to the relevant content or to remove it from its website. To implement this obligation, service providers must check all uploaded content, i.e. verify whether a copyright infringement has already been reported in relation to it. An upload filter would be useful for performing this check. However, the Directive expressly stipulates that the application of Art. 17 must not lead to any general monitoring obligation (Art. 17, paragraph 8 of Directive 2019/790).
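To make the nature of this check concrete, here is a minimal Python sketch that compares each new upload against a reference list of works already notified by rights holders. It is an illustration under simplifying assumptions, not a description of any real system: the names and the reference list are invented, and exact SHA-256 matching stands in for the perceptual audio/video fingerprinting that production tools use so that altered re-uploads are still recognized.

```python
import hashlib

# Hypothetical reference list of fingerprints that rights holders have
# notified under Art. 17, para. 4(b)/(c). Real systems use perceptual
# fingerprints rather than exact hashes.
NOTIFIED_WORKS = {
    hashlib.sha256(b"<bytes of a notified recording>").hexdigest(): "A Winter Symphony (audio)",
}

def fingerprint(data: bytes) -> str:
    """Reduce uploaded content to a comparable fingerprint (here: SHA-256)."""
    return hashlib.sha256(data).hexdigest()

def check_upload(data: bytes) -> str:
    """Block an upload if it matches content already notified by a rights holder."""
    fp = fingerprint(data)
    if fp in NOTIFIED_WORKS:
        return f"blocked: matches notified work '{NOTIFIED_WORKS[fp]}'"
    return "accepted: no match against notified works"

print(check_upload(b"<bytes of a new upload>"))          # accepted
print(check_upload(b"<bytes of a notified recording>"))  # blocked
```

Such a check only scales because it runs automatically on every upload, which is precisely why critics argue that para. 4(c) presupposes filtering despite the prohibition of general monitoring in para. 8.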

In the new liability regime for online platforms, start-up companies have a special position (Wandtke, 2019, p. 1841). If service providers have been available for less than three years and have an annual turnover of less than 10 million euros, they do not have to fulfill the obligation under Art. 17, paragraph 4(b) of Directive 2019/790. On the one hand, however, such companies are still obliged to make every effort to obtain the permission of the rights holders. On the other hand, they remain bound by the part of the obligation under Art. 17, paragraph 4(c) that requires platform operators to act expeditiously to block access to infringing content or to remove it from the website.

Such start-ups are thus practically exempt from implementing upload filters, as they are obliged neither to review uploaded content preventively nor to prevent the re-upload of infringing material.

It should be noted, however, that once the three-year period has elapsed, platforms are subject to full liability under Article 17, which means that if they want to avoid any liability, they must cumulatively fulfill all the conditions of Art. 17, paragraph 4 of Directive 2019/790 (Dreier, 2019, p. 771).

New companies that have been available for less than three years and have a turnover of less than 10 million euros, but whose average number of monthly unique visitors exceeds five million, likewise do not fall under the filtering obligation of Art. 17, paragraph 4(b); however, under Art. 17, paragraph 6, they must additionally make best efforts to prevent further uploads of works that rights holders have notified.
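As a rough illustration of this tiered regime (my own schematic reading of Art. 17, paras. 4 and 6, using the thresholds described above), the following sketch maps a provider’s age, turnover, and audience onto the obligations that apply:

```python
# Illustrative mapping of the tiered obligations in Art. 17, paras. 4 and 6
# of Directive 2019/790, using the thresholds described in the text above.
def applicable_obligations(years_available: float,
                           annual_turnover_eur: float,
                           monthly_unique_visitors: int) -> set[str]:
    # Every provider must seek authorization and honor takedown notices.
    obligations = {"best efforts to obtain authorization (para. 4a)",
                   "expeditious takedown on notice (para. 4c, first part)"}
    is_startup = years_available < 3 and annual_turnover_eur < 10_000_000
    if not is_startup:
        # Full regime: preventive blocking and stay-down also apply.
        obligations |= {"best efforts to block notified works (para. 4b)",
                        "best efforts to prevent re-uploads (para. 4c, stay-down)"}
    elif monthly_unique_visitors > 5_000_000:
        # Large-audience start-ups must additionally ensure stay-down (para. 6).
        obligations.add("best efforts to prevent re-uploads (para. 6)")
    return obligations

# A two-year-old platform with 5M EUR turnover but 6M monthly visitors:
print(applicable_obligations(2, 5_000_000, 6_000_000))
```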

3.    Action for annulment of Art. 17 of Directive 2019/790

EU copyright reform has always been controversial. Nevertheless, despite great resistance, Directive 2019/790 was adopted on April 15, 2019. Member states were given a deadline of June 7, 2021 to implement the provisions of this Directive. The most controversial provision of Directive 2019/790 is Art. 17, which for many EU member states embodies the danger of upload filters and thus the restriction of the free internet and of freedom of expression. Poland shared this view and therefore filed an action in May 2019 for the annulment of Art. 17 of Directive 2019/790.

The action was in fact directed against certain parts of Art. 17 of Directive 2019/790 (para. 4(b) and (c)), with the aim of proving their incompatibility with fundamental rights, especially with the right to freedom of expression and information guaranteed by Art. 11 of the Charter of Fundamental Rights (case C-401/19). Poland argued that these provisions require online content-sharing service providers to install tracking and filtering technology, which, in the Polish government’s view, would prevent lawful posting and violate the essence of freedom of expression. Although only parts of Art. 17 of Directive 2019/790 were challenged, the action has a wider significance, because it specifies that “if the Court finds that the challenged provisions cannot be deleted from Article 17 of Directive 2019/790 without fundamentally changing the rules contained in the remaining provisions of that article, the Court should annul Article 17 of the Directive as a whole”. This is important, as Article 17 as a whole raises significant concerns about its compatibility with the rights of the EU Charter of Fundamental Rights, but also with fundamental principles of EU law such as proportionality and legal certainty.

One of the main arguments put forward by the Polish government in the action is that Art. 17, para. 4 of Directive 2019/790 and the obligations it imposes leave individual platform operators no choice but to install a preventive control mechanism based on the automated filtering of content posted by users in order to avoid liability. A hearing held before the Court of Justice of the European Union on 10 November 2020 raised several questions and also exposed different understandings of how Article 17 should work, even among supporters of the provision.

In mid-July 2021, the Advocate General’s opinion in this case was published, setting out the limits of permissible filtering of content uploaded by users on the internet. This opinion could be crucial for understanding how the much-contested provision of Art. 17 of Directive 2019/790 is to be applied. Immediately before the publication of the Advocate General’s opinion, the European Commission published guidance on the disputed provision of Directive 2019/790. In the Advocate General’s view, the controversial Art. 17 of the EU Directive on copyright in the digital single market is compatible with freedom of expression and information. A rule that requires platforms either to enter into licensing agreements with rights holders or to check content before publication in order to avoid liability restricts freedom of expression, but it meets the requirements of the EU Charter of Fundamental Rights.

In a judgment dated April 26, 2022, the Court of Justice of the EU followed the Advocate General’s opinion and rejected Poland’s action. In its ruling, the Court first stated that platforms are indeed required to use so-called upload filters in order to absolve themselves of liability. The prior review and filtering that this entails, however, tend to restrict an important means of disseminating content on the internet. This liability regime therefore also restricts platform users’ rights to freedom of expression and information. Yet the Court held this limitation to be proportionate, although it also recognized the danger of excessive blocking. In this regard, the Advocate General had pointed out in his opinion that providers of online sharing services, such as YouTube, Instagram, or TikTok, could tend to systematically block the upload of all content reproducing the subject matter specified by rights holders, in order to avoid any risk of liability towards them. In that case, however, content containing permitted uses of copyrighted material, i.e. content that may legitimately be shared, could also be blocked. The use of automatic content-recognition tools increases this risk, because filters are unable to understand the context in which copyrighted material is exceptionally lawfully reproduced. According to the Luxembourg Court, however, the legislature has set clear and precise limits in order to prevent such excessive blocking. In any event, a filter system that does not sufficiently distinguish between impermissible and permitted content is incompatible with the right to freedom of expression and information. In its reform, the Union legislature also gave users of online platforms the right to certain lawful uses, for example through exceptions for the purposes of parody, caricature, or pastiche. For these rights to remain effective, sharing services must not pre-emptively block content across the board.

Within the meaning of Art. 17 of Directive 2019/790, online platforms are responsible for the illegal posting of protected works. However, providers are exempt from liability if they actively monitor uploaded content. This means that online platforms must use so-called upload filters that recognize protected works, i.e. prevent the uploading of copyrighted content.

4.    Cases C-682/18 and C-683/18

Since its inception, the Internet has challenged many basic principles of copyright. One of the most controversial issues related to the global digital network is the question of the liability of third-party intermediaries for copyright infringement by Internet users. The Court of Justice of the EU had the opportunity to rule on this issue in the joined cases C-682/18 and C-683/18.

In case C-682/18, YouTube users uploaded to the portal private recordings of concerts and music by the British artist Sarah Brightman from the album “A Winter Symphony”. However, the German music producer Frank Peterson had concluded an exclusive worldwide contract with this artist in 1996 regarding the use of audio and video recordings of her concerts. Since the private recordings of concerts and music were uploaded to YouTube without Peterson’s consent, the producer sought damages from YouTube in his lawsuit, because the uploaded content could be downloaded from this platform.

In the second case (C-683/18), the Dutch specialist publisher Elsevier sued the Swiss company Cyando, which operates the file hosting and sharing platform “Uploaded”, accessible via the websites uploaded.net, uploaded.to and ul.to. This platform offers all internet users free storage space for uploading files regardless of their content. Users of this platform uploaded the works “Gray’s Anatomy for Students”, “Atlas of Human Anatomy” and “Campbell-Walsh Urology” to the portal.

Both cases came before the Court of Justice of the EU. By initiating the preliminary ruling procedure before this court, the German Federal Court of Justice sought to clarify to what extent the operators of internet platforms are liable when third parties upload copyright-protected works to such platforms without authorization.

The Luxembourg Court examined the platform operators’ liability on the basis of the rules that were applicable at the time of the uploads in question. Accordingly, the disputes are governed by Directive 2001/29 on copyright, Directive 2000/31 on electronic commerce, and Directive 2004/48 on the enforcement of intellectual property rights. On the basis of these rules, the Luxembourg judges concluded that providers such as YouTube are generally not liable for the conduct of their users unless they were aware of the illegal content; in that case, they are obliged to delete or block the content.

The judgment in the joined cases C-682/18 and C-683/18 refers to the legal situation before the introduction of the special liability regime in Art. 17 of Directive 2019/790. Nevertheless, the judgment is significant because Art. 17 introduces direct liability only for a certain subset of online platforms, as its scope is limited to online content-sharing service providers. For platforms that host third-party content but do not qualify as online content-sharing service providers, the issue of liability under Art. 3(1) of Directive 2001/29 and its interpretation in this judgment remain relevant. The decision may therefore have a direct impact on one of the parties involved, the file hosting service Uploaded. Uploaded will probably not qualify as an online content-sharing service provider, either because it does not make large amounts of copyrighted material available to the public or because it does not compete with license-based streaming services. The reasoning of the judgment supports this argument, as the Court considered that making available a large number of works uploaded by its users is not the main functionality of Uploaded.

5.    Conclusion

The Directive on copyright in the digital single market was adopted in April 2019. Since its adoption, its content, especially Art. 17, which regulates the liability of online platforms, has caused much controversy. Under this provision, online platforms are liable when unlicensed copyrighted content is shared and displayed through their services by users.

To avoid liability, online service providers must obtain authorization from rights holders and content creators. In the absence of such authorization, service providers will be held liable for infringing content unless they are able to demonstrate that (a) they have made every effort to obtain authorization from the rights holders, (b) they have made sufficient efforts to ensure the unavailability of specific works and other subject matter for which the rights holders have provided them with the relevant and necessary information, and (c) they have acted expeditiously, after receiving a sufficiently reasoned notice from the rights holders, to disable access to the reported works or other protected subject matter or to remove them from their websites.

Immediately after the adoption of the Directive, Poland filed a lawsuit for annulment of Art. 17 of the Directive before the Court of Justice of the EU, considering that this provision contradicts Art. 11 of the Charter of Fundamental Rights of the EU, which regulates the right to freedom of expression and information. Namely, in order to comply with Article 17, service providers must use tools that enable automatic pre-filtering of content. Poland considered that the imposition of such preventive monitoring measures on providers of content-sharing services on the Internet represented a restriction of the right to freedom of expression and information. Moreover, such measures would lead to excessive blocking of user content, which can only be reversed if the user in question decides to file a complaint and seek redress. The court, however, ultimately rejected Poland’s claim for annulment of Art. 17 of Directive 2019/790.

An upload filter is software that classifies and processes certain content. Its task is to stop a specific violation of the law once it receives a sufficiently specific indication of that violation, and to take precautionary measures to ensure that no further similar violation occurs. By entering appropriate search terms, the filtering software can find suspicious cases, which can then be checked manually if necessary. Comprehensive manual review of uploaded content, by contrast, is expensive and practically impossible: around 400 hours of video material, for example, are uploaded to YouTube every minute. Given this volume of uploaded content, no verification other than automated filtering is feasible.
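The division of labor described above, automated flagging followed by manual review, can be sketched as follows. The example is purely illustrative (the data, the search terms, and the function names are invented), and its crudeness also illustrates the false-positive problem discussed in this paper: keyword matching cannot tell a licensed recording or a parody from an infringing copy.

```python
# A minimal, illustrative sketch of keyword-based pre-filtering: automated
# software flags suspicious uploads, and only the flagged subset is routed
# to (expensive) manual review. All names and data are hypothetical.
from dataclasses import dataclass

@dataclass
class Upload:
    upload_id: int
    title: str
    description: str

# Search terms a rights holder might register for its catalogue.
SEARCH_TERMS = {"a winter symphony", "sarah brightman"}

def flag_for_review(uploads: list[Upload]) -> list[Upload]:
    """Return only those uploads whose metadata matches a registered term."""
    flagged = []
    for upload in uploads:
        text = f"{upload.title} {upload.description}".lower()
        if any(term in text for term in SEARCH_TERMS):
            flagged.append(upload)
    return flagged

uploads = [
    Upload(1, "My cat video", "funny cat"),
    # Flagged even if it were a licensed upload or a parody: a false positive.
    Upload(2, "Sarah Brightman live 2008", "full concert recording"),
]
for upload in flag_for_review(uploads):
    print(f"Upload {upload.upload_id} queued for manual review: {upload.title}")
```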

Directive 2019/790 does not explicitly mention upload filters. However, based on the text of the Directive and the current state of the art, there is actually no alternative to their use.

Directive 2019/790, therefore, introduced novelties regarding the liability of some platforms in the European Union. Under the safe harbor rules of the E-Commerce Directive, intermediaries in the EU were protected from liability for the actions of their users committed through their services, provided they had no knowledge of them. It is clear from the provisions of the E-Commerce Directive that intermediaries cannot be obliged to monitor all communications of their users and to install general filtering mechanisms for this purpose. The EU Court of Justice has confirmed this in a number of cases, among other things because such filtering would limit the fundamental rights of platform operators and of users of intermediary services.

Twenty years later, the regime for online intermediaries in the EU has fundamentally changed with the adoption of Article 17 of Directive 2019/790. For certain categories of online intermediaries, called “online content-sharing service providers”, the uploading of infringing content by their users now results in direct liability, and they are required to use “best efforts” to obtain authorization for such uploading. In addition, online content-sharing providers must use their best efforts to ensure that content for which they have not obtained authorization is not available on their services. It is not yet clear how online content-sharing providers can comply with this obligation. It seems inevitable, however, that they will need to install measures such as automatic filtering (so-called “upload filters”). Given the scope of the obligation, there is a real danger that the measures taken by online content-sharing providers to fulfill it will amount to the expressly prohibited general monitoring. What seems certain, however, is that automated filtering, whether of a general or specific nature, cannot adequately distinguish between illegitimate and legitimate uses of content (for example, content covered by copyright exceptions).

Copyright regulations should strike a balance between the protection of rights holders, on the one hand, and the exercise of fundamental rights, such as freedom of expression, on the other. Regulations that impose obligations or incentives to filter, block, or monitor content on the internet can impede the freedom to share information. Such obligations limit unauthorized but lawful uses of copyrighted works, uses authorized by open licenses, and the use of works in the public domain. In addition, such obligations may disadvantage small and non-profit platforms that lack the resources to deploy content-recognition systems.

Although Poland’s action for annulment of Art. 17 of the Directive was rejected, member states can use the latitude the Directive leaves for implementation. In particular, the hitherto non-legally-binding recitals should be incorporated into national regulations. Member states can make it clear there that certain platforms are exempt from the filtering obligation. Mandatory filtering for younger and financially weaker platforms would create a barrier to their market entry, whereas Directive 2019/790 aims to promote innovation in the digital single market. Achieving these goals depends on many factors, especially on the clarity of implementation.

Lučić Sonja

Pravni fakultet, Univerzitet u Kragujevcu, Srbija

References

  • Aras, A. (2020). “Upload-Filter” – Die Urheberrechts-Reform der Europäischen Union, Brüssel, pp. 1–25
  • Becker, M. (2019). Von der Freiheit, rechtswidrig handeln zu können, Zeitschrift für Urheber- und Medienrecht – ZUM, 64 (8/9), pp. 636–648
  • Busch, C. (2019). Mehr Fairness und Transparenz in der Plattformökonomie? Die neue P2B-Verordnung im Überblick, Gewerblicher Rechtsschutz und Urheberrecht – GRUR, 8, pp. 788–797
  • Daum, F. (2019). Verantwortlichkeit von Online-Portalen nach Art 17 DSM-RL (Teil II), Medien und Recht – Zeitschrift für Medien- und Kommunikationsrecht, 6, pp. 283–291
  • Dreier, T. (2019). Die Schlacht ist geschlagen – ein Überblick. Zum Ergebnis des Copyright Package der EU-Kommission, Gewerblicher Rechtsschutz und Urheberrecht – GRUR, 8, pp. 771–779
  • Geiger, C., & Jütte, B.J. (2021). Platform liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match, Gewerblicher Rechtsschutz und Urheberrecht, Internationaler Teil – GRUR Int. 70 (6), pp. 517–544
  • Gielen, N., & Tiessen, M. (2019). Die neue Plattformhaftung nach der Richtlinie über das Urheberrecht im digitalen Binnenmarkt, Europäische Zeitschrift für Wirtschaftsrecht – EuZW, 15, pp. 639–647
  • Hofmann, F. (2019). Fünfzehn Thesen zur Plattformhaftung nach Art. 17 DSM-RL, Gewerblicher Rechtsschutz und Urheberrecht – GRUR, 12, pp. 1219–1229
  • Hofmann, F., & Specht-Riemenschneider, L. (2021). Verantwortung von Online-Plattformen–Ein Plädoyer für ein funktionszentriertes Verkehrspflichtenkonzept. Zeitschrift für Geistiges Eigentum/Intellectual Property Journal, 13 (1), pp. 48–113
  • Ohly, A. (2015). Die Verantwortlichkeit von Intermediären, Zeitschrift für Urheber- und Medienrecht – ZUM, 59 (4), pp. 308–319
  • Quintais, J.P., & Angelopoulos, C. (2022). YouTube and Cyando. Joined Cases C-682/18 and C-683/18, Auteursrecht, pp. 46–51
  • Raue, B. (2018). Meinungsfreiheit in sozialen Netzwerken, Juristenzeitung, 73 (20), pp. 961–970
  • Specht, L. (2017). Ausgestaltung der Verantwortlichkeit von Plattformbetreibern zwischen Vollharmonisierung und nationalem Recht, Zeitschrift für Urheber- und Medienrecht – ZUM, 2, pp. 114–123
  • Specht-Riemenschneider, L. (2020). Leitlinien zur nationalen Umsetzung des Art. 17 DSM-RL aus Verbrauchersicht, pp. 1–133. Retrieved January 3, 2021, from https://www.vzbv.de/sites/default/files/downloads/2020/06/23/2020-06-12-specht-final-art_17.pdf
  • Wagner, G. (2020). Haftung von Plattformen für Rechtsverletzungen, Gewerblicher Rechtsschutz und Urheberrecht – GRUR, 5, pp. 447–457
  • Wandtke, A. (2019). Grundsätze der Richtlinie über das Urheberrecht im digitalen Binnenmarkt, Neue Juristische Wochenschrift – NJW, 26, pp. 1841–1847
  • Wandtke, A., & Hauck, R. (2019). Art. 17 DSM-Richtlinie – Ein neues Haftungssystem im Urheberrecht, Zeitschrift für Urheber- und Medienrecht – ZUM, 8-9, pp. 627–636