Coming into Force, not Coming into Effect?

The Impact of the German Implementation of Art. 17 CDSM Directive on Selected Online Platforms

Authors

  • Jasmin Brieske Goethe-University Frankfurt am Main
  • Alexander Peukert Goethe-University Frankfurt am Main

1 Introduction

The question of whether and under which conditions operators of user-generated content (UGC) sharing platforms such as YouTube are liable for copyright-infringing content uploaded by their users has been a classic problem of digital copyright for some 20 years.1 Under the E-Commerce and InfoSoc Directives of 2000 and 2001,2 platform operators were generally not directly liable for making available infringing UGC unless they actively contributed to these infringements.3 With art 17 CDSMD,4 the EU legislator turned this concept on its head (or, depending on your perspective, on its feet): To adapt the existing Union copyright framework to rapid technological developments and new business models while maintaining a high level of protection of copyright and related rights,5 online content-sharing service providers (OCSSPs) now perform an act of communication to the public or of making available to the public when they give the public access to copyright-protected works or other protected subject matter uploaded by their users, unless they demonstrate that they have made best efforts to obtain authorization, to filter content for which rightholders have provided information, and, at the very least, to take down content for which they have received a notice.6

By imposing comprehensive ex ante obligations on service providers, in particular the obligation to block unlawful content before it becomes available online, art 17 CDSMD forms part of a new generation of platform regulation that replaces the first generation of simple ex post takedown obligations. The Republic of Poland asked the CJEU to annul the provision in part or in its entirety, arguing that arts 17(4)(b) and (c) CDSMD require OCSSPs to conduct preventive monitoring of all the content that their users wish to upload.7 In its judgment of April 26, 2022, the Grand Chamber of the CJEU agreed that art 17(4) entails a limitation on the exercise of the right to freedom of expression and information of users of content-sharing services,8 but it held that this limitation is, in light of the numerous safeguards built into the provision, justified.9

On May 31, 2021, and thus long before the CJEU’s decision was published, the German parliament passed an Act on the Copyright Liability of Online Content Sharing Service Providers (OCSSP Act),10 which entered into force on August 1, 2021.11 With its unique system, in which ex ante duties to block unlawful content are inseparably intertwined with ex ante duties to avoid the unavailability of lawful user content, the OCSSP Act contains provisions that could pass as sufficient safeguard mechanisms within the meaning of the CJEU’s decision.

Article 17 CDSMD, its validity in view of fundamental rights, and the options for its implementation by EU Member States have been the subject of numerous legal studies.12 The same is true for the German OCSSP Act, particularly regarding analyses of the dangers and possibilities accompanying an increased use of automated content recognition tools.13

In addition, a growing body of research empirically studies the effects of public and private online platform rules. Most of these studies analyze, however, only the first generation of platform liability (i.e., the notice and takedown system with its ex post duties to safeguard user rights) and its compliance with, for example, the fair use doctrine of U.S. copyright law.14 Erickson and Kretschmer compiled and classified such studies according to their main focus of research.15 The authors identified key sub-fields of empirical research, including the functioning of takedown processes, the potential for over-enforcement or abuse, and the enforcement costs.16

Another important voice on the topic is Niva Elkin-Koren, who, together with Sharon Bar-Ziv, studied the interplay between notice and takedown proceedings and the freedom of expression of users by analyzing a large number of copyright removal requests addressed to Google Search.17 The issue of over-blocking was also at the heart of a study by Marc Liesching and co-authors on the practical effects of the original version of the German Network Enforcement Act, which required social networks to take prompt action against certain criminal speech of their users.18 Another paper compared platforms’ terms of use against hate speech with existing legal standards.19

In the area of copyright content moderation, two papers map out and discuss the rules and procedures of several mainstream, alternative and specialized platforms, including YouTube, Facebook, and Twitter, and how these rules changed over time.20 Our paper complements this state of legal and empirical research by using the enactment of the German OCSSP Act on August 1, 2021, as a natural experiment to test whether and how far the newly established second-generation copyright approach on platform liability impacted the terms and conditions of eight platforms (i.e., YouTube, Rumble, TikTok, Twitter, Facebook, Instagram, SoundCloud, and Pinterest). We wanted to find out whether the copyright-related terms and conditions of these services changed, and if so, on which platforms and to what extent. To our knowledge, no other paper has studied whether and how platforms put the novel, comprehensive approach of the German transposition into effect.

2 The law

2.1 Article 17 CDSMD: The new liability regime for OCSSPs

Before the enactment of the German OCSSP Act, German courts consistently held sharing platform operators to be only indirectly liable according to the principles of “disturber liability” (Störerhaftung). Under this regime, copyright holders who detected infringing uploads had to notify the platform operator of a clear infringement. This notice triggered duties for the platform operator, namely to act expeditiously to delete the content in question or block access to it (takedown) and to ensure that such infringements did not recur (staydown). The takedown and staydown duties could be enforced with a court injunction. Damage claims against generally neutral service providers, in contrast, failed because these actors neither communicated protected content to the public themselves nor had knowledge of concrete infringements by their users.21

It was only in 2018 that the German Federal Court of Justice referred a number of questions concerning the compatibility of this liability regime with the E-Commerce and InfoSoc Directives to the CJEU for a preliminary ruling.22 In its judgment of June 22, 2021, in YouTube and Cyando, the Grand Chamber of the CJEU held that the operator of a video-sharing platform or a file-hosting and -sharing platform, on which users can illegally make protected content available to the public, generally does not make a “communication to the public” of that content under art 3 InfoSoc Directive unless it contributes, beyond merely making the platform available, to giving the public access to such content in breach of copyright.23 Such an active contribution is present where (1) the operator has specific knowledge that protected content is available illegally on its platform and refrains from expeditiously deleting it or blocking access to it; (2) the operator, even though it knows or ought to know, in a general sense, that users of its platform are making protected content available to the public illegally via its platform, refrains from putting in place the appropriate technological measures that can be expected from a reasonably diligent operator in its situation to counter credibly and effectively copyright infringements on that platform; or (3) the operator participates in selecting protected content illegally communicated to the public, provides tools on its platform specifically intended for the illegal sharing of such content, or knowingly promotes such sharing, which may be attested by the fact that the operator has adopted a financial model that encourages users of its platform illegally to communicate protected content to the public via that platform.24 The Grand Chamber of the CJEU also confirmed that the German Störerhaftung is, in principle, compatible with EU law.25

In parallel to these developments in the courts, the EU legislator worked on and eventually enacted as part of the CDSMD a “new liability regime” for OCSSPs in cases where they make UGC publicly available online.26 An OCSSP is defined in art 2(6) CDSMD as a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected content uploaded by its users, which it organizes and promotes for profit-making purposes.

According to art 17(1) CDSMD, an OCSSP performs an act of communication to the public or an act of making available to the public when it gives the public access to copyright-protected works or other protected subject matter uploaded by users. Operators of those services must therefore obtain authorization from the rightholders under art 17(1). Absent authorization, OCSSPs are directly liable for illegal UGC unless they demonstrate that they have complied with three cumulative requirements set out in art 17(4) CDSMD: they must make best efforts to obtain authorization (para a), make best efforts to ensure the unavailability of content for which rightholders have provided the relevant and necessary information (para b), and, in any event, take down notified content and make best efforts to ensure that it stays down (para c).
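The cumulative logic of these requirements can be illustrated in code. The following minimal sketch (Python; all names are our own hypothetical shorthand, not an actual compliance system) shows that an OCSSP escapes direct liability only if all three conditions of art 17(4) CDSMD hold:

```python
from dataclasses import dataclass

@dataclass
class Art17Efforts:
    """Hypothetical record of an OCSSP's efforts under art 17(4) CDSMD."""
    best_efforts_authorization: bool   # para (a): sought authorization from rightholders
    best_efforts_unavailability: bool  # para (b): filtered content indexed by rightholders
    notice_and_staydown: bool          # para (c): took down and kept down notified content

def exempt_from_direct_liability(e: Art17Efforts) -> bool:
    # The three requirements are cumulative: failing any one of them
    # leaves the OCSSP directly liable for infringing UGC.
    return (e.best_efforts_authorization
            and e.best_efforts_unavailability
            and e.notice_and_staydown)
```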

If the service provider fails to act accordingly, it is no longer only indirectly liable as a Störer (interferer) when it takes notice of a copyright infringement on its platform. Instead, the OCSSP performs an infringing act, which triggers full civil liability, including damages. This already complex regime is further specified and supplemented in subsections 5–9 of art 17 CDSMD. According to these provisions,

  • the cooperation between OCSSPs and rightholders “shall not result in the prevention of the availability” of lawful uploads, including where protected content is covered by an exception or limitation (sub-s 7);
  • OCSSPs must put in place an effective and expeditious complaint and redress mechanism available to their users in the event of disputes over the blocking of uploaded content (sub-s 9 subpara 1); and
  • OCSSPs shall inform their users in their terms and conditions that they can use works and other subject matter under exceptions or limitations to copyright and related rights provided for in Union law (sub-s 9 subpara 4).

Article 17 CDSMD eventually entered into force on June 6, 2019.27 Before that date, on May 24, 2019, the Republic of Poland filed an action under art 263 Treaty on the Functioning of the European Union (TFEU), asking the CJEU to annul arts 17(4)(b) and (c) CDSMD, and, in the alternative, should the court consider that those provisions cannot be severed from the other provisions of art 17 CDSMD without altering the substance thereof, to annul art 17 in its entirety.28 Poland’s key argument was that art 17 effectively forces OCSSPs to implement automatic filtering tools and that such a preventive review constitutes a particularly serious and disproportionate interference with the right to freedom of expression and information of OCSSP users.29

On April 26, 2022, the Grand Chamber dismissed the annulment action. It agreed that art 17 CDSMD establishes a de facto obligation of OCSSPs to use automatic recognition and filtering tools, which constitutes a limitation of freedom of expression under art 11 of the Charter attributable to the EU legislature.30 Nevertheless, the Court held that this obligation is accompanied by appropriate safeguards ensuring respect for the users’ right to freedom of expression and information and a fair balance between that right, on the one hand, and the right to intellectual property, protected by art 17(2) of the Charter, on the other.31 Only some passages of the decision interpret segments of art 17 CDSMD. The CJEU states, for example, that OCSSPs cannot be required to filter out content whose unlawfulness would require an “independent assessment.”32 Hence, the Court recognizes that in some cases, unauthorized content can only be taken down ex post upon a specific notification by rightholders.33 Beyond that, the CJEU expressly leaves it to the Member States to transpose art 17 CDSMD in a way that strikes a fair balance between the fundamental rights affected by the Directive.34 The implied leeway for national legislators creates the risk of significantly different implementations of art 17 CDSMD across EU Member States.

Irrespective of this still dynamic legal development surrounding art 17 CDSMD, a mainstream view has emerged on how to put into practice a fair balance between copyright and freedom of expression and information on OCSSP platforms under conditions of algorithmic content moderation. As the European Commission’s guidance on art 17 CDSMD,35 the Opinion of Advocate General Saugmandsgaard Øe,36 and not least the CJEU judgment in the Polish annulment action reveal, there is now a relatively solid consensus that:

  • Articles 17(4)(b) and (c) CDSMD effectively force OCSSPs, in many cases, to put in place automatic content recognition and moderation tools to prevent the communication to the public of content for which rightholders have provided “relevant and necessary” information (indexed content);37
  • no currently available technology can assess, to the standard required by law, whether UGC is infringing or lawful;38 and
  • for this reason, there is a real risk of false positives, also known as over-blocking.39

Under these conditions, art 17(7) CDSMD and its interpretation as a whole in light of freedom of expression suggest, according to the Commission, AG Saugmandsgaard Øe, and the CJEU judgment, that:

  • an ex post complaint and redress mechanism (art 17(9) CDSMD) that restores blocked but legitimate content is insufficient;40
  • upload filtering must be limited to “manifestly” infringing UGC, which does not require an “independent assessment” of the content, whereas “ambiguous” content must not be subject to preventive blocking measures;41
  • criteria to distinguish between manifest infringements and ambiguous cases include the length/size of the UGC, the proportion of the content identified as matching indexed files in relation to the entire upload, and the level of modification of the work.42

The Commission and the Advocate General mention several concrete tools to put these basic approaches into practice. Users should be able to pre-flag content to trigger a manual check,43 and, according to the Commission, rightholders should be empowered to block “earmarked content” that is particularly time-sensitive.44 Finally, mechanisms to mitigate the risks of misuse of such procedures should be put in place.45

2.2 The German OCSSP Act

EU Member States were required to bring into force the provisions necessary to comply with the CDSMD by June 7, 2021.46 Germany was among the few Member States to meet the transposition deadline,47 except for the OCSSP Act transposing art 17 CDSMD, which entered into force slightly later, on August 1, 2021.48 The German OCSSP Act deserves particular attention because of its original and elaborate approach to avoiding disproportionate blocking (“over-blocking”) by automated upload filters. The solutions adopted by the German parliament on May 31, 2021, anticipated much of the debate at the EU level and in other Member States. The Act acknowledges the necessity, but also the limits and thus the dangers, of filtering technologies.49 In response, it introduces a new category of “uses presumably authorized by law” – namely, uses permissible under any statutory limitation to copyright (ss 44a–63a German Copyright Act) – that an OCSSP must, in principle, communicate to the public (s 9(1)). According to s 9(2), this rebuttable presumption of lawfulness concerns UGC that:

1) contains less than half of one or several other works or entire images,

2) combines this third-party content with other content, and

3) uses the works of third parties only to a minor extent (s 10) or, in the alternative, is flagged by the user as legally authorized (s 11).

“Minor uses” are defined in s 10 as uses that do not serve commercial purposes or serve only to generate insignificant income and that concern up to 15 seconds of a cinematographic work or moving picture, up to 15 seconds of an audio track, up to 160 characters of a text, or up to 125 kilobytes of a photographic work, photograph, or graphic (e.g., memes). The flagging option comes into play where UGC exceeds these limits but still possibly qualifies for a limitation or exception because it combines images or less than half of indexed content with other non-indexed content into, for example, a remix or mashup (ss 9(2) nos 1 and 2). OCSSPs must implement the flagging option during the upload of new content. If the upload matches an indexed reference file submitted by a rightholder and would thus be blocked from being communicated to the public, the OCSSP must inform the user and enable them to flag the use as authorized by law under any statutory limitation. Both minor and pre-flagged UGC trump upload filters and thus go online.
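To make these thresholds concrete, the following minimal sketch (Python; function and parameter names are our own hypothetical shorthand) classifies an upload under our reading of ss 9–11 OCSSP Act:

```python
def is_minor_use(kind: str, seconds: float = 0.0, chars: int = 0,
                 size_kb: float = 0.0, commercial: bool = False) -> bool:
    """s 10 OCSSP Act: non-commercial uses (or uses generating only
    insignificant income, simplified here) below fixed quantitative limits."""
    if commercial:
        return False
    limits = {
        "video": seconds <= 15,   # cinematographic work or moving picture
        "audio": seconds <= 15,   # audio track
        "text": chars <= 160,     # characters of a text
        "image": size_kb <= 125,  # photographic work, photograph, or graphic
    }
    return limits.get(kind, False)

def presumably_authorized(less_than_half_or_entire_image: bool,
                          combined_with_other_content: bool,
                          minor: bool, user_flagged: bool) -> bool:
    """s 9(2) OCSSP Act: rebuttable presumption that the use is authorized by law."""
    return (less_than_half_or_entire_image
            and combined_with_other_content
            and (minor or user_flagged))
```

For instance, a noncommercial video that embeds a 14-second excerpt of an indexed song in the user’s own footage satisfies both checks and would go online despite matching a reference file.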

At the same time, the Act obliges OCSSPs to inform rightholders about the availability on their sites of minor or pre-flagged UGC containing parts of their indexed works (s 9(3)). A rightholder may then initiate the regular internal complaints procedure, which might lead, within one week, to an ex post takedown and the future blocking of the upload in dispute (ss 14(1)–(3)). In addition, OCSSPs must provide “trustworthy” rightholders with a “red button” procedure aimed at premium content, such as movie blockbusters. If, following a review by a natural person, such a rightholder declares that a certain minor or pre-flagged use substantially impairs the economic exploitation of their work, the OCSSP is obliged to block the uploaded content immediately until the conclusion of the complaints procedure (s 14(4)).50

In sum, the German OCSSP Act establishes a complex legal framework that aims to translate the traditional legal structure of exclusive rights, limitations, exceptions, and remedies into a digital realm where most decisions are taken and enforced automatically by content recognition technology that produces false negatives and false positives.51 To balance exclusivity and access under these rough algorithmic conditions, the Act distinguishes between the following situations:

  • Presumably illegal UGC that must be blocked: Uploads containing only indexed content or combining non-indexed content with half or more of an indexed work.52
  • Presumably legal UGC that must go online: Uploads combining minor parts of indexed works with non-indexed content.
  • Gray-area UGC: Uploads combining non-indexed content with up to 50% of an indexed work will be blocked unless the user flags them as lawful (green flag).
  • Red-button UGC: Minor or pre-flagged combinations of indexed and non-indexed content are to be blocked immediately if a trustworthy rightholder presses the red button.

In this concept, machines take most decisions seamlessly (only indexed or combined content: yes/no; minor use: yes/no; more or less than 50% of indexed content). In individual cases, humans may correct the algorithmic outcome by pre-flagging or a red button declaration. OCSSPs must sanction misuse of these options by excluding the users or rightholders concerned from the respective procedures for an appropriate period (ss 18(1), (3) and (5)).
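Putting the four categories together, the Act’s moderation flow can be summarized in a short decision sketch (Python; a simplified illustration under the assumptions just listed, not a normative restatement of the statute):

```python
from enum import Enum

class Decision(Enum):
    BLOCK = "block (presumably illegal)"
    PUBLISH = "publish (presumably legal)"
    BLOCK_UNLESS_FLAGGED = "gray area: block unless the user pre-flags"

def moderate(matches_index: bool, share_of_indexed_work: float,
             combined_with_other_content: bool, minor_use: bool,
             user_flagged: bool, red_button: bool) -> Decision:
    """Simplified decision flow under ss 7-14 OCSSP Act (our reading;
    the image exception of s 9(2) 2nd sentence is omitted)."""
    if not matches_index:
        return Decision.PUBLISH       # nothing indexed, nothing to filter
    if not combined_with_other_content or share_of_indexed_work >= 0.5:
        return Decision.BLOCK         # only indexed content or >= 50% of a work
    if minor_use or user_flagged:
        # presumably authorized, unless a trustworthy rightholder
        # presses the red button (s 14(4))
        return Decision.BLOCK if red_button else Decision.PUBLISH
    return Decision.BLOCK_UNLESS_FLAGGED
```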

2.3 Effects of the German OCSSP Act: Data and methodology

In the end, however, the proof is in the pudding. As also recognized by the CJEU, the effectiveness of the Directive and its transposition largely depends on the OCSSPs. To find the right balance between the conflicting fundamental rights, the service providers must have leeway to determine which specific measures they take to achieve the result sought. Accordingly, OCSSPs can choose to put in place the measures best adapted to the resources and abilities available to them and compatible with the other obligations and challenges they will encounter in the exercise of their activity.53 At the same time, this room to maneuver does not dispense OCSSPs from respecting the strict requirements of art 17 CDSMD.

Against this background, we wanted to find out whether the publicly accessible copyright policies on the German-language websites54 of eight services changed upon the coming into force of the German OCSSP Act on August 1, 2021. These services are YouTube, Rumble (a smaller platform with similar functionality), TikTok, Twitter, Facebook, Instagram, SoundCloud, and Pinterest. The selection of these eight services is based on the premise that all are arguably covered by the OCSSP Act because they store and give the public access to a large amount of copyright-protected content uploaded by their users, which they organize and promote for profit-making purposes.55 At the same time, they differ in size, content focus (video, audio, picture, text), and general functionality.56

We copied and saved 514 documents containing terms and conditions, general community and copyright guidelines, complaint forms, FAQs, and other relevant copyright help pages.57 Furthermore, the process of uploading content was documented through screenshots of websites containing public statements of OCSSPs on their platforms explaining their mode of operation and specific functionalities to users.58

The data collection was conducted four times in 2021: July 20–30, August 1, August 2–20, and November 16–22. Of the 514 documents, 163 were dated before August 1, 101 from August 1, 92 from shortly after the enactment, and 158 from November. The Internet Archive’s Wayback Machine was used to access the past status of websites where a page had initially been missed. A list of the source documents for each service and time of data collection is annexed. The data are available upon request.
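For retroactive retrieval, the Internet Archive offers a public availability API that returns the capture closest to a given date. The sketch below (Python with the requests library; the example URL and date are illustrative) shows the kind of lookup involved:

```python
import requests

def closest_snapshot(url: str, timestamp: str) -> str | None:
    """Return the Wayback Machine capture of `url` closest to `timestamp` (YYYYMMDD)."""
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url, "timestamp": timestamp}, timeout=30)
    resp.raise_for_status()
    snap = resp.json().get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None

# Example: the state of YouTube's German terms page around August 1, 2021.
print(closest_snapshot("https://www.youtube.com/t/terms", "20210801"))
```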

For each service and each point in time, the source data were analyzed as to whether they implemented the following six mandatory duties of OCSSPs under the German Act:

1) Ensure “qualified blocking” (upload filters), s 7(1).59

2) Inform users about all statutory limitations and exceptions under German law, ss 5(1) and (3).60

3) Enable pre-flagging of lawful content, s 11(1) no 3.61

4) Make an internal complaints procedure available, ss 14(1)–(3), (5).62

5) Implement a red button solution for trustworthy rightholders, s 14(4).63

6) Exclude rightholders from the automated blocking and red button procedures and users from the pre-flagging option in cases of misuse, ss 18(1), 18(3) no 1 and (5).64
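For transparency, the coding scheme underlying this analysis can be expressed as a simple data structure. The sketch below (Python; the duty labels and the three-level scale mirror the table in section 2.4, while the field names are our own illustrative shorthand) records one observation per service, duty, and collection round:

```python
from dataclasses import dataclass
from enum import Enum

class Compliance(Enum):
    FULL = "green"
    PARTIAL = "yellow"
    NONE = "red"

DUTIES = ["qualified_blocking", "info_limitations", "pre_flagging",
          "complaints_procedure", "red_button", "misuse_measures"]

@dataclass
class Observation:
    service: str        # e.g., "YouTube"
    round_label: str    # "Jul 20-30", "Aug 1", "Aug 2-20", or "Nov 16-22"
    duty: str           # one of DUTIES
    level: Compliance   # coded from terms, guidelines, and help pages
    source: str         # saved document on which the coding is based
```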

2.4 Findings

Our data collection revealed only a few changes in the terms and conditions of platforms over time but significant differences between the services.

Most changes concerned the most contentious and possibly most important feature of art 17 CDSMD and the German OCSSP Act: the duty of OCSSPs to implement preventive upload filters “as soon as the rightholder so requests and provides the information required for such purpose”, s 7(1) (“qualified blocking,” mandatory duty no (1)). Except for Twitter, all platforms mentioned the option to submit copyrighted materials for subsequent automated copyright content moderation before August 1, 2021. TikTok appeared to be in the process of establishing upload filters during the period of study: its website mentioned such an option and offered a link to a corresponding form, but the form was inaccessible; at best, an error message appeared when we tried to access it on August 19. The form was accessible on November 21, 2021. According to the website, TikTok enables rightholders to request that their copyright-protected works not be made available on TikTok in the EU, in accordance with art 17 CDSMD.

During the final round of data collection in November, we also observed changes on the copyright help pages of YouTube, Facebook, and Instagram. YouTube expanded the Copyright Match Tool, which had already been available in July 2021, to every user of the platform. In contrast, Content ID has been and remains restricted to rightholders and copyright management providers submitting numerous takedown notices.65 With the expansion of the Copyright Match Tool, YouTube increased its compliance with the requirements of s 7(1) German OCSSP Act. Facebook and Instagram introduced Brand Rights Protection, formerly known as the Commerce & Ads IP Tool, as further protection for IP rights in addition to the Rights Manager.

Regarding the duty of services to inform users about lawful uses (2), we did not observe any relevant changes. Facebook, Instagram, TikTok, and SoundCloud listed some specific exceptions and limitations – namely quotations, criticism, reviews, caricatures, parodies, and pastiches – either in their terms of service or on additional help pages. TikTok provided additional explanations on these uses and cautioned that Member States could provide further exceptions. YouTube referred broadly, and only in parentheses, to exceptions and limitations under EU law without further specification. None of the services referred to the chapter of the German Copyright Act on exceptions and limitations (ss 44a–63a).

The possibility of flagging content as legally authorized before upload (3) was not clearly laid out on any website at any point in time. According to YouTube’s help page, uploaders are immediately informed about the results of a pre-filtering process. In the event of a match with copyright-relevant material, the user can assert the legality of the content or edit it, with the option of immediately filing a complaint if a copyright claim was raised during the preliminary check.66 The extent to which such an assertion by the user ensures an immediate upload could not be evaluated. Facebook pointed out that there is an option to confirm the authorized use but, according to Facebook’s guidelines, only after the content has been removed.67

Internal complaints procedures (4) were available on all services before the OCSSP Act came into force. The only changes in this context were observed between the third and fourth rounds of data collection: Twitter’s Copyright Policy moved to submitting counter-notifications via a form rather than a separate email, and Facebook established its Transparency Center. These observations are subject to the caveat that we did not run test uploads and therefore could not evaluate how effectively and expeditiously the procedures are carried out.

No public statements explaining the red button option for trustworthy rightholders were observed on any platform at any point in time (5). Only Facebook vaguely indicated that the more tools in the FB Rights Manager are used, the more options are unlocked. The absence of such statements may be because the respective options are explained only after signing up for the copyright protection programs.

Regarding sanctions for the misuse of copyright procedures (6), most services referred to the liability for misrepresenting copyright infringements under s 512(f) of the U.S. Digital Millennium Copyright Act (DMCA). Only YouTube and Facebook made general statements regarding the exclusion from YouTube’s Content ID or Facebook’s Rights Manager. However, according to the wording (“can”), these are voluntary measures at the provider’s discretion, and it is questionable whether this suffices to comply with the mandatory misuse sanctions under the OCSSP Act. Instagram merely announced that misleading or fraudulent reporting of copyright or trademark infringement could lead to action on the part of the platform. We did not observe any rules about excluding users who misuse the pre-flagging option, which is consistent with our finding that no such pre-flagging option existed in the first place.

Overall, the only relevant changes across the four rounds of data collection concerned the availability of preventive, automated copyright moderation tools on YouTube, TikTok, Facebook, and Instagram (“qualified blocking,” also known as “upload filters,” s 7(1) German OCSSP Act) and minor adjustments to the existing complaints procedures on Twitter and Facebook. In addition, and beyond the scope of our study, we noticed that Facebook and Instagram changed their terms of service by August 1, 2021, to announce a person authorized to receive service in Germany for the purposes of the OCSSP Act (s 20).68 This is the only context in which the German OCSSP Act was explicitly referenced on any website studied.

The following table summarizes the levels of compliance and relevant changes observed based on public statements of the service providers covered. The green, yellow, and red dots represent full, partial, or no compliance, respectively. Regarding the duty of qualified blocking in accordance with s 7(1) German OCSSP Act, the table visualizes the implementation or expansion of the existing automatic filter systems.

[Table: compliance overview per service. Columns: “Qualified blocking” (upload filters); Information about all limitations and exceptions; Pre-flagging option; Internal complaints procedure; Red button solution; Misuse measures. Rows: YouTube, Rumble, Twitter, Facebook, Instagram, TikTok, SoundCloud, Pinterest. The green/yellow/red dots are not reproduced here.]

2.5 Summary and discussion

Our study of public statements on the websites of eight service providers that arguably are subject to the German OCSSP Act provides answers to both research questions: the evolution of terms and conditions (1) over time and (2) as compared between different services.

First, the Act’s entry into force on August 1, 2021, did not immediately result in any changes or additions to the terms and conditions. Insofar as we were able to observe compliance, it was established before August 1, 2021, and thus possibly anticipated by service providers.69 Only during the last round of data collection in mid-November 2021 did we observe relevant changes. These concerned preventive, automated copyright moderation tools (YouTube, TikTok, Facebook, Instagram) and the complaint mechanism (Twitter, Facebook). However, by November 22, no service had fully complied with the six statutory obligations studied.

Second, the level of compliance of the eight services covered, both immediately before and after the enactment, varies according to the duties and services in question. According to the analyzed documents, internal complaints procedures are in place on all platforms, and upload filters on most of them. About half of the services provide some information about limitations and exceptions. Misuse sanctions were only observed on YouTube, Facebook, and Instagram. Compliance with the two more recent legal-technological procedures concerning the handling of upload filters (i.e., the pre-flagging and red button options) could not be clearly established for any service. The compliance score ranking for November 2021, based on the terms and conditions and other public statements of the OCSSPs, is as follows:

1) Facebook: 2 green, 4 yellow, no red

2) YouTube: 2 green, 3 yellow, 1 red

3) Instagram: 2 green, 2 yellow, 2 red

4) TikTok: 2 green, 1 yellow, 3 red

5) SoundCloud: 1 green, 2 yellow, 3 red

6) Pinterest/Rumble: 1 green, 1 yellow, 4 red

7) Twitter: 1 green, no yellow, 5 red
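This ordering follows directly from the dot counts: since each platform was coded on six duties, sorting by green and then yellow dots reproduces the ranking, as the following sketch (Python; data taken from the list above) shows:

```python
# November 2021 dot counts per platform: (green, yellow, red), summing to six duties.
scores = {
    "Facebook": (2, 4, 0), "YouTube": (2, 3, 1), "Instagram": (2, 2, 2),
    "TikTok": (2, 1, 3), "SoundCloud": (1, 2, 3), "Pinterest": (1, 1, 4),
    "Rumble": (1, 1, 4), "Twitter": (1, 0, 5),
}

# Rank by green dots first, then yellow dots (more of each is better).
ranking = sorted(scores, key=lambda p: (scores[p][0], scores[p][1]), reverse=True)
print(ranking)
# ['Facebook', 'YouTube', 'Instagram', 'TikTok', 'SoundCloud', 'Pinterest', 'Rumble', 'Twitter']
```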

These findings are subject to important limitations. First, data collection occurred only at four points in time between July 20 and November 22, 2021. Our results thus present a snapshot of platform terms and conditions just before and after the enactment of the German OCSSP Act rather than a long-term study of the evolution of these private rules.70

Second, we neither tested the copyright procedures by uploading content or filing complaints nor contacted the service providers for further explanation of their copyright policies.71 It might be that content moderation practices function differently than what has been publicly announced or that functionalities are available that have not been publicly communicated. This could be the case, particularly for the red button solution. In other words, our study is limited to what is happening on the surface of terms and conditions and other public statements. These documents should, however, not least from a legal perspective, provide a transparent and accurate description of the functioning of the services.72 Without respective information about, for example, an available red button or pre-flagging option, rightholders and users of the services will be unable to act accordingly and make full use of the law. Consequently, de facto compliance with the OCSSP Act on a technical level would appear to be an insufficient implementation of the law.

With these caveats in mind, our study provides several interesting insights into the copyright practice of eight UGC platforms after the new and special liability regime for OCSSPs under EU law came into force in Germany. First, large platforms with clear exposure to copyright infringements (YouTube, Facebook/Instagram) display higher compliance scores than more recently established (TikTok) and comparatively smaller content-sharing platforms (Rumble, Pinterest, SoundCloud). One reason might be that some providers consider themselves to fall within the start-up or small service provider categories that are exempted from automatic filtering and corresponding duties.73 More plausibly, however, our results reflect the greater financial and technological ability of Big Tech companies to implement new regulatory duties. If that is true, the study confirms the thesis that platform regulation might entrench the market dominance of Big Tech. Twitter, finally, might score low because copyright infringement is not its prime moderation challenge, because it considers itself not covered by art 17 CDSMD on the ground that it does not “compete with online content services for the same target groups,” or both.74

Second, regarding the low (if not zero) immediate impact of the German OCSSP Act on the platforms’ copyright policies, our study highlights the challenge of putting into effect second-generation platform regulations, which establish both ex ante blocking and ex ante must-carry obligations. It furthermore exemplifies the weaknesses and difficulties of legal harmonization via directives in general and via art 17 CDSMD in particular. That provision leaves it to the Member States and the Commission (via art 17(10) CDSMD) to determine the detailed rules for establishing an adequate balance between exclusivity and access on sharing platforms.75 The Commission published its Guidance only three days before the end of the transposition period, and most Member States had not yet concluded their legislative procedures at that point.76

Last, at the time of our study, the Polish annulment action still hung like a sword of Damocles over art 17 CDSMD.77 Under these uncertain conditions, it is not surprising that providers of global online services with a place of establishment outside the EU adopted a wait-and-see attitude pending a solid, EU-wide consensus on how to put art 17 CDSMD into platform practice. Rather than splitting their services into 27 versions compliant with each national OCSSP Act, they would rather risk running afoul of, say, German law.

In that risk assessment, the sanctions for a failure to comply are key. On this enforcement level, another weakness or imbalance of both art 17 CDSMD and the German OCSSP Act comes to light. The failure to implement sufficient copyright moderation practices results in full civil liability – including the duty to pay damages. This may explain the observed changes towards preventive copyright moderation tools for all rightholders on TikTok, YouTube, Facebook, and Instagram.

In contrast, failure to protect legitimate user interests incurs limited legal consequences. Article 17 CDSMD is silent on this latter issue, and the German OCSSP Act only entitles “user associations” to claim injunctive relief against a service provider who repeatedly and wrongly blocks authorized uses.78 The whole regime regarding “uses presumably authorized by law” (i.e., minor or pre-flagged uses) is not coupled with a specific enforcement regime at all, and it is doubtful whether general tort law can fill the void.79 Thus, the much-praised German OCSSP Act might turn out to be a toothless tiger with respect to users’ interests. Legal provisions to their benefit have not yet left a mark on the terms and conditions of the services studied. The lesson taught by the deficits of the German OCSSP Act is that effective enforcement of user interests, including through public law sanctions,80 is crucial to achieving a meaningful and balanced result.81

Our study thus reveals ample room and need for further research on the legal and practical aspects of second-generation platform regulation, namely laws that require an ex ante blocking mechanism (in contrast to the conventional ex post notice and takedown system) on the one hand while proactively guaranteeing the exercise of user rights on the other. First, there is a lack of legal doctrinal research into enforcing user freedoms/rights under conditions of algorithmic content moderation. The procedures and the legal basis on which lawful speech should be enabled ex ante and, if necessary, enforced against false algorithmic blocking decisions ex post are yet to be determined. Further, it remains unclear who should be entitled to initiate the respective procedures and whether user rights can be collectivized, and, if so, how.

Second, empirical studies should test our finding that platform rules and procedures in favor of user freedoms/rights are insufficient. The question is how the respective procedures work and whether they provide an “effective and expeditious complaint and redress mechanism,”82 assuming such procedures are available at all. If they are not, that finding, too, should be documented and evaluated by future research.

Date received: January 2022

Date accepted: October 2022

Annex

Saved documents from 20 – 30 July 2021

YouTube

Nutzungsbedingungen

Regeln und Richtlinien

  • Urheberrecht
    • Überblick
    • Fair Use
    • Ansprüche erheben
    • Durchsetzung von Urheberrechten
  • Wie werden Urheberrechte auf YouTube geschützt

YouTube-Hilfe

  • Richtlinien, Sicherheit und Urheberrecht
    • Urheberrecht und Verwaltung von Rechten
  • Melden von Inhalten und Richtliniendurchsetzung
    • Inhalte melden
    • YouTube Trusted Flagger-Programm
  • Rechtliche Richtlinien
    • Andere rechtliche Probleme
  • Videos hochladen
  • Wie können wir dich bei der Verwaltung von Urheberrechten unterstützen?

[Complaint form:] Entfernung des Videos beantragen

[Upload:] Videos hochladen [excerpt]

Rumble

Website Terms and Conditions of Use and Agency Agreement (English)

Copyright Infringement Notification (English)

[Upload:] Upload, share and license your videos (English)

Twitter

Twitter Allgemeine Geschäftsbedingungen

Hilfe-Center

  • Twitter Regeln und Richtlinien
  • Richtlinie zum Urheberrecht
  • Allgemeine Empfehlungen und Richtlinien [excerpts]
  • Richtlinien für Strafverfolgungsbehörden
    • Häufig gestellte Anfragen zu rechtlichen Anfragen
  • Parody, newsfeed, commentary, and fan account policy (English)

[Complaint form:] Hilfe bei Fragen zu geistigem Eigentum

Facebook

Nutzungsbedingungen

  • Gemeinschaftsstandards
  • V. Wahrung des geistigen Eigentums
  • 24. Geistiges Eigentum

Hilfebereich

  • Richtlinien und Meldungen
  • Geistiges Eigentum
  • Urheberrecht

Rights Manager (English)

Facebook for Business

  • Hilfebereich für Unternehmen
  • Grundlagen
  • Übereinstimmende Videos / Übereinstimmungen
  • Anfechtungen / Anfechtungen und Konflikte
  • Referenzsammlung / Referenzdateien
  • Regeln für Übereinstimmungen
  • Eigentümer-Links
  • Insights
  • Instagram
  • Entfernungen
  • Monetarisierung

[Complaint form:] Meldeformular Urheberrechte

Instagram

Nutzungsbedingungen

Hilfebereich

  • Richtlinien und Meldungen
    • Gemeinschaftsrichtlinien
  • Geistiges Eigentum
    • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

TikTok

Endnutzer Lizenzvereinbarung und allgemeine Geschäftsbedingungen (Nutzungsbedingungen)

Community-Richtlinien

Rechtliches

  • Regeln zum geistigen Eigentum
  • Zusätzliche Bestimmungen für Nutzer mit Wohnsitz in der Bundesrepublik Deutschland

[Complaint form:] Melden unangemessener Inhalte

[Complaint form:] Counter Notification Form (English)

SoundCloud

Allgemeine Nutzungsbedingungen von SoundCloud

Community-Richtlinien

Informationen zum Urheberrecht

SoundCloud Help Center

  • Urheberrecht
  • Entfernung von Tracks [excerpts]
  • Urheberrechtsrichtlinien von SoundCloud [excerpts]

Meldung von Urheberrechtsverletzungen

[Complaint form:] Urheberrechtsverletzung melden

Pinterest

AGB

Copyright

AGB des Content-Claiming-Portals von Pinterest

Help Center

  • Recht
  • Urheberrechtsverwaltung
  • Copyright
  • Erste Schritte beim Content-Claiming-Portal

[Complaint form:] Anzeige einer Urheberrechtsverletzung

Saved documents from 1 August 2021

YouTube

Nutzungsbedingungen

Regeln und Richtlinien

  • Urheberrecht
    • Überblick
    • Fair Use
    • Ansprüche erheben
    • Durchsetzung von Urheberrechten
  • Wie werden Urheberrechte auf YouTube geschützt

YouTube-Hilfe

  • Richtlinien, Sicherheit und Urheberrecht
    • Urheberrecht und Verwaltung von Rechten
  • Melden von Inhalten und Richtliniendurchsetzung
    • Inhalte melden
    • YouTube Trusted Flagger-Programm

[Upload:] Video hochladen [excerpt]

Rumble

Website Terms and Conditions of Use and Agency Agreement (English)

Copyright Infringement Notification (English)

[Upload:] Upload, share and license your videos (English)

Twitter

Twitter Allgemeine Geschäftsbedingungen

Hilfe-Center

Twitter Regeln und Richtlinien

  • Richtlinie zum Urheberrecht
  • Allgemeine Empfehlungen und Richtlinien [excerpts]
  • Verstöße melden
  • Parody, newsfeed, commentary, and fan account policy (English)

[Complaint form:] Hilfe bei Fragen zu geistigem Eigentum

Facebook

Nutzungsbedingungen

Gemeinschaftsstandards

  • V. Wahrung des geistigen Eigentums
  • 24. Geistiges Eigentum

Hilfebereich

  • Richtlinien und Meldungen
  • Wie melde ich Etwas?
  • Geistiges Eigentum
    • Urheberrecht

Rights Manager (English)

Facebook for Business

  • Hilfebereich für Unternehmen
  • Grundlagen
    • Rightsmanager
  • Übereinstimmende Videos
    • Match Rules in Rights Manager

[Complaint form:] Meldeformular Urheberrechte

[Screenshot front page]

Instagram

Nutzungsbedingungen

Hilfebereich

  • Richtlinien und Meldungen
  • Gemeinschaftsrichtlinien
  • Wie melde ich Etwas?
  • Geistiges Eigentum
    • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

[Screenshot front page]

TikTok

Endnutzer Lizenzvereinbarung und allgemeine Geschäftsbedingungen (Nutzungsbedingungen)

Community-Richtlinien

Rechtliches

  • Regeln zum geistigen Eigentum
  • Zusätzliche Bestimmungen für Nutzer mit Wohnsitz in der Bundesrepublik Deutschland
  • Autorisierung zum Nutzen von Inhalten bei TikTok

[Complaint form:] Melden unangemessener Inhalte

[Complaint form:] Counter Notification Form (English)

[Upload:] Video hochladen

SoundCloud

Allgemeine Nutzungsbedingungen von SoundCloud

Community-Richtlinien

Informationen zum Urheberrecht

SoundCloud Help Center

  • Urheberrecht
  • Entfernung von Tracks [excerpts]
  • Urheberrechtsrichtlinien von SoundCloud
    • Wie Urheberrechtsverletzungen vermieden werden können

Meldung von Urheberrechtsverletzungen

[Complaint form:] Urheberrechtsverletzung melden

Pinterest

AGB

Copyright

AGB des Content-Claiming-Portals von Pinterest

Help Center

  • Recht
  • Urheberrechtsverwaltung
  • Copyright
  • Erste Schritte beim Content-Claiming-Portal

[Complaint form:] Anzeige einer Urheberrechtsverletzung

[Form:] Zugriff auf das Content-Claiming-Portal beantragen

Saved documents from 2 – 20 August 2021

YouTube

Nutzungsbedingungen

Regeln und Richtlinien

  • Urheberrecht
  • Überblick
  • Fair Use
  • Ansprüche erheben
  • Durchsetzung von Urheberrechten

YouTube-Hilfe

  • Richtlinien, Sicherheit und Urheberrecht
  • Urheberrecht und Verwaltung von Rechten

[Complaint form:] Entfernung des Videos beantragen

[Upload:] Video hochladen [excerpt]

Rumble

Website Terms and Conditions of Use and Agency Agreement (English)

Copyright Infringement Notification (English)

[Upload:] Upload, share and license your videos (English)

Twitter

Twitter Allgemeine Geschäftsbedingungen

Hilfe-Center

  • Twitter Regeln und Richtlinien
  • Richtlinie zum Urheberrecht
  • Allgemeine Empfehlungen und Richtlinien
    • Richtlinie zur angemessenen Nutzung

[Complaint form:] Hilfe bei Fragen zu geistigem Eigentum

[Upload:] Video hochladen

Facebook

Nutzungsbedingungen

Gemeinschaftsstandards

  • V. Wahrung des geistigen Eigentums
    • 24. Geistiges Eigentum
  • VI. Inhaltsbezogene Anfragen und Entscheidungen
    • 27. Oversight Board

Hilfebereich

  • Richtlinien und Meldungen
  • Geistiges Eigentum
  • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

[Upload]

Instagram

Nutzungsbedingungen

Hilfebereich

  • Richtlinien und Meldungen
  • Gemeinschaftsrichtlinien
  • Geistiges Eigentum
    • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

TikTok

Endnutzer Lizenzvereinbarung und allgemeine Geschäftsbedingungen (Nutzungsbedingungen)

Community-Richtlinien

Rechtliches

  • Regeln zum geistigen Eigentum
  • Autorisierung zum Nutzen von Inhalten bei TikTok

[Complaint form:] Counter Notification Form (English)

[Upload:] Video hochladen

[Form to submit copyrighted materials – error message]

SoundCloud

Allgemeine Nutzungsbedingungen von SoundCloud

Community-Richtlinien

Informationen zum Urheberrecht

SoundCloud Help Center

  • Urheberrecht
  • Entfernung von Tracks [excerpts]
  • Schutz meiner Inhalte auf SoundCloud
    • Melden einer Verletzung deiner Tracks
  • Urheberrechtsrichtlinien von SoundCloud
    • Copyright-Methoden und Benachrichtigungen

Meldung von Urheberrechtsverletzungen

[Complaint form:] Urheberrechtsverletzung melden

[Upload:] Hochladen

Pinterest

AGB

Copyright

AGB des Content-Claiming-Portals von Pinterest

Help Center

  • Recht
  • Urheberrechtsverwaltung
  • Copyright
  • Erste Schritte beim Content-Claiming-Portal

[Complaint form:] Anzeige einer Urheberrechtsverletzung

[Upload:] Pin erstellen

Saved documents from 16 – 22 November 2021

YouTube

Nutzungsbedingungen

Regeln und Richtlinien

  • Urheberrecht
  • Überblick
  • Fair Use
  • Ansprüche erheben
  • Durchsetzung von Urheberrechten

YouTube-Hilfe

  • Richtlinien, Sicherheit und Urheberrecht
  • Urheberrecht und Verwaltung von Rechten

[Complaint form:] Entfernung des Videos beantragen [excerpt]

[Upload:] Video hochladen [excerpt]

Rumble

Website Terms and Conditions of Use and Agency Agreement (English)

Copyright Infringement Notification (English)

[Upload:] Upload, share and license your videos (English)

Twitter

Twitter Allgemeine Geschäftsbedingungen

Hilfe-Center

  • Twitter Regeln und Richtlinien
  • Richtlinie zum Urheberrecht
  • Platform Use Guidelines
    • Richtlinie zur angemessenen Nutzung

[Complaint form:] Hilfe bei Fragen zu geistigem Eigentum

Facebook

Nutzungsbedingungen

Plattformnutzungsbedingungen von Facebook

Transparency Center (Meta)

Richtlinien

  • Facebook-Gemeinschaftsstandards
    • Geistiges Eigentum
  • Weitere Richtlinien
  • So wird Facebook immer besser [excerpts]

Durchsetzung

  • Ermittlung von Verstößen [excerpts]
  • Ergreifen von Maßnahmen [excerpts]

Oversight Board

Daten

  • Bericht zur Durchsetzung der Gemeinschaftsstandards (English)
  • Geistiges Eigentum (English)
  • Behördenanfragen nach Nutzerdaten (English)
  • Sperrung von Inhalten auf der Grundlage des vor Ort geltenden Rechts – Germany (English)

[Accessed via Google search:] Gemeinschaftsstandards

  • V. Wahrung des geistigen Eigentums
    • 23. Geistiges Eigentum
  • VI. Inhaltsbezogene Anfragen und Entscheidungen
    • 27. Oversight Board

Hilfebereich

  • Richtlinien und Meldungen
  • Geistiges Eigentum
  • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

[Upload]

Instagram

Nutzungsbedingungen

Hilfebereich

  • Richtlinien und Meldungen
  • Wie melde ich Etwas?
  • Antrag auf Entfernen von Inhalten aufgrund von Rechtsverstößen [error message]
  • Gemeinschaftsrichtlinien
  • Geistiges Eigentum
    • Urheberrecht

[Complaint form:] Meldeformular Urheberrechte

TikTok

Endnutzer Lizenzvereinbarung und allgemeine Geschäftsbedingungen (Nutzungsbedingungen)

Community-Richtlinien

Rechtliches

  • Regeln zum geistigen Eigentum
  • Autorisierung zum Nutzen von Inhalten bei TikTok

[Complaint form:] Counter Notification Form (English)

[Complaint form:] Report copyright infringement (English)

[Form to submit copyrighted materials:] Anfrage um das Erscheinen Ihrer urheberrechtlich geschützten Werke auf TikTok in der EU zu verhindern

SoundCloud

Allgemeine Nutzungsbedingungen von SoundCloud

Community-Richtlinien

Informationen zum Urheberrecht

SoundCloud Help Center

  • Urheberrecht
  • Entfernung von Tracks [excerpts]
  • Urheberrechtsrichtlinien von SoundCloud
    • Copyright-Methoden und Benachrichtigungen
  • Melden eines Konflikts bei der Inhaberschaft
  • Meldung bei SoundCloud
  • What about fair use or copyright exceptions? (English)

Meldung von Urheberrechtsverletzungen

[Complaint form:] Urheberrechtsverletzung melden

[Upload:] Hochladen

Pinterest

AGB

Copyright

[Accessed via Google search:] AGB des Content-Claiming-Portals von Pinterest

Help Center

  • Recht
  • Urheberrechtsverwaltung
  • Copyright
  • Erste Schritte beim Content-Claiming-Portal

[Complaint form:] Anzeige einer Urheberrechtsverletzung

[Form:] Zugriff auf das Content-Claiming Portal beantragen

[Upload:] Pin erstellen


1 See Giancarlo Frosio (ed), The Oxford Handbook of Online Intermediary Liability (OUP 2020); Martin Husovec, Injunctions against Intermediaries in the European Union: Accountable but Not Liable? (CUP 2017); Daniel Holznagel, Notice and Take-Down-Verfahren als Teil der Providerhaftung (Mohr Siebeck 2013); Irini A Stamatoudi (ed), Copyright Enforcement and the Internet (Wolters Kluwer 2010).

2 European Parliament and Council Directive (EC) 2000/31 of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market [2000] OJ L 178/1 (E-Commerce Dir); European Parliament and Council Directive (EC) 2001/29 of 22 May 2001 on the harmonisation of certain aspects of copyright and related rights in the information society [2001] OJ L 167/10 (InfoSoc Dir).

3 C-682/18 and C-683/18 YouTube and Cyando [2021] ECLI:EU:C:2021:503, paras 102, 117, 118.

4 European Parliament and Council Directive (EU) 2019/790 of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC [2019] OJ L 130/92 (CDSMD).

5 ibid Recital 3.

6 cf arts 17(1) and (4) CDSMD.

7 C-401/19 Poland / Parliament and Council [2022] ECLI:EU:C:2022:297, para 24.

8 ibid para 58.

9 ibid paras 59 et seq.

10 Available in English at www.bmj.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/UrhDaG_ENG.html accessed 4 July 2022 (all quotes taken from that unofficial translation).

11 Articles 3 and 5 2nd sentence Gesetz zur Anpassung des Urheberrechts an die Erfordernisse des digitalen Binnenmarktes vom 31.5.2021, Bundesgesetzblatt 2021 I, 1204.

12 See, eg Christophe Geiger and Bernd Justin Jütte, ‘Platform Liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match’ [2021] GRUR Int 517; European Copyright Society, ‘Comment on Selected Aspects of Implementing Article 17 of the Directive on Copyright in the Digital Single Market Into National Law’ (2020) https://europeancopyrightsocietydotorg.files.wordpress.com/2020/04/ecs-comment-article-17-cdsm.pdf accessed 25 April 2022; Louisa Specht-Riemenschneider, ‘Leitlinien zur nationalen Umsetzung des Art. 17 DSM-RL aus Verbrauchersicht’ (2020) www.vzbv.de/sites/default/files/downloads/2020/06/23/2020-06-12-specht-final-art_17.pdf accessed 25 April 2022; João Pedro Quintais and others, ‘Copyright Content Moderation in the EU: An Interdisciplinary Mapping Analysis’ (2022) https://ssrn.com/abstract=4210278 accessed 7 November 2022.

13 Franz Hofmann, ‘Das neue Urheberrechts-Diensteanbieter-Gesetz’ [2021] NJW 1905; Artur Wandtke and Ronny Hauck, ‘Verantwortlichkeit und Haftung – Das Urheberrechts-Diensteanbieter-Gesetz im Kontext des allgemeinen Urheberrechts’ [2021] ZUM 763; Marcus von Welser, ‘Plattformhaftung nach dem Urheberrechts-Diensteanbieter-Gesetz (UrhDaG)’ [2021] GRUR-Prax 463. For an outline of the provisions in English, see Matthias Leistner, ‘The Implementation of Art. 17 DSM-Directive in Germany – A Primer with Some Comparative Remarks’ (2022) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3989726 accessed 25 April 2022. See also the commentaries on the OCSSP Act, Jan Oster and Ulrike Grübler, in Hartwig Ahlberg, Horst-Peter Götting and Anne Lauber-Rönsberg (eds), BeckOK Urheberrecht (36th edn, C.H.Beck 2022); Benjamin Raue and Louisa Specht-Riemenschneider, in Thomas Dreier and Gernot Schulze (eds), Urheberrechtsgesetz (7th edn, C.H.Beck 2022); Jan Eichelberger, in Jan Eichelberger, Thomas Wirth and Fedor Seifert (eds), Urheberrechtsgesetz (4th edn, Nomos 2022).

14 cf Daniel Seng, ‘Copyrighting Copywrongs: An Empirical Analysis of Errors with Automated DMCA Takedown Notices’ (2015) https://ssrn.com/abstract=2563202 accessed 24 April 2022; Jennifer M Urban, Joe Karaganis and Brianna L Schofield, ‘Notice and Takedown in Everyday Practice’ (2017) https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2755628 accessed 25 April 2022.

15 Kris Erickson and Martin Kretschmer, ‘Empirical Approaches to Intermediary Liability’ in Giancarlo Frosio (ed), The Oxford Handbook of Online Intermediary Liability (OUP 2020) 104. Similarly, see also Daphne Keller and Paddy Leerssen, ‘Facts and Where to Find Them: Empirical Research on Internet Platforms and Content Moderation’ in Nathaniel Persily and Joshua A Tucker (eds), Social Media and Democracy. The State of the Field, Prospects for Reform (CUP 2020) 220.

16 Erickson and Kretschmer (n 15) 107.

17 Sharon Bar-Ziv and Niva Elkin-Koren, ‘Behind the Scenes of Online Copyright Enforcement: Empirical Evidence on Notice & Takedown’ (2017) 50 Conn L Rev 339.

18 Marc Liesching and others, Das NetzDG in der praktischen Anwendung. Eine Teilevaluation des Netzwerkdurchsetzungsgesetzes (Carl Grossmann 2021).

19 Paolo Cavaliere, ‘Digital Platforms and the Rise of Global Regulation of Hate Speech’ (2019) 8(2) CILJ 282.

20 Quintais and others (n 12), 185–289 (data collection ended 2020); Péter Mezei and István Harkai, ‘Self-regulating Platforms? – The Analysis of the Enforcement of End-User Rights in the Light of the Transposition of Article 17 of the CDSM Directive’ (2022) 7(1) PGAF L Rev 109.

21 BGH GRUR 2018, 1132, ECLI:DE:BGH:2018:130918BIZR140.15.0 paras 46–52.

22 ibid 1132.

23 YouTube and Cyando (n 3) para 102.

24 ibid paras 76 et seq.

25 ibid para 143.

26 Poland / Parliament and Council (n 7) para 20.

27 CDSMD, art 31.

28 Poland / Parliament and Council (n 7) para 1.

29 ibid para 24.

30 ibid paras 53 et seq.

31 ibid paras 59 et seq.

32 ibid para 90.

33 ibid para 91.

34 ibid para 99.

35 Commission, ‘Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market’ (Communication) COM (2021) 288 final.

36 C-401/19 Poland / Parliament and Council [2021] ECLI:EU:C:2021:613, Opinion of Advocate General Saugmandsgaard Øe.

37 Poland / Parliament and Council (n 7) paras 53–58; COM (2021) 288 final (n 35) 11–13; Opinion of AG Saugmandsgaard Øe (n 36) paras 57–69.

38 COM (2021) 288 final (n 35) 20; Opinion of AG Saugmandsgaard Øe (n 36) para 67.

39 Poland / Parliament and Council (n 7) para 93 (“in cases where, notwithstanding the safeguards laid down in those latter provisions, the providers of those services nonetheless erroneously or unjustifiably block lawful content”); COM (2021) 288 final (n 35) 13; Opinion of AG Saugmandsgaard Øe (n 36) paras 141–48.

40 COM (2021) 288 final (n 35) 20; Opinion of AG Saugmandsgaard Øe (n 36) paras 180 et seq; Quintais and others (n 12) 124.

41 Poland / Parliament and Council (n 7) para 90; COM (2021) 288 final (n 35) 21; Opinion of AG Saugmandsgaard Øe (n 36) paras 198 et seq, 205–06. The distinction between likely (later “manifestly”) infringing and likely legitimate uploads was first mentioned in a consultation paper published by the Commission in July 2020; see Commission, ‘Targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the Directive on Copyright in the Digital Single Market’ (2020) https://digital-strategy.ec.europa.eu/en/news/directive-copyright-digital-single-market-commission-seeks-views-participants-stakeholder-dialogue accessed 20 January 2022.

42 COM (2021) 288 final (n 35) 21; Opinion of AG Saugmandsgaard Øe (n 36) paras 202–03.

43 cf COM (2021) 288 final (n 35) 22 (feedback from users); Opinion of AG Saugmandsgaard Øe (n 36) para 211 with further references.

44 cf COM (2021) 288 final (n 35) 14, 22 (e.g., pre-released music or films or highlights of recent broadcasts of sports events); Opinion of AG Saugmandsgaard Øe (n 36) para 223 (a simple assertion of a risk of significant economic harm by rightholders does not justify preventive blocking unless the content is manifestly infringing).

45 COM (2021) 288 final (n 35) 22.

46 CDSMD, art 29(1).

47 cf CREATe, ‘Copyright in the Digital Single Market Directive – Implementation. An EU Copyright Reform Resource’ www.create.ac.uk/cdsm-implementation-resource-page/ accessed 20 January 2022; Commission, ‘Copyright: Commission calls on Member States to comply with EU rules on copyright in the Digital Single Market’ (2021) https://ec.europa.eu/commission/presscorner/detail/en/MEX_21_3902 accessed 20 January 2022.

48 Article 5 2nd sentence Gesetz zur Anpassung des Urheberrechts an die Erfordernisse des digitalen Binnenmarktes, Bundesgesetzblatt 2021 I, 1204.

49 German Government Draft Bill, Bundestags-Drucksache 19/27426, 137, 139.

50 ibid 144.

51 On the functionality of automated content recognition technology, see Robert Gorwa, Reuben Binns and Christian Katzenbach, ‘Algorithmic content moderation: Technical and political challenges in the automation of platform governance’ (2020) 7(1) Big Data & Society; Benjamin Raue and Martin Steinebach, ‘Uploadfilter – Funktionsweisen, Einsatzmöglichkeiten und Parametrisierung’ [2020] ZUM 355.

52 Except for images, which may be used in their entirety, s 9(2) 2nd sentence OCSSP Act.

53 Poland / Parliament and Council (n 7) para 75.

54 The only exception is Rumble, which does not provide a German-language website but is nonetheless accessible to German users.

55 CDSMD, art 2(6); OCSSP Act, ss 2 and 3.

56 Regarding the size of the platforms, the sample ranges from services with very large user numbers to services with comparatively small ones. For example, in the impact assessment of the CDSMD, the EU Commission refers to 1.3 billion users of YouTube as of October 2015, equal to 33% of all internet users (cf Commission, ‘Commission Staff Working Document Impact Assessment’ SWD (2016) 301 final, 138 fn 407), which makes YouTube “clearly the biggest service,” 152 fn 466. For comparison, according to the Commission’s assessment, SoundCloud has approximately 250 million registered users, while Pinterest reported more than 100 million monthly active users in 2015, ibid 138 fn 407.

57 We thus adopted a broad understanding of “terms and conditions” as defined in art 3(u) European Parliament and Council Regulation (EU) 2022/2065 of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC [2022] OJ L 277/1 (Digital Services Act) (“‘terms and conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the service”); see also Quintais and others (n 12) 20, 186–89.

58 Other sources of information not covered by this study are transparency reports, information shared with institutions such as the Lumen Database, notices of blocked content to affected users, governmental disclosures, third-party audits, and leaked information. See Keller and Leerssen (n 15) 227 et seq.

59 The provision reads, “Service providers are obliged, in accordance with section 1(2), to ensure, as far as possible, by blocking or removal (blocking) that a work is not communicated to the public and will in future not be available for this purpose, as soon as the rightholder so requests and provides the information required for such purpose.”

60 The provision reads, “(1) The communication to the public of copyright-protected works and parts of works by the user of a service provider is authorised for the following purposes: 1. quotations in accordance with section 51 of the Copyright Act, 2. caricatures, parodies and pastiches in accordance with section 51a of the Copyright Act, and 3. other cases of communication to the public authorised by law and the reproduction required for such purpose in accordance with Part 1 section 6 of the Copyright Act. […] (3) Service providers must, in their general terms and conditions, draw the user’s attention to the uses authorised by law referred to in subsection (1).”

61 The provision reads, “(1) If user-generated content is to be blocked automatically when being uploaded and does not constitute minor use as per section 10, service providers are obliged […] 3. to enable the user to flag the use as authorised by law pursuant to section 5.”

62 The provision reads, “(1) Service providers must make available to users and rightholders an effective, free and expeditious complaints procedure in respect of the blocking and the communication to the public of protected works. (2) Complaints must be substantiated. (3) Service providers are obliged to immediately 1. notify the complaint to all the parties involved, 2. give all the parties involved the opportunity to comment, and 3. decide on the complaint, at the latest within one week after its submission. […] (5) Decisions on complaints must be made by impartial natural persons.”

63 The provision reads, “If, following a review by a natural person, a trustworthy rightholder declares that the presumption under section 9(2) is to be rebutted and that the continued communication to the public substantially impairs the economic exploitation of the work, the service provider is, in derogation of section 9(1), obliged to immediately block the work up until the conclusion of the complaints procedure.”

64 The provision reads, “(1) If an alleged rightholder repeatedly requests that the service provider block a work belonging to a third party as the rightholder’s own work or a work in the public domain, the service provider must exclude the alleged rightholder from the procedures under sections 7 and 8 for an appropriate period of time. […] (3) If a rightholder repeatedly and wrongly demands 1. the immediate blocking of uses presumably authorised by law during the complaints procedure referred to in section 14(4) […] then the rightholder must be excluded from the relevant procedure for an appropriate period of time. […] (5) If a user repeatedly and wrongly flags a use as authorised by law, service providers must exclude the user, for an appropriate period of time, from the possibility of flagging authorised uses.”

65 See the eligibility criteria for the Copyright Match Tool in the recently published Copyright Transparency Report for July–December 2021, 1, 4 https://storage.googleapis.com/transparencyreport/report-downloads/pdf-report-22_2021-7-1_2021-12-31_en_v1.pdf accessed 2 November 2022. To be eligible, the creator must be in the YouTube Partner Program or demonstrate a short history of takedowns, meaning the submission of a valid copyright removal request. Only the webform is open to everyone. Content ID, on the other hand, is accessible to smaller creators via a number of “service providers” designated to claim the rights of others through the system, ibid 3.

66 See also the details on the process in YouTube’s Copyright Transparency Report, in which YouTube vaguely indicates that a Content ID claim does not necessarily lead to the blocking of the content if the user files a dispute, ibid 10 et seq.

67 See also Quintais and others (n 12) 281–83 (Rights Manager’s ex post dispute procedure).

68 Facebook, ‘Nutzungsbedingungen’ www.facebook.com/terms?ref=pf accessed 20 January 2022; Instagram, ‘Nutzungsbedingungen’ https://help.instagram.com/581066165581870 accessed 20 January 2022.

69 On the problem of anticipation, see Roberto Poli, ‘The Many Aspects of Anticipation’ (2010) 12(3) Foresight 7.

70 See, in this regard, Quintais and others (n 12) 185–259; see also the Platform Governance Archive (PGA), Alexander von Humboldt Institute for Internet and Society https://pga.hiig.de accessed 20 January 2022.

71 cf Keller and Leerssen (n 15), who provide an overview of studies in which the researchers either reached out to and interviewed content moderators or platform employees about the platform policies or conducted test trials and experiments.

72 cf arts 3 and 5 of Council Directive 93/13/EEC of 5 April 1993 on unfair terms in consumer contracts [1993] OJ L 95/29.

73 cf art 17(6) CDSMD and ss 2(2)–(4) with ss 7(4) and (5) OCSSP Act. All services have been available in the EU for more than three years and probably have an EU turnover of more than 1 million euros.

74 OCSSP Act, s 2(1) no 4.

75 Poland / Parliament and Council (n 7) para 99.

76 See n 47; Quintais and others (n 12) 181–84 (notable differences in the implementation of art 17 CDSMD).

77 cf COM (2021) 288 final (n 35) 1 (“The guidance may need to be reviewed following that judgment”).

78 OCSSP Act, s 18(6).

79 German Civil Code, s 823(2) (breach of a statute that is intended to protect another person).

80 cf Digital Services Act, art 49 et seq.

81 Quintais and others (n 12) 301.

82 CDSMD, art 17(9) first sentence.

Keywords: CDSM Directive, German OCSSP Act, copyright, intermediaries, platform regulation

Published: 21-12-2022

Section: Research Papers