Fitting Platforms into a new liability system – The results of the CJEU YouTube/Uploaded Case, Article 17 DSM/UrhDaG and the DSA

23 May 2022

We are currently witnessing fundamental changes in Europe regarding the liability and the obligations of platforms for copyright infringements, resulting from several parallel developments:

1. the European Court of Justice decision on the YouTube/Uploaded cases on 22nd June 2021 (C-682/18 and C-683/18);

2. national (here: German) legislation implementing Article 17 of the DSM Directive (Directive (EU) 2019/790), which came into force on 1 August 2021 as a completely new act: the Urheberrechtsdiensteanbietergesetz, the Copyright Service Providers Act (UrhDaG for short);

3. the Digital Services Act (political agreement reached by the EU Parliament and Council on 23 April 2022);

4. the CJEU decision in Poland v Article 17 DSM Directive of 26 April 2022 (C-401/19).

This article offers some thoughts on the relationship between these concepts and their interdependence.

First: On the YouTube/Uploaded decision of the CJEU

As a recap: the central question the court had to decide was whether a video-sharing platform (here: YouTube) and a file-sharing platform (here: Uploaded) themselves communicate to the public the content initially uploaded by their users, and whether the platforms are therefore liable to cease and desist as well as to pay damages to the rightholders.

The European Court had ruled on the interpretation of the concept of communication to the public within the meaning of Article 3 InfoSoc Directive many times before, and in this new decision the Court again reinforces the broad sense of this concept, which serves to establish a high level of protection safeguarding the interests and fundamental rights of copyright holders (Article 17(2) CFREU) on the one hand and the interests and fundamental rights of users in their freedom of expression and information (Article 11 CFREU) on the other.

The Court repeats its language on several complementary criteria that have to be taken into account and again emphasizes the criteria of the indispensable role played by the platform operator and the deliberate nature of its intervention.

So what is new in this decision?

Regarding the platforms in the case, YouTube and Uploaded, the Court confirms the indispensable role such platforms play when their users make potentially illegal content available [CJEU C-682/18 and C-683/18, para. 77]. This criterion seems easy for the Court to determine.

The second criterion, the deliberate nature of the intervention, i.e. an act of intervening in full knowledge of the consequences of doing so with the aim of giving the public access to protected works, is of course harder to evaluate, as it is a subjective element that must be determined by drawing conclusions from objective factors characterising the situation.

In the Pirate Bay decision (C-610/15) the determining factor was that the operator had expressly stated its purpose of making protected content available to users and had encouraged them to make copies [CJEU C-682/18 and C-683/18, para. 82].

In the GS Media decision (C-160/15) the determining factor was the operator's own initiative in posting a link to specific content; only in those circumstances is it relevant whether the posting of that link was carried out for profit [CJEU C-682/18 and C-683/18, paras. 86-88].

For determining the relevant factors for video-streaming or file-sharing platforms, the Court has now established four groups of relevant aspects, which are genuinely new [CJEU C-682/18 and C-683/18, para. 84]:

1. the fact that an operator who knows or ought to know, in a general sense, that users of its platform are making protected content available to the public illegally via its platform, refrains from putting in place the appropriate technological measures that can be expected from a reasonably diligent operator in its situation in order to counter credibly and effectively copyright infringements on that platform,

(Example: this could be the case where platforms receive large numbers of takedown notices but do not take measures going beyond the takedown to prevent illicit content from flowing in, such as implementing a filter system to identify such content.)

2. the circumstance that that operator participates in selecting protected content illegally communicated to the public,

(Example: this could be the case if the operator asks its user community to upload certain pieces of content.)

3. that the operator provides tools on its platform specifically intended for the illegal sharing of such content or

(Example: this could be the case if illegal sharing is facilitated by 'one-click' uploads or by the offer of wider distribution.)

4. that it knowingly promotes such sharing, which may be attested by the fact that that operator has adopted a financial model that encourages users of its platform illegally to communicate protected content to the public via that platform.

(Example: this could be the case if a platform provides kick-back payments for specific uploads, but also if uploaders receive other kinds of encouragement, such as particular attention from other users.)

In addition, the Court repeats that it is a relevant factor if the operator, despite having been warned by the rightholder that specific protected content is being communicated illegally to the public via its platform, refrains from expeditiously taking the measures necessary to make that content inaccessible. This, however, merely clarifies the situation where an operator does not act upon a takedown notice [CJEU C-682/18 and C-683/18, para. 85].

The Court also holds that it would certainly be a relevant factor if the main or predominant use of the platform consisted of content made available to the public illegally; however, the Court makes clear in its wording that such a proportion is not a precondition for finding a deliberate intervention [CJEU C-682/18 and C-683/18, paras. 100-101].

Whether one of those criteria is met is a matter for the national court to evaluate [para. 90] (here: the German Federal Court of Justice, which will deliver its decision on 2 June 2022), but the CJEU provides some guidance for the cases presented:

  • For YouTube the CJEU tends towards no liability, as the platform is not involved in the posting or selection of content. The upload process is automated, without prior viewing or monitoring [para. 92]. The terms of use and community rules do not allow copyright infringements [para. 93]. YouTube also takes multiple technical measures (e.g. Content ID) [para. 94]. The creation of rankings and rubrics is irrelevant [para. 95]. In particular, the business model is not built primarily on the unauthorized sharing of copyright-protected content [para. 96].
  • For Uploaded/Cyando, the CJEU rather indicates liability. It is true that here, too, there is no creation, selection, viewing or checking of content by the operator [para. 97], and here, too, copyright infringements are prohibited in the terms and conditions, while the users alone decide whether an individual link is shared (and thus whether the content is published at all) [para. 98]. However, either a very high proportion of infringing content or a financial model that is based on the availability of illegal content on the platform and designed to encourage users to share such content via it [para. 101] would indicate an intervention in full knowledge of the consequences and therefore a communication to the public by the operator itself.

From these outlines regarding the specific platforms in the case, we can also derive a number of criteria which, in the view of the CJEU, are evidently not decisive for or against liability under Article 3 InfoSoc Directive:

1. Terms of use, which prohibit copyright infringements;

2. Automated process of upload and publication of content;

3. The final decision rests with the user;

4. Creation of rankings and rubrics;

5. Promotion for profit.

It is remarkable that those factors are very similar to the factors which define an Online Content Sharing Service Provider (OCSSP) under Article 2(6) of the DSM Directive and its national implementations (see e.g. sec. 2 para. 1 UrhDaG), and which do lead to a communication to the public under Article 17(1) DSM Directive (sec. 1 para. 1 UrhDaG) and to a liability only where certain obligations are not met (see Article 17(4) DSM Directive, sec. 1 para. 2 UrhDaG).

This brings us to the second concept of liability of platforms.

Second: Article 17 DSM Directive and its national implementation; here: German UrhDaG

According to section 2 para. 1 UrhDaG, the new law applies specifically to services that

1. have as their main purpose, exclusively or at least in part, the storage and making available to the public of a large amount of copyright-protected content uploaded by third parties,

2. organise content within the meaning of no. 1,

3. advertise content within the meaning of no. 1 for the purpose of making a profit, and

4. compete with online content services for the same target groups.

Those services can escape liability for the public communication of content uploaded by their users if they comply with a specific set of obligations under sections 4 and 7 to 11 UrhDaG, which reflect the potential risk posed by their services.

The obligations of the platforms, however, go hand in hand with obligations of the rightholders, who must cooperate with the platform operators in order to minimize copyright infringements, by offering or negotiating a licence or by providing the platform with the necessary and relevant information to prevent infringements. This new cooperative approach appears apt to make such platforms, and mass communication through them, possible at all, thereby granting the public, the users, their fundamental rights to freedom of expression and information (Article 11 CFREU).

However, an interesting question remains: what about services that fulfil one or more of the factors leading to liability under Article 3 InfoSoc Directive and at the same time fall under the definition of an OCSSP? Would those also be able to escape liability by fulfilling the obligations under Article 17 DSM Directive/sec. 1 para. 2 UrhDaG, even if they induce users to upload protected content and make it available in an infringing way?

The law is clear for services whose main purpose is to engage in or to facilitate copyright infringements: those services clearly cannot rely on the liability exemption under Article 17(4) DSM Directive/sec. 1 para. 2 UrhDaG [Recital 62 DSM Directive; sec. 1 para. 4 UrhDaG].

But what about services which are more involved in the content offering than merely fulfilling the OCSSP criteria: which set up a business model that induces infringements, which provide tools for illegal sharing, and which participate in the selection of content? Should those benefit from the liability exemptions?

The recitals of the Directive do not explicitly address this question, but Recital 64 DSM Directive states: "This does not affect the concept of communication to the public or of making available to the public elsewhere under Union law, nor does it affect the possible application of Article 3(1) and (2) of Directive 2001/29/EC to other service providers using copyright-protected content." Those "other service providers" are certainly the services which do not fulfil the criteria of an OCSSP, such as pure intermediary services; the services excluded from the scope of the definition in Article 2(6) DSM Directive, implemented in section 3 UrhDaG, which refers to non-profit online encyclopaedias (like Wikipedia), university repositories, open-source software platforms and the like; and the services which have piracy as their main purpose. But what about platforms which host a large corpus of illegal works and actively facilitate and encourage further infringements? Should those benefit from the liability exemptions where they also fall under the definition of an OCSSP, or could they qualify as "other service providers" facing liability under Article 3 InfoSoc Directive?

The explanatory memorandum to the German law (BT-Drs. 19/27426, p. 130) states: "The regulation has in large parts the character of a lex specialis to Article 3(1) InfoSoc Directive, with the consequence that for the liability of service providers within the meaning of the DSM Directive, as well as for their users, predominantly the rules of Article 17 DSM Directive are decisive, but not Article 3(1) or Article 5 InfoSoc Directive."

This wording suggests that the new law might not be conclusive and that there might be room for liability under Article 3(1) InfoSoc Directive. Furthermore, the DSM Directive clearly states that the objectives and principles laid down by the Union copyright framework remain sound (Recital 3 DSM Directive), referring to the statement in the InfoSoc Directive that "any harmonisation of copyright and related rights must take as a basis a high level of protection" (Recital 9 InfoSoc Directive). New legislation such as the DSM Directive and its national implementations must therefore always seek a high level of protection, which cannot be lowered by such new law.

Third: The Digital Services Act (DSA)

On 23 April 2022 the European Parliament and the Council reached political agreement on the Digital Services Act, which is, amongst other issues, harmonizing the safe harbor provisions set out in Article 14 E-Commerce Directive. The conditional exemption from liability under the DSA, however, is only available to 'pure' intermediary service providers, which are defined by their merely technical, automatic and passive nature, and applies neither to OCSSP platforms nor to platforms which are communicating to the public within the meaning of Article 3 InfoSoc Directive.

Fourth: The CJEU decision Poland v Article 17 DSM Directive of 26 April 2022 (C-401/19)

In the case Poland v Article 17 DSM Directive (C-401/19) the CJEU confirms that Article 17 DSM Directive is valid and can be transposed into national law in a way which allows a fair balance to be struck between the various fundamental rights protected by the Charter of Fundamental Rights.

The liability system

The different categories of platforms and the consequences for a communication to the public (CTTP), liability and the application of safe harbor rules could be pictured as follows:

[Figure not reproduced: overview of platform categories and their liability consequences. In the figure, 'Art. 3' refers to Article 3 InfoSoc Directive, 'Art. 14' to Article 14 E-Commerce Directive, and 'Art. 17' to Article 17 DSM Directive.]
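Since the original diagram cannot be reproduced here, the logic it conveys, as it emerges from the discussion above, can be summarised in a minimal sketch. The following Python snippet is purely illustrative: the type, field and category names are hypothetical shorthand, and the decision logic is a crude simplification of the legal tests described in this article, not a statement of the law.

```python
# Purely illustrative sketch of the platform categories discussed in this
# article. All names are hypothetical; real cases require a full legal
# assessment by a court.

from dataclasses import dataclass


@dataclass
class Platform:
    is_ocssp: bool                   # meets the OCSSP definition (Art. 2(6) DSM / sec. 2 UrhDaG)
    piracy_main_purpose: bool        # main purpose is to engage in or facilitate piracy (Recital 62 DSM)
    fulfils_art17_obligations: bool  # complies with Art. 17(4) DSM / secs. 4, 7-11 UrhDaG
    deliberate_intervention: bool    # Art. 3 InfoSoc factors (YouTube/Uploaded, para. 84) are present


def liability_category(p: Platform) -> str:
    """Return a rough liability category for a given platform profile."""
    if not p.is_ocssp and not p.deliberate_intervention:
        # 'Pure' intermediary: no CTTP; the conditional DSA safe harbor may apply.
        return "no CTTP - DSA safe harbor (ex Art. 14 E-Commerce Directive) may apply"
    if p.piracy_main_purpose:
        # Piracy services are excluded from the Art. 17 exemption altogether.
        return "CTTP - no exemption (Recital 62 DSM / sec. 1 para. 4 UrhDaG)"
    if p.deliberate_intervention:
        # The article's thesis: liability under Art. 3 InfoSoc could remain,
        # without the possibility to escape via Art. 17(4).
        return "CTTP under Art. 3 InfoSoc - liable, no exemption"
    if p.fulfils_art17_obligations:
        return "CTTP under Art. 17 DSM - but exempt from liability"
    return "CTTP under Art. 17 DSM - liable (obligations not met)"


# Hypothetical usage, loosely modelled on the Uploaded/Cyando indications:
uploaded_like = Platform(is_ocssp=True, piracy_main_purpose=False,
                         fulfils_art17_obligations=False,
                         deliberate_intervention=True)
print(liability_category(uploaded_like))
# -> CTTP under Art. 3 InfoSoc - liable, no exemption
```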

In many situations it may be clear into which category a given platform falls. However, as platform models differ greatly from one another, and as functionalities can easily be changed, turned on and turned off, the determination may be difficult in individual cases, and the category may also change over time. This may make it necessary to find a new approach when addressing and enforcing infringements on platforms. Where the OCSSP criteria are fulfilled, rightholders and platforms will both have to work towards each other in a cooperative approach, as otherwise a court might find that the platform benefits from the exemption of Article 17(4) DSM Directive/sec. 1 para. 2 UrhDaG or from the service provider privileges in the DSA, unless a further deliberate intervention to communicate illicit content to the public can be proven. In that case, liability under Article 3 InfoSoc Directive could still apply, without any possibility to escape it.

Neither the recitals of the DSM Directive and the explanatory memorandum of the UrhDaG nor the reasoning of the CJEU decision Poland v Article 17 DSM Directive (C-401/19) explicitly exclude such liability where the additional conditions are met. Rather, it is clear from the legislative materials and from the C-401/19 decision that rightholders are to be afforded a high level of protection and that the new regulatory framework is intended to facilitate the enforcement of rights. The negotiating position of rightholders is to be improved by the new legislation; excluding the already existing liability under Article 3 InfoSoc Directive would run counter to this.