The Digital Services Act: New obligations for hosting service providers

13 December 2023

The Digital Services Act (DSA) is designed to comprehensively restrict the distribution of illegal content in digital services (we wrote about it here). Hosting service providers in particular are subject to a number of obligations, in addition to those applying to other intermediary services.

Important: The deadline for implementation of these obligations is 17 February 2024.

 

1. What services are “hosting service providers”?

A “hosting service” is a service “consisting of the storage of information provided by, and at the request of, a recipient of the service” (Article 3(g)(iii) DSA). “Service” is defined as an “Information Society service, that is to say, any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services” (Article 3(a) DSA in conjunction with Article 1(1)(b) of Directive (EU) 2015/1535).

Hosting includes all services that store content, regardless of whether that content is accessible to the public or to individual third parties. All providers which store content of any kind for third parties are therefore subject to the following catalogue of obligations.

 

2. Specific obligations of hosting service providers

Articles 11 to 18 of the DSA set out the obligations of hosting service providers.

These are summarised below; details on each point follow in section 3.

  • Designation of single points of contact for authorities and recipients of the service (section 3.1.);
  • Transparency in the terms and conditions: explanation of content moderation in the service (section 3.2.);
    Content moderation means the control of content on the service, e.g. the establishment of guidelines as to what content may not be published, either because it is illegal (= prohibited content) or because it is incompatible with the terms and conditions of a service (see e.g. the nudity guidelines at Meta).
    In particular, the terms and conditions should explain what content is prohibited/unwanted in the service and how prohibited/unwanted content will be dealt with (blocking, deletion, demonetisation, suspension of user accounts, etc.);
  • Annual publication of transparency reports on the website, covering the content moderation carried out (section 3.3.);
  • An obligation to notify suspicions of crimes involving a threat to the life or safety of a person or persons (section 3.4.);
  • Establishing notice-and-action procedures for reporting illegal content stored on the service (section 3.5.);
  • Creating templates for fulfilling the obligation to provide a statement of reasons for content moderation decisions (section 3.6.).

 

3. The individual obligations

3.1 Designation of single points of contact

According to Articles 11 and 12 DSA, “single points of contact” must be designated for direct electronic communication with

  • the national authorities, the EU Commission and the European Board for Digital Services; and
  • the recipients of the service, allowing them to choose the means of communication, which shall not solely rely on automated tools; providing only a chatbot would not be enough.

The information on the two single points of contact must be “easily accessible”, i.e. preferably provided in the legal notice together with “the details of the provider, which allow him to be contacted rapidly and communicated with in a direct and effective manner” (Section 5(1)(2) Telemediengesetz, which implements Article 5 of Directive 2000/31/EC), for example via an e-mail address or a contact form.

The single point of contact for authorities must also indicate the languages in which the authorities can communicate (in Germany: German as the official language and, in addition, a language broadly understood by the largest possible number of EU citizens, e.g. English).

3.2 Transparency regarding the moderation of content in the terms and conditions

Service providers must state in their terms and conditions how they deal with illegal content and content that is incompatible with their terms and conditions.

This includes the following information:

  • Policies and procedures with regard to illegal content or content that is incompatible with the terms and conditions;
  • Possible sanctions, such as deletion/blocking of content or suspension/exclusion of users;
  • Whether illegal/incompatible content is assessed by automated tools or by human review;
  • The internal complaint-handling system (see section 3.6 below).

If the service is “primarily directed at minors or is predominantly used by them”, the explanations must be given in a way that minors can understand. It is therefore useful to provide a special section in the terms and conditions that is written in child-friendly language.

3.3 Publication of transparency reports

At least once a year, providers have to publish a transparency report on “any content moderation that they engaged in during the relevant period” in an easily accessible way and in a machine-readable format (e.g. on their website), covering the following points:

  • Number of orders received from authorities;
  • Number of notices concerning content;
  • Content moderation engaged in at the providers’ own initiative, with details of the tools and staff used and the measures taken;
  • Information on complaints against decisions, decisions taken, median time needed for taking those decisions and the number of instances where those decisions were reversed; where applicable, information on the use of automated tools with details of the accuracy and the possible rate of errors.

The EU Commission is expected to provide templates for transparency reports in implementing acts.
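
Until those templates are available, a machine-readable report could, purely by way of illustration, be structured along the following lines. This is a sketch in TypeScript; the field names are our own assumptions and are not prescribed by the DSA or the Commission.

  // Illustrative sketch only; not an official DSA template.
  interface TransparencyReport {
    reportingPeriod: { from: string; to: string };  // ISO dates, e.g. "2023-01-01"
    ordersFromAuthorities: number;                  // orders received from Member State authorities
    noticesReceived: number;                        // notices submitted via the notice-and-action mechanism
    ownInitiativeModeration: {
      measuresTaken: number;
      toolsAndStaff: string;                        // e.g. "hash matching plus a three-person review team"
    };
    complaints: {
      received: number;
      decisionsReversed: number;
      medianDecisionTimeDays: number;
    };
    automatedTools?: {
      description: string;
      accuracy: number;                             // share of correct decisions (0 to 1)
      errorRate: number;
    };
  }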

3.4 Obligation to notify criminal offences

In the case of “information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place”, a provider is required to promptly inform the law enforcement or judicial authorities of the Member State concerned, i.e. the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, where the suspected offender resides or is located, or where the victim of the suspected offence resides or is located. If the Member State concerned cannot be identified with certainty, the provider may instead inform a police station or a public prosecutor’s office in Germany, or Europol.

Note: This only applies to criminal offences that the service provider becomes aware of in the course of content moderation, e.g. through notices from other recipients of the service – there are no proactive obligations to seek facts or circumstances indicating illegal activity (Art. 8 DSA).

3.5 Establishment of notice-and-action procedures

The service provider must put mechanisms in place to allow any person or entity to notify them of illegal content in an easily accessible and user-friendly manner, exclusively by electronic means.

3.5.1 Requirements for reports

The service provider must facilitate the submission of substantiated notices containing the following elements:

  • Reasons why the content is alleged to be illegal;
  • Indication of the exact electronic location of the content (e.g. URL);
  • Name and email address of the reporting person/entity;
  • A statement confirming the bona fide belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete.

Providers must therefore either define the fields in a contact form or explain these requirements when providing an email address for notifying content.
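
Purely as an illustration, the data collected by such a notice form could look like the following sketch (TypeScript; the field names are our own assumptions, only the content requirements follow from the DSA):

  // Illustrative sketch of the data a notice form could collect; field names are assumptions.
  interface IllegalContentNotice {
    explanation: string;             // reasons why the content is alleged to be illegal
    contentLocations: string[];      // exact electronic location(s), e.g. URLs
    submitterName: string;
    submitterEmail: string;
    goodFaithConfirmation: boolean;  // bona fide statement that the notice is accurate and complete
  }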

3.5.2 Response to the notifying individual/entity

If the person/entity submitting the notice has provided an e-mail address, they must receive a confirmation of receipt without undue delay. They must also be informed without undue delay of the decision taken.

3.6 Establishing a “Statement of Reasons” process

When content provided by a recipient of the service is restricted, when the service is suspended or terminated, or when the recipient’s account is suspended or terminated, the recipient concerned must be provided with a statement of reasons for that decision.

According to the law, these reasons must include the following:

  • Whether the content is removed, blocked, demoted or otherwise restricted in its visibility;
  • The factual basis for the decision, including information on whether the decision was based on voluntary own-initiative investigations by the provider or taken pursuant to a notice from a third party;
  • Where applicable, information about automated decision-making;
  • The legal ground or the contractual terms with which the content is incompatible, with an explanation;
  • Information on possibilities for redress, including internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.

A template for such statements of reasons should be prepared.
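
Purely by way of illustration, such a template could cover the following fields (a TypeScript sketch; the field names are our own assumptions, the content requirements follow from the list above):

  // Illustrative sketch of a statement-of-reasons template; field names are assumptions.
  interface StatementOfReasons {
    decisionType:
      | "removal"
      | "blocking"
      | "demotion"
      | "visibilityRestriction"
      | "accountSuspension"
      | "accountTermination";
    factsAndCircumstances: string;   // factual basis for the decision
    triggeredBy: "ownInitiativeInvestigation" | "thirdPartyNotice";
    automatedDecision: boolean;      // whether automated means were used
    legalGround?: string;            // for illegal content: the legal provision relied on
    contractualGround?: string;      // for incompatible content: the clause of the T&Cs
    explanation: string;             // why the content falls under that ground
    redressOptions: string[];        // internal complaints, out-of-court settlement, judicial redress
  }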

In addition, since you need to explain the procedural rules in the T&Cs as mentioned above, you should also define the responsibilities and procedures for the internal complaint-handling system.

 

4. Summary of to-dos

Hosting service providers should therefore review/initiate the following before 17 February 2024:

  • Designate single points of contact for authorities and recipients of the service;
  • Ensure transparency in the terms and conditions regarding content moderation;
  • Publish transparency reports on content moderation on your website;
  • Set up a process for notifications of suspected crimes that involve a threat to the life or safety of a person/persons;
  • Establish notice-and-action mechanisms;
  • Prepare a template for statements of reasons for decisions taken with regard to illegal/incompatible content.