Digital Services Act (DSA)
The DSA applies to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.
The DSA lays down harmonised rules on the provision of intermediary services, sets conditional exemptions for information society services providers (which are dealt with at the end of this article) and imposes due diligence obligations on providers of intermediary services. The obligations differ depending on the type of intermediary services that are provided. Hence the definitions are important.
In the DSA:
an intermediary service means one of the following information society services: (i) a mere conduit service; (ii) a caching service; (iii) a hosting service (each as defined in Article 3).
an online platform is a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public. An online search engine is an intermediary service that allows users to input queries in order to perform searches of, in principle, all websites.
recipient of the service means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible.
Due Diligence Obligations (providers of intermediary services)
Providers of intermediary services must:
- designate a single point of contact to enable them to communicate directly, by electronic means, with Member States’ authorities, the EU Commission and the European Board for Digital Services (referred to in Article 61) for the application of the DSA (Article 11(1)).
- designate a single point of contact to enable recipients of the service to communicate directly and rapidly with them, by electronic means and in a user-friendly manner, including by allowing recipients of the service to choose the means of communication, which must not solely rely on automated tools (Article 12(1)).
- where they do not have an establishment in the Union but offer services in the Union, designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services (Article 13(1)). It must be possible for the designated legal representative to be held liable for non-compliance with obligations under the DSA, without prejudice to the liability and legal actions that could be initiated against the provider of intermediary services (Article 13(3)).
- in their terms and conditions: (i) include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service (which includes information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint-handling system); (ii) inform the recipients of the service of any significant change to the terms and conditions; and (iii) where an intermediary service is primarily directed at minors or is predominantly used by them, explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand (Article 14(1), (2) and (3)).
- act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in Article 14(1) (Article 14(4)).
Due Diligence Obligations (providers of hosting services including online platforms)
Providers of hosting services must:
- put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content (see the definition under Article 7 below). Those mechanisms must (i) allow for the submission of notices exclusively by electronic means; and (ii) be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, providers of hosting services must take the necessary measures to enable and facilitate the submission of notices containing all of the elements referred to in Article 16(2) (not reproduced here; an illustrative sketch follows this list) (Articles 16(1) and (2)).
- where the notice contains the electronic contact information of the individual or entity that submitted it, send, without undue delay, a confirmation of receipt of the notice to that individual or entity (Article 16(4)).
- also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision (Article 16(5)).
- process any notices that they receive under the mechanisms referred to in Article 16(1) and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they must include information on such use in the notification referred to in Article 16(5) (Article 16(6)).
- subject to some exceptions, provide a clear and specific statement of reasons (meeting the requirements of Article 17(3)) to any affected recipients of the service for any of the restrictions set out in Article 17(1) imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions (Article 17(1)).
The information provided must be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information must, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in Article 17(3), point (f) (Article 17(4)). (Article 17(3)(f) requires clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.)
Article 17 does not apply to orders referred to in Article 9 (Orders to act against illegal content) (Article 17(5)).
- promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned, where they become aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place, and provide all relevant information available (Article 18(1)).
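By way of illustration only, a notice under Article 16 might be modelled as follows. The field names are assumptions made for this sketch; Article 16(2) itself sets out the legally required elements.

```typescript
// Hypothetical notice-and-action payload (Article 16). Field names are
// illustrative assumptions; Article 16(2) sets out the required elements.
interface IllegalContentNotice {
  explanation: string;         // substantiated reasons why the content is considered illegal
  exactLocations: string[];    // exact electronic location(s), e.g. URL(s)
  submitterName?: string;      // contact details are optional in this model:
  submitterEmail?: string;     // Article 16(4) only applies where they are given
  bonaFideStatement: boolean;  // good-faith confirmation of accuracy and completeness
  submittedAt: Date;
}

// Article 16(4): where contact details are given, confirm receipt
// without undue delay.
function acknowledgeNotice(notice: IllegalContentNotice): void {
  if (notice.submitterEmail !== undefined) {
    // send a confirmation of receipt; later, notify the submitter of the
    // decision and the available redress options (Article 16(5))
  }
}
```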
Due Diligence Obligations (providers of online platforms)
Providers of online platforms must:
- take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16, are given priority and are processed and decided upon without undue delay (Article 22(1)).
- not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions (Article 25(1)).
The prohibition in Article 25(1) will not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679 (GDPR) (Article 25(2)). The Commission may issue guidelines on how Article 25(1) applies to specific practices (Article 25(3)).
The status of ‘trusted flagger’ under the DSA will be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:
- (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;
- (b) it is independent from any provider of online platforms;
- (c) it carries out its activities for the purposes of submitting notices diligently, accurately and objectively (Article 22(2)).
Trusted flaggers must publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 during the relevant period. The report must list at least the number of notices categorised by: (a) the identity of the provider of hosting services; (b) the type of allegedly illegal content notified; and (c) the action taken by the provider. Those reports must include an explanation of the procedures in place to ensure that the trusted flagger retains its independence. Trusted flaggers must send those reports to the awarding Digital Services Coordinator and must make them publicly available. The information in those reports must not contain personal data (Article 22(3)).
Measures and protection against abuse (Article 23)
Providers of online platforms must:
- suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content (Article 23(1)).
- suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Article 16 (notice and action mechanisms) and Article 20 (internal complaint-handling system), respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded (Article 23(2)). Article 23(3) sets out the criteria for assessing whether to suspend.
- set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in Article 23(1) and (2) and must give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension (Article 23(4)).
- undertake the transparency reporting obligations as set out in Article 24.
Advertising on online platforms (Article 26)
Providers of online platforms that present advertisements on their online interfaces must:
- ensure that, for each specific advertisement presented to each individual recipient, the recipients of the service are able to identify, in a clear, concise and unambiguous manner and in real time, the following:
- (a) that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44;
- (b) the natural or legal person on whose behalf the advertisement is presented;
- (c) the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in point (b);
- (d) meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters (Article 26(1)) (see the illustrative sketch below).
- provide recipients of the service with a functionality to declare whether the content they provide is or contains commercial communications. When the recipient of the service submits a declaration pursuant to this paragraph, the provider of online platforms must ensure that other recipients of the service can identify, in a clear and unambiguous manner and in real time, including through prominent markings, which might follow standards pursuant to Article 44 (Standards), that the content provided by the recipient of the service is or contains commercial communications, as described in that declaration (Article 26(2)).
- not present advertisements to recipients of the service based on profiling, as defined in Article 4, point (4), of Regulation (EU) 2016/679, using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679 (Article 26(3)). (Article 9(1) prohibits the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation.)
Profiling means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
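For illustration only, the Article 26(1) elements could be carried as a per-impression label. The field names below are assumptions for the sketch, not terms drawn from the DSA.

```typescript
// Hypothetical per-impression label carrying the Article 26(1) elements.
interface AdLabel {
  isAdvertisement: true;          // (a) clearly and prominently marked as an advertisement
  onBehalfOf: string;             // (b) person on whose behalf the ad is presented
  paidBy?: string;                // (c) payer, only where different from (b)
  targetingParameters: string[];  // (d) main parameters used to select the recipient
  parametersChangeUrl?: string;   //     and, where applicable, how to change them
}
```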
Recommender system transparency (Article 27)
Recommender system means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed.
Providers of online platforms that use recommender systems must set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters (Article 27(1)).
The main parameters referred to in Article 27(1) must explain why certain information is suggested to the recipient of the service. They shall include, at least: (a) the criteria which are most significant in determining the information suggested to the recipient of the service; (b) the reasons for the relative importance of those parameters (Article 27(2)).
Where several options are available pursuant to Article 27(1) for recommender systems that determine the relative order of information presented to recipients of the service, providers of online platforms must also make available a functionality that allows the recipient of the service to select and to modify at any time their preferred option. That functionality must be directly and easily accessible from the specific section of the online platform’s online interface where the information is being prioritised (Article 27(3)).
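A minimal sketch of how the Article 27 options functionality might be modelled, assuming hypothetical identifiers and example values; the DSA prescribes the outcome, not any particular design.

```typescript
// Illustrative model of user-selectable recommender options (Article 27(3)).
// All identifiers and example values here are hypothetical assumptions.
interface RecommenderOption {
  id: string;
  label: string;              // plain-language description shown to the recipient
  mainParameters: string[];   // most significant criteria (Article 27(2)(a))
  parameterRationale: string; // reasons for their relative importance (Article 27(2)(b))
}

const options: RecommenderOption[] = [
  {
    id: "engagement",
    label: "Posts you are most likely to interact with",
    mainParameters: ["past interactions", "accounts followed", "post popularity"],
    parameterRationale: "Predicted engagement is weighted highest.",
  },
  {
    id: "chronological",
    label: "Most recent posts first",
    mainParameters: ["publication time"],
    parameterRationale: "Ordering depends only on recency.",
  },
];

// Article 27(3): the recipient must be able to select and modify their
// preferred option at any time, directly from the section of the
// interface where the information is prioritised.
function selectOption(preferences: Map<string, string>, userId: string, optionId: string): void {
  preferences.set(userId, optionId); // persist the recipient's choice
}
```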
Online protection of minors (Article 28)
Providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service (Article 28(1)).
Providers of online platforms must not present advertisements on their interface based on profiling, as defined in Article 4, point (4), of Regulation (EU) 2016/679, using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor (Article 28(2)).
Compliance with the obligations set out in this Article must not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor (Article 28(3)).
The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of Article 28(1) (Article 28(4)).
Traceability of traders (Article 30)
Providers of online platforms allowing consumers to conclude distance contracts with traders must ensure that traders can only use those online platforms to promote messages on, or to offer products or services to, consumers located in the Union if, prior to the use of their services for those purposes, they have obtained specified information from the trader.
The Article 30 provision (Article 30(1) to 30(7) inclusive) requires detailed review as it impacts the functionality of the platform and the terms of business. The provision is in line with the approach taken by the EU Commission in the Market Surveillance Regulation.
Compliance by design (Article 31)
Providers of online platforms allowing consumers to conclude distance contracts with traders must:
- ensure that its online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law;
- ensure that its online interface enables traders to provide information on the name, address, telephone number and email address of the economic operator, as defined in Article 3, point (13), of Regulation (EU) 2019/1020 and other Union law (Article 31(1)).
- ensure that its online interface is designed and organised in a way that allows traders to provide at least the following: (a) the information necessary for the clear and unambiguous identification of the products or the services promoted or offered to consumers located in the Union through the services of the providers; (b) any sign identifying the trader such as the trademark, symbol or logo; and (c) where applicable, the information concerning the labelling and marking in compliance with rules of applicable Union law on product safety and product compliance (Article 31(2)).
- make best efforts to assess whether such traders have provided the information referred to in Articles 31(1) and 31(2) prior to allowing them to offer their products or services on those platforms (Article 31(3)).
After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider shall make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal (Article 31(3)).
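A minimal sketch of the trader details an online interface might collect under Article 31; the field names are illustrative assumptions, not a prescribed schema.

```typescript
// Hypothetical trader record covering the Article 31(1) and (2) elements.
interface TraderRecord {
  name: string;                  // economic operator details within the
  address: string;               // meaning of Article 3, point (13), of
  telephone: string;             // Regulation (EU) 2019/1020 (Article 31(1))
  email: string;
  productIdentification: string; // (a) clear, unambiguous identification of the products or services
  traderSign?: string;           // (b) trademark, symbol or logo, where any
  complianceLabelling?: string;  // (c) product safety and compliance labelling, where applicable
}
```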
Right to information (Article 32)
Where a provider of an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider must inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following: (a) the fact that the product or service is illegal; (b) the identity of the trader; and (c) any relevant means of redress.
The obligation laid down in the first subparagraph will be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality (Article 32(1)).
Where, under Article 32(1), the provider of the online platform allowing consumers to conclude distance contracts with traders does not have the contact details of all consumers concerned, that provider must make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress (Article 32(2)).
What constitutes a Very Large Online Platform (VLOP) or Very Large Online Search Engine (VLOSE) (Article 33)
Section 5 of Chapter III of the DSA applies to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as VLOPs or VLOSEs under Article 33(4) (Article 33(1)).
Adjusting the number of active recipients
The Commission must adopt delegated acts to adjust the number of average monthly active recipients of the service in the Union where the Union’s population increases or decreases by at least 5% in relation to its population in 2020, or in relation to its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it must adjust the number so that it corresponds to 10% of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions (Article 33(2)).
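By way of a worked example: the 45 million figure in Article 33(1) corresponds to roughly 10% of the Union’s population of around 450 million in 2020. A minimal sketch of the Article 33(2) arithmetic, assuming hypothetical population figures:

```typescript
// Illustrative sketch of the Article 33(2) adjustment arithmetic.
// The population figures below are assumptions for the example,
// not official statistics.

// 10% of the Union's population, rounded to the nearest whole million.
function adjustedThreshold(unionPopulation: number): number {
  return Math.round((unionPopulation * 0.1) / 1_000_000) * 1_000_000;
}

// A delegated act is triggered only where the population has increased
// or decreased by at least 5% against the reference population.
function adjustmentTriggered(current: number, reference: number): boolean {
  return Math.abs(current - reference) / reference >= 0.05;
}

// With a 2020 population of roughly 450 million, 10% rounds to the
// 45 million figure in Article 33(1).
console.log(adjustedThreshold(450_000_000));                // 45000000
console.log(adjustmentTriggered(473_000_000, 450_000_000)); // true (≈5.1% increase)
```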
Supplementing the provisions of the DSA
The Commission may adopt delegated acts, after consulting the Board, to supplement the provisions of the DSA by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union (Article 33(3)).
Designating VLOPs and VLOSEs
VLOPs or VLOSEs are designated using the procedure in Article 33(4) (not reproduced here).
The Commission must terminate the designation if, during an uninterrupted period of one year, the online platform or the online search engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in Article 33(1) (Article 33(5)).
Providers of very large online platforms (VLOPs) and providers of very large online search engines (VLOSEs) must:
- provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language, and must publish their terms and conditions in the official languages of all the Member States in which they offer their services (Article 14(5) and (6)).
Providers of very large online platforms (VLOPs) and of very large online search engines (VLOSEs) must:
- diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services. In broad terms, the risk assessment must be carried out at least once a year and, in any event, prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to Article 34.
The risk assessment must take into consideration: (a) the dissemination of illegal content; (b) any actual or foreseeable negative effects for the exercise of fundamental rights; (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security; and (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being (Article 34(1)).
Additional online advertising transparency (Article 39)
Providers of VLOPs or VLOSEs that present advertisements on their online interfaces must:
- compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multi-criteria queries and through application programming interfaces, a repository containing the information referred to in Article 39(2), for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They must ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and shall make reasonable efforts to ensure that the information is accurate and complete (Article 39(1)).
The repository must include at least all of the following information (see the illustrative sketch after this list):
- (a) the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;
- (b) the natural or legal person on whose behalf the advertisement is presented;
- (c) the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);
- (d) the period during which the advertisement was presented;
- (e) whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;
- (f) the commercial communications published on the VLOPs identified pursuant to Article 26(2);
- (g) the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted (Article 39(2)). See additionally Article 39(3).
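Article 39 effectively calls for a queryable, structured dataset per advertisement. A hedged sketch, assuming hypothetical field names (the DSA prescribes the information to be held, not a schema or API):

```typescript
// Hypothetical record structure for the Article 39(2) ad repository.
// Field names are illustrative; the DSA specifies the content, not a format.
interface AdRepositoryRecord {
  adContent: string;              // (a) content, incl. product/service/brand name and subject matter
  onBehalfOf: string;             // (b) person on whose behalf the ad was presented
  paidBy?: string;                // (c) payer, only where different from (b)
  presentedFrom: string;          // (d) period during which the ad was presented (ISO dates)
  presentedUntil: string;
  targetedGroups?: string[];      // (e) targeted groups and the main parameters used,
  targetingParameters?: string[]; //     including any parameters used to exclude groups
  commercialCommunications?: string[]; // (f) declarations identified under Article 26(2)
  totalRecipientsReached: number; // (g) total reach, and where applicable
  reachByMemberState?: Record<string, number>; // aggregate numbers per Member State
}

// Article 39(1): no personal data of recipients may be held; records are
// retained until one year after the ad was last presented; the repository
// must support multi-criteria queries. A toy filter as a stand-in:
function queryRepository(
  records: AdRepositoryRecord[],
  predicate: (r: AdRepositoryRecord) => boolean
): AdRepositoryRecord[] {
  return records.filter(predicate);
}

// e.g. all ads presented on behalf of a given (hypothetical) advertiser:
// queryRepository(repo, (r) => r.onBehalfOf === "Example Advertiser Ltd");
```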
Recommender systems VLOPs or VLOSEs (Article 38)
In addition to Article 27, providers of VLOPs or VLOSEs that use recommender systems must provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679.
Data access and scrutiny VLOPs or VLOSEs (Article 40)
Providers of VLOPs or VLOSEs must provide the Digital Services Coordinator of establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with this Regulation (Article 40(1)).
For the purposes of Article 40(1), providers of VLOPs or VLOSEs must, at the request of either the Digital Services Coordinator of establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems (Article 40(3)).
Data access and scrutiny VLOPs or VLOSEs (Article 40) (cont.)
Upon a reasoned request from the Digital Services Coordinator of establishment, providers of VLOPs or VLOSEs must within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in Article 40(8), for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35 (not reproduced here) (Article 40(4)).
What constitutes a ‘vetted researcher’ (Article 40(8))
Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment must grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of a very large online platform or of a very large online search engine pursuant to Article 40(4), where the researchers demonstrate that they meet all of the following conditions:
- (a) they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;
- (b) they are independent from commercial interests;
- (c) their application discloses the funding of the research;
- (d) they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;
- (e) their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in Article 40(4);
- (f) the planned research activities will be carried out for the purposes laid down in Article 40(4); and
- (g) they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679.
Upon receipt of the application pursuant to this paragraph, the Digital Services Coordinator of establishment must inform the Commission and the Board (Article 40(8)).
Conditional exemptions for providers of information society services
The conditions under which providers of information society services providing mere conduit (Article 4), caching (automatic, intermediate and temporary storage) (Article 5) and hosting (Article 6) services are exempt from liability for third-party information are set out in those Articles. The specific wording of each Article (and of Articles 7 to 10 below) requires detailed review.
Voluntary own-initiative investigations and legal compliance (Article 7)
Illegal content as used in the DSA means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.
Providers of intermediary services will not be deemed ineligible for the exemptions from liability referred to in Articles 4, 5 and 6 solely because they, in good faith and in a diligent manner, carry out voluntary own-initiative investigations into, or take other measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union law and national law in compliance with Union law, including the requirements set out in this Regulation.
No general monitoring or active fact-finding obligations (Article 8)
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.
Orders to act against illegal content (Article 9)
Upon receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities, providers of intermediary services must inform the authority issuing the order, or any other authority specified in the order, of any effect given to the order without undue delay, specifying if and when effect was given to the order.
Orders to provide information (Article 10)
Upon receipt of an order to provide specific information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities, providers of intermediary services must, without undue delay, inform the authority issuing the order, or any other authority specified in the order, of its receipt and of the effect given to the order, specifying if and when effect was given to the order.
Entry into force and application (Article 93)
The DSA entered into force on the twentieth day following that of its publication in the Official Journal of the European Union, that is, on 16 November 2022.
The DSA will apply from 17 February 2024.
However, Article 24(2), (3) and (6) (Transparency reporting obligations for providers of online platforms), Article 33(3) to (6) (Very large online platforms and very large online search engines), Article 37(7) (Independent audit), Article 40(13) (Data access and scrutiny), Article 43 (Supervisory fee) and Sections 4, 5 and 6 of Chapter IV apply from 16 November 2022.
Chapter IV relates to Implementation, Cooperation, Penalties and Enforcement.
Section 4 of Chapter IV relates to supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines.
Section 5 of Chapter IV relates to common provisions on enforcement.
Section 6 of Chapter IV relates to delegated and implementing acts.
Sections 4, 5 and 6 of Chapter IV comprise Articles 64 to 88 of the DSA.
Copyright © Paul Foley January 2023 - All Rights Reserved.
Owner, Paul Foley Law
For legal advice on and compliance with the Platforms Regulation, the Ranking Guidelines, the DSA, DMA and the EU Artificial Intelligence Act, please use the Contact page or Email: paul@paulfoleylaw.ie