
Digital Services Act - DSA

By
Paul Foley
The Digital Services Act sets harmonised rules on the provision of intermediary services in the EU.


Digital Services Act (DSA)

The DSA applies to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.

The DSA lays down harmonised rules on the provision of intermediary services, sets conditional exemptions from liability for providers of information society services (dealt with at the end of this article) and imposes due diligence obligations on providers of intermediary services. The obligations differ depending on the type of intermediary service provided, so the definitions are important. Finally, the DSA establishes rules on the implementation and enforcement of the DSA, including as regards the cooperation of and coordination between the competent authorities.

In the DSA:

an intermediary service means one of the following information society services: (i) a 'mere conduit' service; (ii) a 'caching' service; (iii) a 'hosting' service (each as defined in Article 3).

an online platform is a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public. An online search engine is an intermediary service that allows users to input queries in order to perform searches of, in principle, all websites.

recipient of the service means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible.

Due Diligence Obligations (providers of intermediary services)

Providers of intermediary services must (extracts only):

  • designate a single point of contact to enable them to communicate directly, by electronic means, with Member States’ authorities, the EU Commission and the European Board for Digital Services (referred to in Article 61) for the application of the DSA (Article 11(1)).

  • designate a single point of contact to enable recipients of the service to communicate directly and rapidly with them, by electronic means and in a user-friendly manner, including by allowing recipients of the service to choose the means of communication, which must not solely rely on automated tools (Article 12(1)).

  • where they do not have an establishment in the Union but offer services in the Union, designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services (Article 13(1)). It must be possible for the designated legal representative to be held liable for non-compliance with obligations under the DSA, without prejudice to the liability and legal actions that could be initiated against the provider of intermediary services (Article 13(3)).

  • in their terms and conditions: (i) include information on any restrictions that they impose in relation to the use of their service in respect of information provided by the recipients of the service (which must include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review, as well as the rules of procedure of their internal complaint-handling system); (ii) inform the recipients of the service of any significant change to the terms and conditions; and (iii) where an intermediary service is primarily directed at minors or is predominantly used by them, explain the conditions for, and any restrictions on, the use of the service in a way that minors can understand (Article 14(1), (2) and (3)).

  • act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in Article 14(1) (Article 14(4)).

Transparency reporting obligations for providers of intermediary services (Article 15)

Providers of intermediary services must make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports must include, in particular, information on the following, as applicable:

  1. for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

  2. for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

  3. for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported must be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;


  4. for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

  5. any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied (Article 15(1)).
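
Article 15(1) requires reports in a machine-readable format but does not prescribe any schema; templates, if any, will come from Commission implementing acts under Article 15(3) (see below). Purely by way of illustration, a report covering the points above might be serialised as JSON along the following lines (a minimal sketch; every field name and figure is invented):

    # Hypothetical sketch only: the DSA prescribes no report schema, and the
    # Commission may adopt templates under Article 15(3). All field names and
    # figures are invented for illustration.
    import json

    article_15_report = {
        "provider": "ExampleHost B.V.",                # hypothetical provider
        "reporting_period": "2024-01-01/2024-12-31",
        "member_state_orders": [{                      # Article 15(1), point 1
            "order_type": "Article 9",                 # order against illegal content
            "illegal_content_type": "counterfeit goods",
            "issuing_member_state": "IE",
            "median_hours_to_acknowledge": 6,
            "median_hours_to_give_effect": 24,
        }],
        "article_16_notices": {                        # point 2 (hosting services)
            "total": 12_500,
            "from_trusted_flaggers": 1_800,
            "processed_by_automated_means": 9_400,
            "median_hours_to_action": 12,
        },
        "own_initiative_moderation": None,             # point 3, elided in this sketch
        "complaints": None,                            # point 4, elided in this sketch
        "automated_means": None,                       # point 5, elided in this sketch
    }

    print(json.dumps(article_15_report, indent=2))     # machine-readable output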

Article 15(1) will not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of the DSA (Article 15(2)).

The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports under Article 15(1), including harmonised reporting periods. Those implementing acts must be adopted in accordance with the advisory procedure referred to in Article 88 (Article 15(3)).

Due Diligence Obligations (providers of hosting services including online platforms)

Providers of hosting services must:

  • put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content (see the definition under Article 7 below). Those mechanisms must (i) allow for the submission of notices exclusively by electronic means and (ii) be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers of hosting services must take the necessary measures to enable and to facilitate the submission of notices containing all of the elements referred to in Article 16(2) (not reproduced here) (Articles 16(1) and (2)).

  • where the notice contains the electronic contact information of the individual or entity that submitted it, send, without undue delay, a confirmation of receipt of the notice to that individual or entity (Article 16(4)).

  • also, without undue delay, notify that individual or entity of its decision in respect of the information to which the notice relates, providing information on the possibilities for redress in respect of that decision (Article 16(5)).

  • process any notices that they receive under the mechanisms referred to in Article 16(1) and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-arbitrary and objective manner. Where they use automated means for that processing or decision-making, they must include information on such use in the notification referred to in Article 16(5) (Article 16(6)).

  • subject to some exceptions, provide a clear and specific statement of reasons (meeting the requirements of Article 17(3)) (not reproduced here) to any affected recipients of the service for any of the restrictions set out in Article 17(1) imposed on the ground that the information provided by the recipient of the service is illegal content or incompatible with their terms and conditions (Article 17(1)).

    The information provided must be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information must, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the possibilities for redress referred to in Article 17(3), point (f) (Article 17(4)). (Article 17(3), point (f) provides: clear and user-friendly information on the possibilities for redress available to the recipient of the service in respect of the decision, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress.) The Article will not apply to any orders referred to in Article 9 (Orders to act against illegal content) (Article 17(5)).

  • promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned, where it becomes aware of any information (giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place), of its suspicion and provide all relevant information available (Article 18(1)).
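
To make the notice-and-action mechanics concrete, here is a minimal Python sketch of how a hosting service might receive a notice electronically and confirm receipt under Articles 16(1) and 16(4). Everything in it (the field names, the helper, the flow) is hypothetical: the DSA requires electronic submission and confirmation of receipt but prescribes no particular interface, and the Article 16(2) elements are not reproduced here.

    # Hypothetical Article 16 notice-and-action flow; names are invented.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    def send_email(to: str, body: str) -> None:
        # stand-in for a real mail transport, for illustration only
        print(f"to {to}: {body}")

    @dataclass
    class Notice:
        content_url: str              # location of the allegedly illegal content
        explanation: str              # why the notifier considers it illegal
        notifier_email: Optional[str] = None

    def submit_notice(notice: Notice) -> dict:
        received_at = datetime.now(timezone.utc).isoformat()
        if notice.notifier_email:
            # Article 16(4): confirm receipt without undue delay
            send_email(notice.notifier_email, f"Notice received at {received_at}")
        # Article 16(6): process notices in a timely, diligent, non-arbitrary
        # and objective manner; any use of automated means must be stated in
        # the Article 16(5) decision notification
        return {"status": "received", "received_at": received_at}

    submit_notice(Notice("https://example.com/item/123",
                         "suspected counterfeit listing",
                         "notifier@example.org"))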

Due Diligence Obligations (providers of online platforms)

Providers of online platforms must provide recipients of the service, including individuals or entities that have submitted a notice under Article 16, with access to an internal complaint-handling system against the following decisions taken by the provider on the ground that the information provided by the recipients is illegal content or incompatible with the provider's terms and conditions:

  1. decisions whether or not to remove or disable access to or restrict visibility of the information;

  2. decisions whether or not to suspend or terminate the provision of the service, in whole or in part, to the recipients;

  3. decisions whether or not to suspend or terminate the recipients’ account;

  4. decisions whether or not to suspend, terminate or otherwise restrict the ability to monetise information provided by the recipients (Article 20(1)).

The access period must be for at least six months following any decision referred to just above. The complaint-handling system must, amongst other requirements, be easy to use (Article 20(3)). Article 20(4) sets out when a provider must reverse a decision taken under Article 20(1).

The period of six months starts on the date the recipient of the service is informed about the decision in accordance with Article 16(5) or Article 17.

Providers must inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates and of the possibility of out-of-court dispute settlement provided for in Article 21 (see the end of this article) and other available possibilities for redress (Article 20(5)).

Providers of online platforms must also:

  • take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 16 (Notice and Action Mechanisms), are given priority and are processed and decided upon without undue delay (Article 22(1)).

  • not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions (Article 25(1)).

The prohibition in Article 25(1) will not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679 (GDPR) (Article 25(2)). The Commission may issue guidelines on how Article 25(1) applies to specific practices (Article 25(3)).

The status of ‘trusted flagger’ under the DSA will be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

  • (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content;

  • (b) it is independent from any provider of online platforms;

  • (c) it carries out its activities for the purposes of submitting notices diligently, accurately and objectively (Article 22(2)).

Trusted flaggers must publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 (Notice and Action Mechanisms) during the relevant period. The report must list at least the number of notices categorised by: (a) the identity of the provider of hosting services; (b) the type of allegedly illegal content notified; (c) the action taken by the provider. Those reports must include an explanation of the procedures in place to ensure that the trusted flagger retains its independence. Trusted flaggers must send those reports to the awarding Digital Services Coordinator and must make them publicly available. The information in those reports must not contain personal data (Article 22(3)).

Transparency reporting obligations for providers of online platforms (Article 24)

In addition to the information referred to in Article 15, providers of online platforms must include in the reports referred to in that Article information on the following:

  1. the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

  2. the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints (Article 24(1)).

By 17 February 2023 and at least once every six months thereafter, providers must publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted (Article 24(2)).
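
As a worked illustration of the Article 24(2) calculation (and nothing more: the methodology may ultimately be fixed by the Article 33(3) delegated acts), the figure is simply the mean of the past six months' counts:

    # Sketch of the Article 24(2) figure: the average of the past six months'
    # active recipients in the Union. The counts below are invented.
    monthly_active_recipients = [41_200_000, 42_900_000, 44_100_000,
                                 45_300_000, 46_000_000, 46_800_000]
    average = sum(monthly_active_recipients[-6:]) / 6
    print(f"Average monthly active recipients: {average:,.0f}")  # 44,383,333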

Providers of online platforms or of online search engines must communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in Article 24 (2) updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information must not include personal data (Article 24(3)).

When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to Article 24(2) and (3), that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it must inform the Commission thereof (Article 24(4)).

Providers of online platforms must, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms must ensure that the information submitted does not contain personal data (Article 24(5)).

The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to Article 24(1). Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 88 (Article 24(6)).

Measures and protection against misuse (Article 23)

Providers of online platforms must:

  • suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content (Article 23(1)).

  • suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaints handling systems referred to in Article 16 (notice and action mechanisms) and Article 20 (Internal complaint-handling system), respectively, by individuals or entities or by complainants that frequently submit notices or complaints that are manifestly unfounded (Article 23(2)). Article 23(3) sets out the criteria for assessing whether to suspend.

  • set out, in a clear and detailed manner, in their terms and conditions their policy in respect of the misuse referred to in Articles 23(1) and (2) and they must give examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse and the duration of the suspension (Article 23(4)).

  • undertake the transparency reporting obligations as set out in Article 24 (Transparency reporting obligations for providers of online platforms).

Advertising on online platforms (Article 26)

Providers of online platforms that present advertisements on their online interfaces must:

  • ensure that, for each specific advertisement presented to each individual recipient, the recipients of the service are able to identify, in a clear, concise and unambiguous manner and in real time, the following:

    • (a) that the information is an advertisement, including through prominent markings, which might follow standards pursuant to Article 44 (Standards);
    • (b) the natural or legal person on whose behalf the advertisement is presented;
    • (c) the natural or legal person who paid for the advertisement if that person is different from the natural or legal person referred to in point (b);
    • (d) meaningful information directly and easily accessible from the advertisement about the main parameters used to determine the recipient to whom the advertisement is presented and, where applicable, about how to change those parameters (Article 26(1)).

  • provide recipients of the service with a functionality to declare whether the content they provide is or contains commercial communications. When the recipient of the service submits a declaration pursuant to this paragraph, the provider of online platforms must ensure that other recipients of the service can identify in a clear and unambiguous manner and in real time, including through prominent markings, which might follow standards pursuant to Article 44 (Standards) that the content provided by the recipient of the service is or contains commercial communications, as described in that declaration (Article 26(2)).

  • not present advertisements to recipients of the service based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using special categories of personal data referred to in Article 9(1) of Regulation (EU) 2016/679 (Article 26(3)). (Article 9(1): processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited.)

Profiling means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. (Article 4, point (4), of Regulation (EU) 2016/679).

Recommender system transparency (Article 27)

Recommender system means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed (Article 3, point (s)).

Providers of online platforms that use recommender systems must set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters (Article 27(1)).

The main parameters referred to in Article 27(1) must explain why certain information is suggested to the recipient of the service. They shall include, at least: (a) the criteria which are most significant in determining the information suggested to the recipient of the service; (b) the reasons for the relative importance of those parameters (Article 27(2)).

Where several options are available pursuant to Article 27(1) for recommender systems that determine the relative order of information presented to recipients of the service, providers of online platforms must also make available a functionality that allows the recipient of the service to select and to modify at any time their preferred option. That functionality must be directly and easily accessible from the specific section of the online platform's online interface where the information is being prioritised (Article 27(3)).
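
As a rough illustration of Articles 27(1) to 27(3) taken together, the sketch below models a disclosure of main parameters per recommender option and the selection functionality. The DSA prescribes no data model; every name and parameter here is invented.

    # Hypothetical model of the Article 27 disclosures.
    from dataclasses import dataclass

    @dataclass
    class RecommenderOption:
        name: str
        criteria: list[str]    # Article 27(2)(a): most significant criteria
        rationale: str         # Article 27(2)(b): reasons for their importance

    @dataclass
    class RecommenderDisclosure:
        options: list[RecommenderOption]
        selected: str

        def select(self, name: str) -> None:
            # Article 27(3): recipients may select or modify their preferred
            # option at any time
            if name not in {o.name for o in self.options}:
                raise ValueError(f"unknown option: {name}")
            self.selected = name

    disclosure = RecommenderDisclosure(
        options=[
            RecommenderOption("personalised",
                              ["past interactions", "followed accounts"],
                              "predicts relevance from the recipient's activity"),
            RecommenderOption("chronological", ["recency"],
                              "orders content by publication time only"),
        ],
        selected="personalised",
    )
    disclosure.select("chronological")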

Online protection of minors (Article 28)

Providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service (Article 28(1)).

Providers of online platforms must not present advertisements on their interface, based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 (profiling) using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor (Article 28(2)).

Compliance with the obligations set out in this Article must not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor (Article 28(3)). The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of Article 28(1) (Article 28(4)).

Traceability of traders (Article 30)

Providers of online platforms allowing consumers to conclude distance contracts with traders must ensure that traders can only use those online platforms to promote messages on, or to offer products or services to, consumers located in the Union if, prior to the use of their services for those purposes, they have obtained specified information from the trader (not reproduced here) (Article 30(1)).

The Article 30 provision (Article 30(1) to 30(7) inclusive) (not reproduced here) requires detailed review as it impacts the functionality of the online platform, and the terms of business. The provision is in line with the approach taken by the EU Commission in the Market Surveillance Regulation.

Compliance by design (Article 31)

Providers of online platforms allowing consumers to conclude distance contracts with traders must:

  • ensure that its online interface is designed and organised in a way that enables traders to comply with their obligations regarding pre-contractual information, compliance and product safety information under applicable Union law;
  • ensure that its online interface enables traders to provide information on the name, address, telephone number and email address of the economic operator, as defined in Article 3, point (13), of Regulation (EU) 2019/1020 and other Union law (Article 31(1)).
  • ensure that its online interface is designed and organised in a way that it allows traders to provide at least the following: (a) the information necessary for the clear and unambiguous identification of the products or the services promoted or offered to consumers located in the Union through the services of the providers; (b) any sign identifying the trader such as the trademark, symbol or logo; and, (c) where applicable, the information concerning the labelling and marking in compliance with rules of applicable Union law on product safety and product compliance (Article 31(2)).
  • make best efforts to assess whether such traders have provided the information referred to in Articles 31(1) and 31(2) prior to allowing them to offer their products or services on those platforms (Article 31(3)).

After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider must make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal (Article 31(3)).

Right to information (Article 32)

Where a provider of an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider must inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following: (a) the fact that the product or service is illegal; (b) the identity of the trader; and (c) any relevant means of redress.

The obligation laid down in the first subparagraph will be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality (Article 32(1)).

Where, under Article 32(1), the provider of the online platform allowing consumers to conclude distance contracts with traders, does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress (Article 32(2)).
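
The six-month look-back lends itself to a simple worked example. In the Python sketch below (dates, order numbers and the helper are all invented, and the month arithmetic is simplified), only purchases made within the six months preceding the provider's awareness trigger the duty to inform:

    # Sketch of the Article 32(1) look-back window.
    from datetime import date

    def months_back(d: date, months: int) -> date:
        # shift a date back by whole months (day clamping not handled)
        y, m = divmod(d.month - 1 - months, 12)
        return d.replace(year=d.year + y, month=m + 1)

    awareness = date(2024, 9, 15)              # provider becomes aware
    cutoff = months_back(awareness, 6)         # 2024-03-15
    purchases = {"order-1": date(2024, 2, 1),  # outside the window
                 "order-2": date(2024, 6, 20)} # inside the window
    to_inform = [order for order, d in purchases.items() if d >= cutoff]
    print(to_inform)                           # ['order-2']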

What constitutes a Very Large Online Platform (VLOP) or Very Large Online Search Engine (VLOSE) (Article 33)

Section 5 of Chapter III of the DSA applies to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as VLOPs or VLOSEs under Article 33(4) (Article 33(1)).

Adjusting the number of active recipients

The Commission must adopt delegated acts to adjust the number of average monthly active recipients of the service in the Union where the Union's population increases or decreases at least by 5% in relation to its population in 2020, or its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it must adjust the number so that it corresponds to 10% of the Union's population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions (Article 33(2)).
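
To make the arithmetic concrete: the DSA's recitals record that the 45 million figure in Article 33(1) is equivalent to 10% of the Union's population, and Article 33(2) keeps any adjusted figure at 10% of the population, rounded to the nearest million. A worked sketch (the 2020 EU population figure used, roughly 447 million, is approximate):

    # Worked sketch of the Article 33(2) adjustment rule.
    def vlop_threshold(union_population: int) -> int:
        # 10% of the Union population, rounded to the nearest million
        return round(union_population * 0.10 / 1_000_000) * 1_000_000

    print(vlop_threshold(447_000_000))  # 45000000, i.e. the 45 million in Article 33(1)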

Supplementing the provisions of the DSA

The Commission may adopt delegated acts, after consulting the Board (European Board for Digital Services), to supplement the provisions of the DSA by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union (Article 33(3)).

Designating VLOPs and VLOSEs

Providers of VLOPs or VLOSEs are designated by the Commission using the procedure in Article 33(4) (not reproduced here).

The Commission must terminate the designation if, during an uninterrupted period of one year, the online platform or the online search engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in Article 33(1) (Article 33(5)).

VLOPs and VLOSEs: specific obligations (Articles 14(5) and (6), and 34 to 37)

Providers of VLOPs and VLOSEs must:

  • provide recipients of services with a concise, easily accessible and machine-readable summary of the terms and conditions, including the available remedies and redress mechanisms, in clear and unambiguous language, and must publish their terms and conditions in the official languages of all the Member States in which they offer their services (Article 14(5) and (6)).

Providers of VLOPs and VLOSEs must:

  • diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services. In broad terms, the risk assessment must be carried out at least once a year and, in any event, prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to Article 34.

This risk assessment must be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and must include the following systemic risks: (a) the dissemination of illegal content; (b) any actual or foreseeable negative effects for the exercise of fundamental rights; (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security; (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person's physical and mental well-being (Article 34(1)).

VLOPs and VLOSEs: Mitigation of risks (Article 35)

Providers of VLOPs and VLOSEs must put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. The measures that may be taken are set out in detail in Article 35.

VLOPs and VLOSEs: Crisis response mechanism (Article 36)

Where extraordinary circumstances lead to a serious threat to public security or public health in the EU or significant parts of it (a crisis), the Commission, acting upon a recommendation of the Board, may adopt a decision requiring one or more providers of VLOPs or VLOSEs to take one or more of the following actions:

  1. assess whether, and if so to what extent and how, the functioning and use of their services significantly contribute to a serious threat as referred to in Article 36(2) (i.e. the crisis as defined), or are likely to do so;

  2. identify and apply specific, effective and proportionate measures, such as any of those provided for in Article 35(1) or Article 48(2), to prevent, eliminate or limit any such contribution to the serious threat identified pursuant to point (a) of this paragraph;

  3. report to the Commission by a certain date or at regular intervals specified in the decision, on the assessments referred to in point (a), on the precise content, implementation and qualitative and quantitative impact of the specific measures taken pursuant to point (b) and on any other issue related to those assessments or those measures, as specified in the decision (Article 36(1)).

When taking the decision referred to in Article 36(1), the Commission must, amongst other things, allow a reasonable time for the actions to be taken and must limit the period in which they must be taken to three months (extendable by a further three months). The Commission must publish any notices it issues requiring such steps to be taken.

VLOPs and VLOSEs: Independent audit (Article 37)

Providers of VLOPs and VLOSEs will be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: (a) the obligations set out in Chapter III; (b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48 (Article 37(1)) (these Articles are not reproduced in this article).

Article 37(2) requires the organisation undertaking the audit to be afforded the necessary cooperation and assistance by the VLOPs and VLOSEs.   

Article 37(3) sets out the requirements for the organisation performing the audit.   

Providers of VLOPs and VLOSEs must ensure that the organisations that perform the audits establish an audit report for each audit. The report must contain the information required by Article 37(4).

Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited (Article 37(5)).

Providers of VLOPs or VLOSEs receiving an audit report that is not 'positive' must take due account of the operational recommendations addressed to them with a view to taking the necessary measures to implement them. They must, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they must justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified (Article 37(6)).

The Commission is empowered to adopt delegated acts (Article 87) to supplement the DSA by laying down the necessary rules for the performance of the audits pursuant to Article 37 (not summarised here), in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to that Article. Those delegated acts must take into account any voluntary auditing standards referred to in Article 44(1), point (e) (Article 37(7)).

Additional online advertising transparency (Article 39)

Providers of VLOPs or VLOSEs that present advertisements on their online interfaces must:

  • compile and make publicly available in a specific section of their online interface, through a searchable and reliable tool that allows multi-criteria queries and through application programming interfaces, a repository containing the information referred to in Article 39(2), for the entire period during which they present an advertisement and until one year after the advertisement was presented for the last time on their online interfaces. They must ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been presented, and must make reasonable efforts to ensure that the information is accurate and complete (Article 39(1)).

The repository must include at least all of the following information:

  • (a) the content of the advertisement, including the name of the product, service or brand and the subject matter of the advertisement;

  • (b) the natural or legal person on whose behalf the advertisement is presented;

  • (c) the natural or legal person who paid for the advertisement, if that person is different from the person referred to in point (b);

  • (d) the period during which the advertisement was presented;

  • (e) whether the advertisement was intended to be presented specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose including where applicable the main parameters used to exclude one or more of such particular groups;

  • (f) the commercial communications published on the VLOPs identified pursuant to Article 26(2);

  • (g) the total number of recipients of the service reached and, where applicable, aggregate numbers broken down by Member State for the group or groups of recipients that the advertisement specifically targeted (Article 39(2)). See additionally Article 39(3).
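
By way of illustration only, a repository record carrying the Article 39(2) minimum fields, and the kind of multi-criteria query the searchable tool must support, might look as follows in Python (all names and values are invented; consistent with Article 39(1), no personal data of recipients is stored):

    # Hypothetical Article 39 ad-repository record and multi-criteria query.
    from dataclasses import dataclass

    @dataclass
    class AdRecord:
        content: str                           # (a) ad content, product/brand, subject
        presented_on_behalf_of: str            # (b)
        paid_for_by: str                       # (c)
        period: tuple[str, str]                # (d) first and last presentation dates
        targeting_parameters: list[str]        # (e) main targeting/exclusion parameters
        commercial_communication: bool         # (f) flagged under Article 26(2)
        total_reach: int                       # (g) recipients reached in total
        reach_by_member_state: dict[str, int]  # (g) aggregate, where applicable

    repository = [
        AdRecord("Brand X spring sale", "Brand X GmbH", "Agency Y Ltd",
                 ("2024-03-01", "2024-04-15"),
                 ["age 18-35", "interest: sports"],
                 False, 2_400_000, {"DE": 1_100_000, "FR": 800_000}),
    ]

    # multi-criteria query: ads paid for by Agency Y that targeted sports interests
    hits = [r for r in repository
            if r.paid_for_by == "Agency Y Ltd"
            and any("sports" in p for p in r.targeting_parameters)]
    print(len(hits))  # 1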

Recommender systems VLOP or VLOSE (Article 38)

In addition to Article 27, providers of VLOPs or VLOSEs that use recommender systems must provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 (GDPR).

Data access and scrutiny VLOPs or VLOSEs (Article 40)

Providers of VLOPs or VLOSEs must provide the Digital Services Coordinator of establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with the DSA (Article 40(1)).

For the purposes of Article 40(1), providers of VLOPs or VLOSEs must, at the request of either the Digital Services Coordinator of establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems (Article 40(3)).

Data access and scrutiny VLOPs or VLOSEs (Article 40) (continued)

Upon a reasoned request from the Digital Services Coordinator of establishment, providers of VLOPs or VLOSEs must within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in Article 40(8), for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35 (not reproduced here) (Article 40(4)).

What are Vetted Researchers (Article 40(8))

Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment must grant such researchers the status of 'vetted researchers' for the specific research referred to in the application and issue a reasoned request for data access to a provider of VLOPs or of VLOSEs pursuant to Article 40(4), where the researchers demonstrate that they meet all of the following conditions:

  • (a) they are affiliated to a research organisation as defined in Article 2, point (1), of Directive (EU) 2019/790;

  • (b) they are independent from commercial interests;

  • (c) their application discloses the funding of the research;

  • (d) they are capable of fulfilling the specific data security and confidentiality requirements corresponding to each request and to protect personal data, and they describe in their request the appropriate technical and organisational measures that they have put in place to this end;

  • (e) their application demonstrates that their access to the data and the time frames requested are necessary for, and proportionate to, the purposes of their research, and that the expected results of that research will contribute to the purposes laid down in Article 40(4);

  • (f) the planned research activities will be carried out for the purposes laid down in Article 40(4); and

  • (g) they have committed themselves to making their research results publicly available free of charge, within a reasonable period after the completion of the research, subject to the rights and interests of the recipients of the service concerned, in accordance with Regulation (EU) 2016/679. Upon receipt of the application pursuant to this paragraph, the Digital Services Coordinator of establishment shall inform the Commission and the Board (Article 40(8)).

VLOPs and VLOSEs: Compliance (Article 41)

Providers of VLOPs and VLOSEs must establish a compliance function. Amongst other requirements, the head of the compliance function must report directly to the management body of the provider of the VLOP or VLOSE, as the case may be.

VLOPs and VLOSEs: Transparency reporting obligations (Article 42)

Providers of VLOPs and VLOSEs must publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months (Article 42(1)). The reports must, in addition to the information referred to in Article 15 and Article 24(1), specify:

  1. the human resources that the provider of very large online platforms dedicates to content moderation in respect of the service offered in the Union, broken down by each applicable official language of the Member States, including for compliance with the obligations set out in Articles 16 and 22, as well as for compliance with the obligations set out in Article 20;

  2. the qualifications and linguistic expertise of the persons carrying out the activities referred to in point (a), as well as the training and support given to such staff;

  3. the indicators of accuracy and related information referred to in Article 15(1), point (e), broken down by each official language of the Member States (Article 42(2)).

The reports must be published in at least one of the official languages of the Member States.

In addition to the information referred to in Article 24(2), providers of VLOPs or of VLOSEs must include in the reports referred to in Article 42(1) information on the average monthly recipients of the service for each Member State.

Conditional exemptions for providers of information society services (Articles 4 to 10)

The conditions under which providers of information society services providing mere conduit (Article 4), caching (automatic, intermediate and temporary storage) (Article 5) and hosting (Article 6) services are exempt from liability for third-party information are set out in those Articles. The specific wording of each Article (and of Articles 7 to 10 below) requires detailed review.

Voluntary own-initiative investigations and legal compliance (Article 7)

Illegal content as used in the DSA means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law (Article 3(h)). Further detail on what constitutes illegal content is provided elsewhere in the DSA.

Providers of intermediary services will not be deemed ineligible for the exemptions from liability referred to in Articles 4, 5 and 6 solely because they, in good faith and in a diligent manner, carry out voluntary own-initiative investigations into, or take other measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union law and national law in compliance with Union law, including the requirements set out in the DSA.

No general monitoring or active fact-finding obligations (Article 8)

No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.

Orders to act against illegal content (Article 9)

Upon the receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities, providers of intermediary services must inform the authority issuing the order, or any other authority specified in the order, of any effect given to the order without undue delay, specifying if and when effect was given to the order.

Orders to provide information (Article 10)

Upon receipt of an order to provide specific information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities, providers of intermediary services must, without undue delay inform the authority issuing the order, or any other authority specified in the order, of its receipt and of the effect given to the order, specifying if and when effect was given to the order.

The DSA and Public bodies

Digital Services Coordinators (Art 49)

The Digital Services Coordinator is the entity responsible for all matters relating to supervision and enforcement of the DSA in a Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities (Art 49 extract).

Member States must designate one or more competent authorities to be responsible for the supervision of providers of intermediary services and enforcement of the DSA ('competent authorities') and, by 17 February 2024, designate one of them as their Digital Services Coordinator (Art 49 extract).

The provisions applicable to Digital Services Coordinators set out in Articles 50, 51 and 56 will also apply to any other competent authorities that the Member States designate under Article 49(1) (Art 49).

Briefly, the role of Digital Services Coordinators is specified in Article 50 (Requirements for Digital Services Coordinators); Article 51 (Powers of Digital Services Coordinators); Article 55 (Activity reports); Article 56 (Division of competences); Article 58 (Cross-border cooperation among Digital Services Coordinators); and Article 59 (Referral to the Commission). However, there are additional obligations for Digital Services Coordinators throughout the DSA.

Division of competences: Member State and Commission (Art 56)

The Member State in which the main establishment of the provider of intermediary services is located will have exclusive powers to supervise and enforce the DSA except for the powers provided for in Art 56(2), Art 56(3) and Art 56(4) (Article 56(1)).

The Commission will have exclusive powers to supervise and enforce Section 5 of Chapter III (Additional obligations for very large online platforms and search engines to manage systemic risks) (Article 56(2)).

The Commission will have powers to supervise and enforce obligations under the DSA other than those laid down in Section 5 of Chapter III (Additional obligations for very large online platforms and search engines to manage systemic risks) against providers of VLOPs and of VLOSEs (Article 56(3)).

Where the Commission has not initiated proceedings for the same infringement, the Member State in which the main establishment of the provider of a VLOP or of a VLOSE is located will have powers to supervise and enforce the obligations under the DSA, other than those laid down in Section 5 of Chapter III, with respect to those providers (Article 56(4)).

Member States and the Commission must supervise and enforce the provisions of the DSA in close cooperation (Article 56(5)).

Where a provider of intermediary services does not have an establishment in the Union, the Member State where its legal representative resides or is established or the Commission, will have powers, as applicable, in accordance with Article 56(1) and Article 56(4), to supervise and enforce the relevant obligations under the DSA (Article 56(6)).

European Board for Digital Services (Board) (Art 61).

The Board is an independent advisory group on the supervision of providers of intermediary services and is established by Article 61(1). Amongst other obligations, it assists the Digital Services Coordinators and the Commission in the supervision of VLOPs (Article 61(2)).

Its structure is regulated by Article 62 and its tasks are set out in Article 63. Amongst the most important tasks in Article 63 are:

  1. support the competent authorities in the analysis of reports and results of audits of VLOPs and VLOSEs to be transmitted under the DSA;

  2. issue opinions, recommendations or advice to Digital Services Coordinators taking into account, in particular, the freedom to provide services of the providers of intermediary services;

  3. advise the Commission on the measures referred to in Article 66 and, adopt opinions concerning VLOPs and VLOSEs in accordance with the DSA.

Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board, must provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting under the DSA or when adopting their relevant decisions, as appropriate.

Digital Services Committee (committee) (Art 88).

This committee is a committee within the meaning of Regulation (EU) 182/2011 and assists the Commission.

Out-of-court dispute settlement (Article 21)

Recipients of the service (including individuals or entities that have submitted notices) addressed by the decisions referred to in Article 20(1) (complaints) are entitled to select any out-of-court dispute settlement body that has been certified by the Digital Services Coordinator in order to resolve disputes relating to those decisions (Article 21(1)).

Providers of online platforms must ensure that information in relation to access to the Article 21(1) out-of-court dispute settlement, is easily accessible on their online interface, clear and user-friendly.

The Article 21(1) right is without prejudice to the right of the recipient of the service to initiate, at any stage, proceedings to contest those decisions by the providers of online platforms before a court in accordance with the applicable law.

Both parties must engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute.

Providers of online platforms may refuse to engage with such out-of-court dispute settlement body if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content. The certified out-of-court dispute settlement body must not have the power to impose a binding settlement of the dispute on the parties.

Article 21 sets out the role of the Digital Services Coordinator (DSC), including (i) the requirement for the DSC to certify an out-of-court dispute settlement body where the body meets all the Article 21(3) conditions; (ii) what the DSC must certify, where applicable; (iii) when the DSC must revoke the certification; (iv) the requirement for the DSC, every two years, to draw up a report (the content determined by Article 21(4)) on the functioning of the out-of-court dispute settlement bodies that they have certified; and (v) the requirement to notify to the Commission the out-of-court dispute settlement bodies that they have certified under Article 21(3).

Member States may establish out-of-court dispute settlement bodies for the purposes of Article 21(1) or support the activities of some or all out-of-court dispute settlement bodies that they have certified under Article 21(3).

Article 21 is without prejudice to Directive 2013/11/EU on alternative dispute resolution procedures and entities for consumers established under that Directive (Article 21(9)).

Entry into force and application (Article 93)

The DSA entered into force on the twentieth day following that of its publication in the Official Journal of the European Union.

The DSA will apply from 17 February 2024.

However, Article 24(2), (3) and (6) (Transparency reporting obligations for providers of online platforms), Article 33(3) to (6) (Very large online platforms and very large online search engines), Article 37(7) (Independent audit), Article 40(13) (Data access and scrutiny), Article 43 (Supervisory fee) and Sections 4, 5 and 6 of Chapter IV apply from 16 November 2022.

Chapter IV relates to Implementation, Cooperation, Penalties and Enforcement.

Section 4 of Chapter IV relates to supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines.

Section 5 of Chapter IV relates to common provisions on enforcement.

Section 6 of Chapter IV relates to delegated and implementing acts.

Sections 4, 5 and 6 of Chapter IV comprise Articles 64 to 88 of the DSA.


Copyright © Paul Foley January 2023 - All Rights Reserved.

Owner, Paul Foley Law

For legal advice on and compliance with the Platforms Regulation, the Ranking Guidelines, the DSA, DMA and the EU Artificial Intelligence Act, please use the Contact page or Email: paul@paulfoleylaw.ie
