Digital Services Act (DSA)
The DSA applies to intermediary services offered to recipients of the service that have their place of establishment or are located in the Union, irrespective of where the providers of those intermediary services have their place of establishment.
The DSA lays down harmonised rules on the provision of intermediary services, sets conditional exemptions from liability for providers of information society services (which are dealt with at the end of this article) and imposes due diligence obligations on providers of intermediary services. The obligations differ depending on the type of intermediary service that is provided; hence the definitions are important. Finally, the DSA establishes rules on the implementation and enforcement of the DSA, including as regards the cooperation of and coordination between the competent authorities.
In the DSA:
an intermediary service means one of the following information society services: (i) a mere conduit service; (ii) a caching service; (iii) a hosting service (each as defined in Article 3).
an online platform is a hosting service that stores and disseminates information to the public at the request of a recipient of the service (subject to the exceptions in Article 3(i)). An online search engine is an intermediary service.
recipient of the service means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible.
Providers of intermediary services must (extracts only):
Providers of intermediary services must make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports must include, in particular, information on the following, as applicable:
Article 15(1) will not apply to providers of intermediary services that qualify as micro or small enterprises as defined in Recommendation 2003/361/EC and which are not very large online platforms within the meaning of Article 33 of the DSA (Article 15(2)).
The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports under Article 15(1), including harmonised reporting periods. Those implementing acts must be adopted in accordance with the advisory procedure referred to in Article 88 (Article 15(3)).
Providers of hosting services must:
Providers of online platforms must provide recipients of the service, including individuals or entities that have submitted a notice (under Article 16), with access to a complaints handling system against decisions (see just below) taken by the provider in relation to content, on the grounds that the information provided by the recipients is illegal or in breach of the provider's terms and conditions:
The access period must be for at least six months following any decision referred to just above. The complaints handling system must, amongst other requirements, be easy to use (Article 20(3)). Article 20(4) sets out when a provider must reverse its decision taken under Article 20(1).
The period of six months starts on the date the recipient of the service is informed about the decision in accordance with Article 16(5) or Article 17.
Providers must inform complainants without undue delay of their reasoned decision in respect of the information to which the complaint relates, of the possibility of out-of-court dispute settlement provided for in Article 21 (see the end of this article) and of other available possibilities for redress.
Providers of online platforms must also:
The prohibition in Article 25(1) will not apply to practices covered by Directive 2005/29/EC or Regulation (EU) 2016/679 (GDPR) (Article 25(2)). The Commission may issue guidelines on how Article 25(1) applies to specific practices (Article 25(3)).
The status of ‘trusted flagger’ under the DSA will be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:
Trusted flaggers must publish, at least once a year, easily comprehensible and detailed reports on notices submitted in accordance with Article 16 (Notice and Action Mechanisms) during the relevant period. The report must list at least the number of notices categorised by: (a) the identity of the provider of hosting services; (b) the type of allegedly illegal content notified; (c) the action taken by the provider. Those reports must include an explanation of the procedures in place to ensure that the trusted flagger retains its independence. Trusted flaggers must send those reports to the awarding Digital Services Coordinator and must make them publicly available. The information in those reports must not contain personal data (Article 22(3)).
In addition to the information referred to in Article 15, providers of online platforms must include in the reports referred to in that Article information on the following:
By 17 February 2023 and at least once every six months thereafter, providers must publish for each online platform or online search engine, in a publicly available section of their online interface, information on the average monthly active recipients of the service in the Union, calculated as an average over the period of the past six months and in accordance with the methodology laid down in the delegated acts referred to in Article 33(3), where those delegated acts have been adopted (Article 24(2)).
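By way of illustration only, the six-month averaging in Article 24(2) could be sketched as below. This is a naive sketch: the actual calculation methodology is to be laid down in the delegated acts referred to in Article 33(3), and the monthly figures used here are invented placeholders.

```python
def average_monthly_recipients(monthly_counts: list[int]) -> float:
    """Naive average of monthly active recipient counts over the past
    six months. The methodology in the Article 33(3) delegated acts,
    once adopted, may differ (e.g. in how an 'active recipient' is
    counted and de-duplicated)."""
    if len(monthly_counts) != 6:
        raise ValueError("expected counts for the past six months")
    return sum(monthly_counts) / 6

# Hypothetical monthly figures, in millions of active recipients
print(average_monthly_recipients([40, 42, 44, 46, 48, 50]))  # 45.0
```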
Providers of online platforms or of online search engines must communicate to the Digital Services Coordinator of establishment and the Commission, upon their request and without undue delay, the information referred to in Article 24(2) updated to the moment of such request. That Digital Services Coordinator or the Commission may require the provider of the online platform or of the online search engine to provide additional information as regards the calculation referred to in that paragraph, including explanations and substantiation in respect of the data used. That information must not include personal data (Article 24(3)).
When the Digital Services Coordinator of establishment has reasons to consider, based on the information received pursuant to Article 24(2) and (3), that a provider of online platforms or of online search engines meets the threshold of average monthly active recipients of the service in the Union laid down in Article 33(1), it must inform the Commission thereof (Article 24(4)).
Providers of online platforms must, without undue delay, submit to the Commission the decisions and the statements of reasons referred to in Article 17(1) for the inclusion in a publicly accessible machine-readable database managed by the Commission. Providers of online platforms must ensure that the information submitted does not contain personal data (Article 24(5)).
The Commission may adopt implementing acts to lay down templates concerning the form, content and other details of reports pursuant to Article 24(1). Those implementing acts must be adopted in accordance with the advisory procedure referred to in Article 88 (Article 24(6)).
Providers of online platforms must:
Providers of online platforms that present advertisements on their online interfaces must:
Profiling means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. (Article 4, point (4), of Regulation (EU) 2016/679).
Recommender system means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed (Article 3(s)).
Providers of online platforms that use recommender systems must set out in their terms and conditions, in plain and intelligible language, the main parameters used in their recommender systems, as well as any options for the recipients of the service to modify or influence those main parameters (Article 27(1)).
The main parameters referred to in Article 27(1) must explain why certain information is suggested to the recipient of the service. They shall include, at least: (a) the criteria which are most significant in determining the information suggested to the recipient of the service; (b) the reasons for the relative importance of those parameters (Article 27(2)).
Where several options are available pursuant to Article 27(1) for recommender systems that determine the relative order of information presented to recipients of the service, providers of online platforms must also make available a functionality that allows the recipient of the service to select and to modify at any time their preferred option. That functionality must be directly and easily accessible from the specific section of the online platform’s online interface where the information is being prioritised (Article 27(3)).
Providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service (Article 28(1)).
Providers of online platforms must not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 (profiling) using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor (Article 28(2)).
Compliance with the obligations set out in this Article must not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor (Article 28(3)). The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of Article 28(1) (Article 28(4)).
Providers of online platforms allowing consumers to conclude distance contracts with traders must ensure that traders can only use those online platforms to promote messages on or to offer products or services to consumers located in the Union if, prior to the use of their services for those purposes, they have obtained specified information from the trader (not reproduced here).
The Article 30 provision (Article 30(1) to 30(7) inclusive) (not reproduced here) requires detailed review as it impacts the functionality of the online platform, and the terms of business. The provision is in line with the approach taken by the EU Commission in the Market Surveillance Regulation.
Providers of online platforms allowing consumers to conclude distance contracts with traders must:
After allowing the trader to offer products or services on its online platform that allows consumers to conclude distance contracts with traders, the provider must make reasonable efforts to randomly check in any official, freely accessible and machine-readable online database or online interface whether the products or services offered have been identified as illegal (Article 31(3)).
Where a provider of an online platform allowing consumers to conclude distance contracts with traders, becomes aware, irrespective of the means used, that an illegal product or service has been offered by a trader to consumers located in the Union through its services, that provider must inform, insofar as it has their contact details, consumers who purchased the illegal product or service through its services of the following: (a) the fact that the product or service is illegal; (b) the identity of the trader; and (c) any relevant means of redress.
The obligation laid down in the first subparagraph will be limited to purchases of illegal products or services made within the six months preceding the moment that the provider became aware of the illegality (Article 32(1)).
Where, under Article 32(1), the provider of the online platform allowing consumers to conclude distance contracts with traders, does not have the contact details of all consumers concerned, that provider shall make publicly available and easily accessible on its online interface the information concerning the illegal product or service, the identity of the trader and any relevant means of redress (Article 32(2)).
Section 5 of Chapter III of the DSA applies to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as VLOPs or VLOSEs under Article 33(4) (Article 33(1)).
The Commission must adopt delegated acts to adjust the number of average monthly active recipients of the service in the Union where the Union’s population increases or decreases by at least 5% in relation to its population in 2020, or in relation to its population after adjustment by means of a delegated act in the year in which the latest delegated act was adopted. In such a case, it must adjust the number so that it corresponds to 10% of the Union’s population in the year in which it adopts the delegated act, rounded up or down to allow the number to be expressed in millions (Article 33(2)).
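The Article 33(2) adjustment mechanism (10% of the Union's population, rounded to allow expression in millions) can be sketched as below. This is an illustration only: the population figure used is a hypothetical placeholder, not official data, and the DSA itself does not prescribe any particular rounding implementation.

```python
def adjusted_threshold(union_population: int) -> int:
    """Sketch of the Article 33(2) mechanism: 10% of the Union's
    population, rounded up or down so the result is a whole number
    of millions. The input figure is supplied by the caller; the DSA
    leaves the population source to the Commission."""
    tenth = union_population / 10
    return round(tenth / 1_000_000) * 1_000_000

# Hypothetical population of 447 million: 10% is 44.7 million,
# which rounds to the 45 million threshold in Article 33(1).
print(adjusted_threshold(447_000_000))  # 45000000
```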
The Commission may adopt delegated acts, after consulting the Board (European Board for Digital Services), to supplement the provisions of the DSA by laying down the methodology for calculating the number of average monthly active recipients of the service in the Union (Article 33(3)).
Providers of VLOPs or VLOSEs are designated by the Commission using the procedure in Article 33(4) (not reproduced here).
The Commission must terminate the designation if, during an uninterrupted period of one year, the online platform or the online search engine does not have a number of average monthly active recipients of the service equal to or higher than the number referred to in Article 33(1) (Article 33(5)).
Providers of VLOPs and VLOSEs must:
The risk assessment must be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and must include the following systemic risks: (a) the dissemination of illegal content; (b) any actual or foreseeable negative effects for the exercise of fundamental rights; (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security; (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being (Article 34(1)).
Providers of VLOPs and VLOSEs must put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. The measures that may be included are set out in detail in Article 35.
Where extraordinary circumstances lead to a serious threat to public security or public health in the EU or significant parts of it (a crisis), the Commission, acting upon a recommendation of the Board, may adopt a decision requiring one or more providers of VLOPs or VLOSEs to take one or more of the following actions:
When taking the decision referred to in Article 36(1), the Commission must, amongst other things, allow a reasonable time for the actions to be taken and must limit the period in which they must be taken to three months (extendable by a further three months). The Commission must publish any notices it issues that require such steps to be taken.
Providers of VLOPs and VLOSEs will be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following: (a) the obligations set out in Chapter III; (b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 45 and 46 and the crisis protocols referred to in Article 48 (Article 37(1)) (these Articles are not reproduced in this Article).
Article 37(2) requires the organisation undertaking the audit to be afforded the necessary cooperation and assistance by the VLOPs and VLOSEs.
Article 37(3) sets out the requirements for the organisation performing the audit.
Providers of VLOPs and VLOSEs must ensure that the organisations that perform the audits establish an audit report for each audit. The report must contain the information required by Article 37(4).
Where the organisation performing the audit was unable to audit certain specific elements or to express an audit opinion based on its investigations, the audit report shall include an explanation of the circumstances and the reasons why those elements could not be audited (Article 37(5)).
Providers of VLOPs or VLOSEs receiving an audit report that is not ‘positive’ must take due account of the operational recommendations addressed to them with a view to take the necessary measures to implement them. They must, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they must justify in the audit implementation report the reasons for not doing so and set out any alternative measures that they have taken to address any instances of non-compliance identified (Article 37(6)).
The Commission is empowered to adopt delegated acts (Article 87) to supplement the DSA by laying down the necessary rules for the performance of the audits pursuant to Article 37 (not summarised here), in particular as regards the necessary rules on the procedural steps, auditing methodologies and reporting templates for the audits performed pursuant to that Article. Those delegated acts must take into account any voluntary auditing standards referred to in Article 44(1), point (e) (Article 37(7)).
Providers of VLOPs or VLOSEs that present advertisements on their online interfaces must:
The repository must include at least all of the following information:
In addition to Article 27, providers of VLOPs or VLOSEs that use recommender systems must provide at least one option for each of their recommender systems which is not based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 (GDPR).
Providers of VLOPs or VLOSEs must provide the Digital Services Coordinator of establishment or the Commission, at their reasoned request and within a reasonable period specified in that request, access to data that are necessary to monitor and assess compliance with the DSA (Article 40(1)).
For the purposes of Article 40(1), providers of VLOPs or VLOSEs must, at the request of either the Digital Service Coordinator of establishment or of the Commission, explain the design, the logic, the functioning and the testing of their algorithmic systems, including their recommender systems (Article 40(3)).
Upon a reasoned request from the Digital Services Coordinator of establishment, providers of VLOPs or VLOSEs must within a reasonable period, as specified in the request, provide access to data to vetted researchers who meet the requirements in Article 40(8), for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union, as set out pursuant to Article 34(1), and to the assessment of the adequacy, efficiency and impacts of the risk mitigation measures pursuant to Article 35 (not reproduced here) (Article 40(4)).
Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment must grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of VLOPs or of VLOSEs pursuant to paragraph 4 (Article 40(4)), where the researchers demonstrate that they meet all of the following conditions (cont below):
Providers of VLOPs and VLOSEs must establish a compliance function (Article 41). Amongst other requirements, the head of the compliance function must report directly to the management body of the provider of the VLOP or VLOSE as the case may be.
Providers of VLOPs and VLOSEs must publish the reports referred to in Article 15 at the latest by two months from the date of application referred to in Article 33(6), second subparagraph, and thereafter at least every six months (Article 42(1)). The reports must, in addition to the information referred to in Article 15 and Article 24(1), specify:
The reports must be published in at least one of the official languages of the Member States.
In addition to the information referred to in Article 24(2), the providers of VLOPs or of VLOSEs must include in the reports referred to in Article 42(1) the information on the average monthly recipients of the service for each Member State.
The conditions under which providers of information society services providing mere conduit (Article 4), caching (automatic, intermediate and temporary storage) (Article 5) and hosting services (Article 6) are exempt from liability for third-party information are set out in those Articles. The specific wording of each Article, and of Articles 7 to 10 (below), requires detailed review.
Illegal content as used in the DSA means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law. Further detail on what constitutes illegal content is provided elsewhere in the DSA.
Providers of intermediary services will not be deemed ineligible for the exemptions from liability referred to in Articles 4, 5 and 6 solely because they, in good faith and in a diligent manner, carry out voluntary own-initiative investigations into, or take other measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of Union law and national law in compliance with Union law, including the requirements set out in the DSA.
No general obligation to monitor the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity shall be imposed on those providers.
Upon the receipt of an order to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities, providers of intermediary services must inform the authority issuing the order, or any other authority specified in the order, of any effect given to the order without undue delay, specifying if and when effect was given to the order.
Upon receipt of an order to provide specific information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities, providers of intermediary services must, without undue delay inform the authority issuing the order, or any other authority specified in the order, of its receipt and of the effect given to the order, specifying if and when effect was given to the order.
The Digital Services Coordinator is the entity responsible for all matters relating to supervision and enforcement of the DSA in a Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities (Art 49 extract).
Member States must designate one or more competent authorities to be responsible for the supervision of providers of intermediary services and enforcement of the DSA (‘competent authorities’) and, by 17 February 2024, designate one of them as their Digital Services Coordinator (Art 49 extract).
The provisions applicable to Digital Services Coordinators set out in Articles 50, 51 and 56 will also apply to any other competent authorities that the Member States designate under Article 49(1) (Art 49).
Briefly, the role of Digital Services Coordinators is specified in Article 50 (Requirements for Digital Services Coordinators); Article 51 (Powers of Digital Services Coordinators); Article 55 (Activity reports); Article 56 (Division of Competencies); Article 58 (Cross-border cooperation among Digital Services Coordinators); and Article 59 (Referral to the Commission). However, there are additional obligations for Digital Services Coordinators contained throughout the DSA.
The Member State in which the main establishment of the provider of intermediary services is located will have exclusive powers to supervise and enforce the DSA except for the powers provided for in Art 56(2), Art 56(3) and Art 56(4) (Article 56(1)).
The Commission will have exclusive powers to supervise and enforce Section 5 of Chapter III (Additional obligations for very large online platforms and search engines to manage systemic risks) (Article 56(2)).
The Commission will have powers to supervise and enforce the obligations under the DSA, other than those laid down in Section 5 of Chapter III thereof (Additional obligations for very large online platforms and search engines to manage systemic risks), against providers of VLOPs and of VLOSEs (Article 56(3)).
Where the Commission has not initiated proceedings for the same infringement, the Member State in which the main establishment of the provider of the VLOP or of the VLOSE is located will have powers to supervise and enforce the obligations under the DSA, other than those laid down in Section 5 of Chapter III, with respect to those providers (Article 56(4)).
Member States and the Commission must supervise and enforce the provisions of the DSA in close cooperation (Article 56(5)).
Where a provider of intermediary services does not have an establishment in the Union, the Member State where its legal representative resides or is established or the Commission, will have powers, as applicable, in accordance with Article 56(1) and Article 56(4), to supervise and enforce the relevant obligations under the DSA (Article 56(6)).
The Board is an independent advisory group on the supervision of providers of intermediary services and is established by Article 61(1). Amongst other obligations, it assists the Digital Services Coordinators and the Commission in the supervision of VLOPs and VLOSEs (Article 61(2)).
Its structure is regulated by Article 62 and its tasks are set out in Article 63. Amongst the most important tasks in Article 63 are:
Digital Services Coordinators and, where applicable, other competent authorities that do not follow the opinions, requests or recommendations addressed to them adopted by the Board, must provide the reasons for this choice, including an explanation on the investigations, actions and the measures that they have implemented, when reporting under the DSA or when adopting their relevant decisions, as appropriate.
This committee (the Digital Services Committee, established under Article 88) is a committee within the meaning of Regulation (EU) No 182/2011 and assists the Commission.
Recipients of the service (including individuals or entities that have submitted notices, and those addressed by the decisions referred to in Article 20(1) (complaints)) are entitled to select any out-of-court dispute settlement body that has been certified by the Digital Services Coordinator in order to resolve disputes referred to in Article 21(1).
Providers of online platforms must ensure that information in relation to access to the Article 21(1) out-of-court dispute settlement, is easily accessible on their online interface, clear and user-friendly.
The Article 21(1) right is without prejudice to the right for the recipient of the service, to initiate, at any stage, proceedings to contest those decisions by the providers of online platforms before a court in accordance with the applicable law.
Both parties must engage, in good faith, with the selected certified out-of-court dispute settlement body with a view to resolving the dispute.
Providers of online platforms may refuse to engage with such out-of-court dispute settlement body if a dispute has already been resolved concerning the same information and the same grounds of alleged illegality or incompatibility of content. The certified out-of-court dispute settlement body must not have the power to impose a binding settlement of the dispute on the parties.
Article 21 sets out the role of the Digital Services Coordinator (DSC), including: (i) the requirement for the DSC to certify an out-of-court dispute settlement body where the body meets all the Article 21(3) conditions; (ii) what the DSC must certify, where applicable; (iii) when the DSC must revoke the certification; (iv) the requirement for the DSC, every two years, to draw up a report (the content determined by Article 21(4)) on the functioning of the out-of-court dispute settlement bodies that they have certified; and (v) the requirement to notify to the Commission the out-of-court dispute settlement bodies that they have certified under Article 21(3).
Member States may establish out-of-court dispute settlement bodies for the purposes of Article 21(1) or support the activities of some or all out-of-court dispute settlement bodies that they have certified under Article 21(3).
Article 21 is without prejudice to Directive 2013/11/EU on alternative dispute resolution procedures and entities for consumers established under that Directive (Article 21(9)).
The DSA entered into force on the twentieth day following that of its publication in the Official Journal of the European Union.
The DSA will apply from 17 February 2024.
However, Article 24(2), (3) and (6) (Transparency reporting obligations for providers of online platforms), Article 33(3) to (6) (Very large online platforms and very large online search engines), Article 37(7) (Independent audit), Article 40(13) (Data access and scrutiny), Article 43 (Supervisory fee) and Sections 4, 5 and 6 of Chapter IV shall apply from 16 November 2022.
Chapter IV relates to Implementation, Cooperation, Penalties and Enforcement.
Section 4 of Chapter IV relates to supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines.
Section 5 of Chapter IV relates to common provisions on enforcement.
Section 6 of Chapter IV relates to delegated and implementing acts.
Sections 4, 5 and 6 of Chapter IV comprise Articles 64 to 88 of the DSA.
Copyright © Paul Foley January 2023 - All Rights Reserved.
Owner, Paul Foley Law
For legal advice on and compliance with the Platforms Regulation, the Ranking Guidelines, the DSA, DMA and the EU Artificial Intelligence Act, please use the Contact page or Email: email@example.com