Informative Note

Guidelines for the protection of minors under the Digital Services Act

12/06/2025

Background

  • On 13 May 2025, the European Commission submitted draft guidelines on the protection of minors online[1] under the Digital Services Act (DSA) [2].
  • These guidelines set out the measures that the Commission believes should be implemented by online platforms accessible to minors (with the exception of micro and small businesses) to ensure the protection, security and privacy of these users.
  • The development of the draft involved several stakeholders, including minors, through the Better Internet for Kids (BIK+) initiative. The Commission also collaborated with the European Board for Digital Services, particularly its Working Group on the Protection of Minors.
RELATIONSHIP WITH THE DSA
  • According to the Commission guidelines (based on DSA Recital 71), a digital platform is considered “accessible to minors” if it meets one of the following criteria: (i) its terms and conditions allow minors to register for or access the service; (ii) the service is directed at minors, or is predominantly used by them; or (iii) the provider knows that some of its users are minors because it collects and processes data (e.g. date of birth) that reveal their age.
  • This classification applies to all online platforms whose services are accessible to minors, including Very Large Online Platforms (VLOPs) and medium-sized enterprises. The only exceptions are micro and small enterprises that have not been designated as VLOPs. Indeed, the draft guidelines make clear that meeting any one of the above criteria is sufficient to trigger the obligation for platforms to adopt the protection measures set out in Article 28(1) of the DSA. The Commission also illustrates each scenario with practical, albeit fictitious, examples. These include pornography sites that do not technically prevent minors from accessing them, platforms that process age data for other purposes, and services known to attract minors.
  • Under DSA Article 28(1), online platforms accessible to minors must implement appropriate and proportionate measures to ensure the privacy, security and protection of these users.
  • In accordance with Article 28(4), the European Commission may issue guidelines to support platforms in implementing the measures referred to in Article 28(1), after consulting the European Board for Digital Services.
DRAFT GUIDELINES

General principles

In accordance with the provisions of Article 28(1) of the DSA, the Commission considers that measures adopted by online platforms accessible to minors should comply with the following general principles:

  • Proportionality – A proportionality assessment should be conducted on a case-by-case basis, considering the specific risks to privacy, security, and the protection of minors posed by the platform, as well as the impact of the measure on the rights enshrined in the Charter of Fundamental Rights of the European Union.
  • Children’s rights – The Commission recommends that measures adopted respect the rights of children enshrined in the United Nations Convention on the Rights of the Child and the Charter of Fundamental Rights of the European Union, in order for them to be considered appropriate and proportionate. These rights include non-discrimination, access to information, and freedom of expression for children.
  • Privacy, safety and security by design – Providers of online platforms accessible to minors should integrate the highest standards of privacy, safety and security into the design, development and operation of their services.
  • Age-appropriate design – Providers of online platforms accessible to minors should design their services to align with the developmental, cognitive and emotional needs of minors.
RISK REVIEW

According to the Commission, providers of online platforms accessible to minors should carry out a risk review to determine: (i) the likelihood of minors accessing the platform; (ii) the risks that the platform poses to minors; (iii) the measures already taken (or to be adopted) to mitigate these risks; and (iv) the impact of these measures on the fundamental rights of minors.

The guidelines provide examples of the measures that can be implemented. These include: (i) age verification mechanisms to reduce the risk of children being exposed to inappropriate content (e.g. pornography or gambling) or grooming practices; (ii) automatic settings on minors’ accounts to minimise the risk of them contacting strangers or disclosing personal data.

CONCRETE MEASURES

1. Age verification

In the Commission’s view, age-based access restrictions are an effective means of ensuring the privacy, security and protection of minors on online platforms. This is particularly true when they are used to prevent access to inappropriate content. These mechanisms are divided into three categories:

  1. Age verification – This can be carried out using physical identifiers or certified sources, such as electronic identification documents issued by Member State authorities under the eIDAS Regulation. Alternatively, until the EU Digital Identity Wallet is fully available, age verification can be carried out using the provisional ‘EU age verification solution’.[3] This mechanism ensures a high degree of certainty in determining the age of the user. It is used to automatically restrict access to highly sensitive content, such as pornography, gambling platforms, and alcohol and tobacco sales, which are reserved for over-18s.
  2. Age estimation – This allows the digital service provider to establish that a user is likely to be of a certain age, to fall within a certain age range, or to be over or under a certain age. It is used to restrict access to content intended exclusively for people aged 18 or over.
  3. Self-declaration of age – The user simply provides information on their age. The Commission considers that this mechanism does not provide adequate age assurance and should therefore be used only alongside one of the other verification methods.

2. Automatic settings for minors’ accounts

According to the Commission, minors’ accounts on online platforms must, automatically and by default, feature settings designed to ensure a high level of privacy, security and protection.

Among these settings, we would highlight the following: (i) restricting interactions to contacts that have been previously accepted by the minor, in order to reduce the risk of them coming into contact with strangers; and (ii) prohibiting the use of filters that could have a harmful effect on their body image, self-esteem and mental health. In addition, minors’ accounts must be private by default, and features such as geolocation, camera access and microphone access must be disabled.

NEXT STEPS

The draft guidelines are currently subject to a public consultation. Stakeholders, including children and young people, parents and guardians, and digital service providers, can submit their comments until 10 June 2025.

Following the public consultation, the Commission intends to publish the guidelines before summer 2025. The aim is to enhance the security of the digital environment for minors, who are increasingly exposed to risks such as cyberbullying and accessing age-inappropriate content.
