EU proposes 16-year minimum age for social media and AI platforms

The European Parliament is calling for stronger online protection for children, proposing an EU-wide minimum age of 16 for accessing social media, video-sharing platforms and AI companions, while allowing 13–16-year-olds to access these services only with parental consent.

The proposal was adopted in a non-legislative report on Wednesday by 483 votes in favour, 92 against and 86 abstentions, citing increasing concerns about the physical and mental health risks minors face online. The report notes that manipulative platform design — including features that promote prolonged scrolling and constant interaction — can interfere with attention, behaviour and wellbeing.

Age verification and platform responsibility

Parliament supports the development of an EU age verification app and the European Digital Identity (eID) wallet, with a requirement that these systems be accurate and privacy-preserving. MEPs underline that age verification does not reduce the responsibility of platforms to ensure services are safe and age-appropriate by design.

To reinforce compliance with the Digital Services Act (DSA), the report suggests that senior managers could be held personally liable in cases of serious and persistent non-compliance, particularly where minors’ safety and age verification requirements are concerned.

Measures targeting addictive and manipulative digital features

The European Parliament is asking the Commission to advance stricter protections for minors, including:

  • A ban on the most harmful addictive practices, and default disabling of other addictive features for minors, such as infinite scrolling, autoplay, pull-to-refresh, reward loops and harmful gamification.
  • Bans on platforms that do not comply with EU child-safety rules.
  • Action against persuasive technologies such as targeted ads, influencer marketing, addictive design and dark patterns under the upcoming Digital Fairness Act.
  • Ban on engagement-based recommendation systems for minors.
  • Application of DSA protections to online video platforms, and prohibition of loot boxes and other manipulative in-game features, such as randomised rewards, in-app currencies, fortune wheels and pay-to-progress models.
  • Measures to prevent commercial exploitation of minors, including restrictions on platforms offering financial incentives for kidfluencing.
  • Urgent action addressing risks linked to generative AI, including deepfakes, AI companionship chatbots, AI agents and synthetic nudity apps.

Public support for stricter child-safety rules

Research referenced in the report shows:

  • 97% of young people go online every day.
  • 78% of 13–17-year-olds check their devices at least hourly.
  • One in four minors displays problematic smartphone use similar to addiction.

According to the 2025 Eurobarometer, more than 90% of Europeans say stronger protection for children online is urgent — especially in relation to social media’s mental-health impact (93%), cyberbullying (92%) and the need to restrict access to age-inappropriate content (92%).

EU member states are also introducing measures including age-verification systems, educational initiatives, smartphone restrictions in schools, cyberbullying rules, helplines and awareness campaigns as part of national responses to emerging risks.

What lies ahead

The EU’s next phase of online safety for minors will prioritise stronger enforcement of existing rules, as children increasingly enter digital spaces at younger ages. Policymakers are monitoring the rapid growth of generative AI, which intensifies risks such as deepfakes, grooming and misinformation. Sharenting — parents publicly sharing children’s personal information — is also emerging as a privacy concern.

Additional developments expected across Europe

  • Expansion of age-verification frameworks and smartphone restrictions in schools.
  • Wider deployment of cyberbullying reporting and prevention mechanisms.
  • Continued action against commercial exploitation of minors, including kidfluencing regulation.
  • Growth in public awareness and digital-wellbeing campaigns.
  • A planned EU-wide inquiry into social media’s impact on wellbeing and an EU action plan against cyberbullying.

Although the EU already has child-safety safeguards through the DSA, GDPR and audiovisual content rules, consistent enforcement remains difficult and requires more coordination, resources and reliable age assurance across platforms.
