European Commission opens investigation into Snapchat’s child safety protections

EURONEWS.COM

The European Commission has launched an investigation to determine whether Snapchat exposed minors to grooming and criminal recruitment, potentially breaching EU digital safety laws.

Snapchat is a social media platform where users share photos and videos that typically disappear after being viewed. Approximately 94.5 million Europeans had a Snapchat account in 2025, according to the company’s latest transparency report.

The Commission said on Thursday that it suspects the platform may allow adults to pose as young users and contact children in order to recruit them for illegal activities or to sexually exploit them.

“From grooming and exposure to illegal products to account settings that undermine minors’ safety, Snapchat appears to have overlooked that the Digital Services Act demands high safety standards for all users,” said Henna Virkkunen, executive vice-president for tech sovereignty, security and democracy.

“With this investigation, we will look closely into their compliance with our legislation,” she added.

The investigation will fall under the Digital Services Act (DSA), following a review of the platform’s risk assessments from 2023 to 2025 and additional information it received last October about age verification and illegal activity.

The Commission’s announcement marks the start of formal proceedings, which could lead to “further enforcement” at a later stage. Snapchat may also propose changes to its policies and practices in response to the investigation.

Euronews Next reached out to Snap Inc., the parent company, for comment, but did not receive an immediate reply.

What else will the investigation examine?

The Commission will focus on five areas: age assurances, grooming and recruitment of minors for criminal activities, inadequate default account settings, dissemination of information on the sale of banned products, and reporting of illegal content.

This includes whether Snapchat users can buy illegal products, such as drugs, vapes and alcohol, through the platform due to insufficient content moderation that fails to limit videos with information on how and where to obtain them.

The Netherlands Authority for Consumers and Markets (ACM) mounted a similar investigation into the sale of vape products on Snapchat last September. The Commission said this probe will be incorporated into its broader investigation.

The Commission also said it suspects that the mechanisms for reporting illegal content on the platform “are neither easy to access nor user-friendly,” and that the company does not tell users how to file internal complaints.

It also suggests that Snapchat uses “dark patterns” in its design: deceptive techniques to trick users into making choices they might not otherwise make.

To create a Snapchat account, the company relies on users self-declaring that they are over 13, which the Commission said is “insufficient” to keep children off the platform.

Snapchat has “teen” accounts for children between the ages of 13 and 17 that have “additional layers of protection,” such as setting their accounts to “private” by default, which means young people can only be contacted by people on their friends list.

Teenagers have to opt in to sharing their location with their friends via “Snap Map,” according to Snapchat.

However, the Commission argues that the app’s reliance on self-disclosure means that age-appropriate experiences are not being triggered when they need to be.

The app’s default account settings do not provide “sufficient privacy, safety and security protections for minors,” the Commission said.

It also noted that users may not be given guidance on privacy and security features when creating an account, nor are they informed how to adjust these settings.