Snapchat Faces EU Child Safety Investigation, Exposing Gaps in Age Assurance and Privacy Protections
The European Commission has initiated a formal investigation into Snapchat under the Digital Services Act (DSA), focusing on the platform’s compliance with child protection obligations. This scrutiny aims to determine whether Snapchat is adequately ensuring the safety, privacy, and security of minors using its services. The DSA establishes stringent standards for online platforms operating within the European Union, with potential penalties reaching up to 6% of a company’s global annual revenue for non-compliance.
Age Assurance Under Digital Services Act Scrutiny
Central to the investigation is Snapchat’s method of age assurance. The platform stipulates that users must be at least 13 years old to create an account. However, the European Commission has expressed concerns regarding Snapchat’s reliance on self-declaration, suggesting it may not effectively prevent children under 13 from accessing the platform. Additionally, the Commission questions whether this approach sufficiently verifies the ages of users under 17, which is crucial for providing age-appropriate experiences.
The investigation also highlights potential inadequacies in the app’s reporting mechanisms for underage users, which may not be easily accessible. There are further concerns about the risks minors face, including exposure to grooming attempts and recruitment for criminal activities. The Commission suspects that Snapchat may not be taking adequate measures to prevent users with malicious intentions from contacting children, particularly where individuals misrepresent their age or manipulate their profiles.
Default Settings and Privacy Concerns
Another significant aspect of the DSA investigation concerns Snapchat’s default account settings. The Commission believes the platform may not offer sufficient privacy, safety, and security protections for minors by default. Features under scrutiny include the “Find Friends” system, which recommends other users, and push notifications that remain enabled without user consent.
Moreover, the Commission has noted that users may not receive adequate guidance during account creation on managing their privacy and safety settings. This lack of clear instructions could hinder users, particularly minors, from effectively adjusting their settings to enhance their online safety.
Illegal Content and Reporting Mechanisms Under Review
The investigation further examines Snapchat’s effectiveness in preventing the dissemination of illegal content, including information related to the sale of drugs and age-restricted products like alcohol and vapes. Under the DSA, platforms are mandated to mitigate systemic risks arising from their services. The Commission suspects that Snapchat’s current content moderation measures may be insufficient to block or limit access to such content, especially for younger users.
Additionally, the reporting mechanisms for illegal content are being evaluated. Concerns have been raised that these systems may not be user-friendly or easily accessible, potentially discouraging users from reporting violations. There are also worries that users may not be adequately informed about complaint procedures or available redress options within the platform.
Next Steps in DSA Child Protection Investigation
The European Commission will conduct a comprehensive investigation, gathering further evidence by requesting information from Snapchat and conducting interviews or inspections. The initiation of formal proceedings allows the Commission to take additional enforcement actions, including adopting interim measures or issuing non-compliance decisions. It can also accept commitments from Snapchat to address the identified issues.
This action against Snapchat is part of broader regulatory efforts under the DSA to enhance online child protection across platforms. The Commission has used its 2025 DSA Guidelines on the protection of minors as a benchmark for evaluating compliance, emphasizing that self-declaration should not be regarded as a reliable method of age assurance and that default settings must provide the highest level of protection for minors.
Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, stated, “From grooming and exposure to illegal products to account settings that undermine minors’ safety, Snapchat appears to have overlooked that the Digital Services Act demands high safety standards for all users. With this investigation, we will closely look into their compliance with our legislation.”
Adult Platforms’ Age Verification Also in Question
In a related development, the European Commission has also taken preliminary action against adult content platforms, including Pornhub, Stripchat, XNXX, and XVideos, under the DSA. The Commission found that these platforms may have inadequately protected minors from accessing pornographic content. Their risk assessments reportedly failed to sufficiently identify or evaluate risks to children, often prioritizing business considerations over child safety.
Virkkunen remarked, “In the EU, online platforms have a responsibility. Children are accessing adult content at increasingly younger ages, and these platforms must put in place robust, privacy-preserving, and effective measures to keep minors off their services. Today, we are taking another action to enforce the DSA – ensuring that children are properly protected online, as they have the right to be.”
The findings indicate that these platforms heavily rely on self-declaration for age verification, which the Commission deems ineffective. Measures such as content warnings, page blurring, or “restricted to adults” labels have been criticized for being insufficient to prevent minors from accessing harmful material. The Commission has suggested that more robust, privacy-preserving age verification methods are essential to address these risks.
As part of ongoing proceedings, these platforms will have the opportunity to respond to the Commission’s findings and implement corrective measures. If breaches are confirmed, the Commission may issue a non-compliance decision, potentially resulting in substantial financial penalties or enforcement actions to ensure compliance.
The broader enforcement initiative reflects a clear regulatory direction under the DSA, with authorities focusing on ensuring that platforms, regardless of size, assume greater responsibility for protecting minors online.
According to publicly available reporting, the European Commission’s actions signal a significant shift in how online platforms are held accountable for child safety and privacy protections. The implications of this investigation could resonate throughout the tech industry, prompting other platforms to reassess their age verification and privacy measures.