The European Commission has launched a formal investigation into Snapchat, suspecting the platform fails to adequately protect minors from online exploitation, grooming, and criminal recruitment. This probe, initiated under the Digital Services Act (DSA), could result in substantial penalties or mandatory changes to Snapchat’s policies.
The Core of the Investigation
The Commission’s concerns center on five key areas:
- Age Verification: Snapchat relies heavily on self-reporting for age, a system the EU deems “insufficient” to keep children off the platform.
- Grooming & Recruitment: The investigation will examine whether Snapchat allows predators to pose as young users in order to target children. Predators actively seek out platforms where minors are present, and lax age verification makes such impersonation easy.
- Default Settings: Snapchat’s default account settings allegedly fail to provide adequate privacy and safety for young users. The app’s design may not prioritize the protection of minors, leaving them vulnerable to exploitation.
- Illegal Products: The Commission will scrutinize whether inadequate content moderation allows Snapchat to facilitate the sale of illegal or age-restricted goods such as drugs, vapes, and alcohol.
- Reporting Mechanisms: The EU suspects Snapchat’s reporting system for illegal content is “neither easy to access nor user-friendly,” making it difficult for users to flag harmful activity.
Why This Matters: The DSA and Digital Safety
This investigation is significant because it highlights the growing enforcement of the Digital Services Act (DSA). The DSA aims to hold large online platforms accountable for illegal and harmful content, especially regarding vulnerable populations like children.
The DSA is critical because it shifts the burden of responsibility onto platforms. Previously, enforcement was fragmented and slow. Now, the EU has the power to investigate and impose significant fines if companies fail to comply. Snapchat has roughly 94.5 million European users, making it a high-priority target for regulators.
Snapchat’s Response
Snapchat claims it prioritizes safety and has cooperated with the Commission. The company argues its design focuses on close connections and built-in privacy. However, the EU is skeptical, citing concerns about deceptive design practices (“dark patterns”) and inadequate age verification.
Snapchat has “teen” accounts with extra protections, but the Commission argues these are undermined by the reliance on self-disclosure. The company insists it will continue to cooperate, but the investigation suggests a fundamental disconnect between Snapchat’s claims and the EU’s assessment of its safety measures.
What Comes Next?
The Commission’s investigation will review Snapchat’s risk assessments covering 2023–2025, along with additional data dating back to October 2023. The company may be forced to change its policies and practices or face further enforcement action. The Dutch Authority for Consumers and Markets (ACM) has already investigated similar issues on Snapchat, and its findings will be integrated into the EU probe.
This investigation signals a broader trend: regulators are no longer willing to accept platforms’ self-regulation. The DSA gives them the tools to enforce real change, and Snapchat is the latest example of a company facing increased scrutiny.
The outcome will set a precedent for other social media platforms and demonstrate whether the EU can effectively protect children online.