EU Investigates Snapchat, YouTube, Apple, and Google Over Safeguards for Minors Under Digital Services Act

The European Commission has launched an investigation into how major online platforms — Snapchat, YouTube, the Apple App Store, and Google Play — are protecting minors online, as part of the enforcement of the Digital Services Act (DSA).

The Commission is demanding detailed information on age verification systems and measures to prevent minors from accessing illegal or harmful content, including drugs, vapes, and material promoting eating disorders.

Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, said: “We will do what it takes to ensure the physical and mental well-being of children and teens online. Platforms have the obligation to ensure minors are safe on their services — be it through measures included in the guidelines on protection of minors, or equally efficient measures of their own choosing.”

The inquiry follows the publication of the Guidelines on the Protection of Minors under the DSA and marks the Commission's first set of investigative actions to assess how major digital platforms comply with them.

Focus Areas of the EU Investigation

Snapchat has been asked to explain how it prevents children under 13 from accessing its services — as prohibited by its own terms — and how it blocks the sale of illegal goods like vapes and drugs.

YouTube is required to disclose details on its age assurance and recommender systems, following concerns about minors being exposed to harmful video content.

Apple App Store and Google Play must provide information on how they screen apps for illegal or harmful content, including gambling or “nudify” apps, and how they enforce age ratings to protect young users.

What are the current measures?

Snapchat

Minimum age requirement: Users must be at least 13 years old to create an account. Accounts found to belong to children under 13, in violation of Snapchat's own terms, are locked and eventually deleted.

Age verification: If Snapchat suspects a user is underage (or a user claims to be older than their recorded age), the user can verify their age with a government-issued photo ID or have a parent confirm it.

Birthdate lock for teens: Teens aged 13-17 are prevented from changing their birth year to one that would make them 18 or older, which stops them from bypassing age-based protections.
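Purely as an illustration of how such a birth-year lock might work, here is a minimal sketch; the function names, thresholds, and logic are assumptions made for clarity, not Snapchat's actual implementation.

```python
from datetime import date

MIN_AGE = 13
ADULT_AGE = 18  # age that unlocks adult features (assumed threshold)

def age_on(birth_date: date, today: date) -> int:
    """Whole years of age on the given day."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def can_update_birthdate(current: date, requested: date, today: date) -> bool:
    """Hypothetical guard: a 13-17-year-old may not change their birthdate
    to one that would make them 18 or older."""
    current_age = age_on(current, today)
    requested_age = age_on(requested, today)
    if MIN_AGE <= current_age < ADULT_AGE and requested_age >= ADULT_AGE:
        return False  # block the change: it would bypass teen protections
    return True

# A 15-year-old attempting to set a birth year that makes them 19 is rejected.
print(can_update_birthdate(date(2010, 6, 1), date(2006, 6, 1), date(2025, 10, 1)))  # False
```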

Teen safety settings for contact and connectivity: Teens have additional controls over who can contact them (e.g. restricting connections to friends or real-life contacts), reducing exposure to strangers, and they see warnings or pop-ups when a suspicious account tries to reach out.

Safety reporting: There are options for teens or parents to report safety concerns, both inside the app and via support pages.

YouTube

Supervised experience: YouTube offers a “supervised experience” for pre-teens and teens that lets parents set content filters and limit what a young user can do (e.g. commenting or uploading).

Account linking / parental insights: Parents can link their account with a teen’s account through YouTube’s Family Center, giving them visibility into certain activities such as channel subscriptions, uploads, and comments.

Age estimation / AI tools: YouTube is rolling out machine-learning tools to better identify when users are minors (under 18), even if they declared a different age. Once a user is identified as a minor, extra protections apply, such as reduced recommendations of sensitive content, disabled personalized ads, and digital wellbeing features enabled by default. Users who believe the estimate is wrong can verify their age with an ID, credit card, or selfie.
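As a rough sketch of how an age-estimation signal could gate such protections, the following assumes a hypothetical confidence score and protection flags; it is not YouTube’s actual system.

```python
from dataclasses import dataclass, field

@dataclass
class AccountProtections:
    personalized_ads: bool = True
    sensitive_recommendations: bool = True
    wellbeing_reminders: bool = False

@dataclass
class Account:
    declared_age: int
    estimated_minor_probability: float  # hypothetical ML output in [0, 1]
    id_verified_adult: bool = False     # set after an ID / credit card / selfie check
    protections: AccountProtections = field(default_factory=AccountProtections)

MINOR_THRESHOLD = 0.8  # assumed confidence cutoff

def apply_minor_protections(acct: Account) -> Account:
    """Hypothetical flow: if the model flags a likely minor and the user has not
    verified adulthood, apply the stricter defaults described above."""
    likely_minor = (
        acct.declared_age < 18
        or acct.estimated_minor_probability >= MINOR_THRESHOLD
    )
    if likely_minor and not acct.id_verified_adult:
        acct.protections.personalized_ads = False           # disable personalized ads
        acct.protections.sensitive_recommendations = False  # reduce sensitive recommendations
        acct.protections.wellbeing_reminders = True         # enable digital wellbeing features
    return acct
```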

Content settings and limits: Supervised and youth accounts have graduated content settings (“Explore”, “Explore More”, “Most of YouTube”) that limit exposure to content by age and restrict the ability to comment or livestream.

Apple App Store

Age ratings and content restrictions: Apple assigns age ratings to apps (e.g. 4+, 9+) and is updating its system with more granular tiers for teens (13+, 16+, 18+). These ratings determine which apps are visible or downloadable for minors, according to parental and device settings.
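A minimal sketch of how tiered age ratings might gate app visibility against a device-level age limit follows; the tier values are those named above, while the data structures and function are hypothetical.

```python
# Age-rating tiers mentioned above, mapped to the minimum age they imply.
RATING_TIERS = {"4+": 4, "9+": 9, "13+": 13, "16+": 16, "18+": 18}

def visible_apps(apps: dict[str, str], device_age_limit: int) -> list[str]:
    """Hypothetical filter: return apps whose age rating does not exceed
    the age limit configured in parental or device settings."""
    return [
        name
        for name, rating in apps.items()
        if RATING_TIERS.get(rating, 18) <= device_age_limit  # unknown ratings treated as 18+
    ]

catalog = {"Puzzle Pals": "4+", "Chat Now": "13+", "Casino Slots": "18+"}
print(visible_apps(catalog, device_age_limit=13))  # ['Puzzle Pals', 'Chat Now']
```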

Child / teen account setup and parental consent: For users under a certain age, Apple requires parental consent (via Family Sharing), limits what child accounts can do, and automatically enables restrictions such as web content filters and app restrictions.

App Store UI / product page transparency: On app product pages, Apple flags if an app has user-generated content, messaging, or advertising, and whether in-app content controls (parental controls, age assurance) are included.

Parental controls and family tools: These include Ask to Buy (parents must approve purchases or app downloads beyond certain age ratings), Screen Time restrictions, app content restrictions, and communication safety tools (e.g. blurring nudity in Shared Albums or alerting when nudity is detected in FaceTime).

Kids category: Apps listed in the “Kids” category must follow stricter privacy and content rules: no inappropriate content, parental gates for certain actions, limited data collection, and clear advertising rules.

Google Play

Target audience declaration: Apps must declare their target audience (age groups) when submitted to Google Play. If an app is aimed at children or includes content targeting children, it must comply with the “Designed for Families” (Families) policy, which imposes stricter rules on content, data practices, and ads.

Data and privacy restrictions for child-directed apps: Apps targeting children cannot collect certain data (e.g. precise location, advertising identifiers) or must strictly limit what is collected; certain tracking practices and uses of children’s sensitive personal data are also disallowed.

Restrict Declared Minors (RDM) setting: Developers can mark an app as inappropriate for minors. When the setting is enabled, users declared as minors cannot download, purchase, or install the app, and related transactions such as subscriptions may be restricted, according to the Android Developers Blog.
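The gate that such a setting implies can be illustrated with a short sketch; the field names and logic here are assumptions, not the actual Play Store implementation.

```python
from dataclasses import dataclass

@dataclass
class AppListing:
    name: str
    restrict_declared_minors: bool  # developer-enabled RDM flag

@dataclass
class User:
    declared_minor: bool  # the account is declared as belonging to a minor

def can_install(user: User, app: AppListing) -> bool:
    """Hypothetical check: a declared minor cannot download, purchase,
    or install an app whose developer enabled the RDM setting."""
    return not (user.declared_minor and app.restrict_declared_minors)

print(can_install(User(declared_minor=True), AppListing("VapeShop", restrict_declared_minors=True)))   # False
print(can_install(User(declared_minor=False), AppListing("VapeShop", restrict_declared_minors=True)))  # True
```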

Ad and content policy: Apps targeting families or children must ensure ads are appropriate, avoid deceptive ads, and exclude adult or regulated content such as gambling. Ads must also be displayed in ways that are safe and not disruptive.

SafeSearch & default filters: For younger users (via Family Link etc.), Google defaults to safer settings (SafeSearch filtering for search, restricted content settings).

The European Commission will work with national authorities across EU member states to identify platforms that pose the highest risks to children and ensure consistent enforcement of child protection standards.

Under the Digital Services Act, online companies face strict obligations to tackle illegal and harmful content and ensure safety and transparency in digital environments. Failure to comply could result in hefty penalties and tighter oversight.

Baburajan Kizhakedath
