European Union questions big tech on child safety and gambling apps

The European Union is ramping up pressure on big tech companies such as Apple, Snapchat, and YouTube, demanding answers about how they protect children from harmful online content.
The move comes after 25 of the 27 EU countries agreed to explore new rules that could limit minors' access to social media, inspired by Australia's ban on under-16s using such platforms.
Under the EU's Digital Services Act (DSA), companies are already required to keep users, especially children, safe from illegal or harmful content, including gambling material. Brussels has now sent formal requests for information to Snapchat, Apple, and Google.
Snapchat must explain how it prevents children under 13 from joining and how it tackles drug and vape sales on its platform. Apple's App Store and Google Play were asked how they stop children from downloading dangerous apps, such as those containing sexual or gambling content, or "nudify" tools that generate fake sexual images.
YouTube is also being questioned over how its recommendation system avoids pushing harmful content to children. Meanwhile, the EU is separately investigating Meta’s Facebook, Instagram, and TikTok for potentially addictive design features targeting young users.
Most EU nations support setting a digital age limit, but Belgium and Estonia prefer focusing on digital education instead of bans.