UK Regulators Issue Final Deadline for Tech Giants to Shield Children from Harmful Content
LONDON — Britain’s media and privacy watchdogs have issued a stern ultimatum to the world’s largest social media platforms, demanding they implement “highly effective” measures to block children from accessing inappropriate content or face multi-million-pound fines.
In a coordinated move on Thursday, March 12, 2026, Ofcom and the Information Commissioner’s Office (ICO) sent an open letter to Meta (owner of Facebook and Instagram), TikTok, Snap, and YouTube, as well as the gaming platform Roblox. The regulators warned that these “household names” are currently failing to uphold their own minimum age requirements and are exposing minors to addictive, harmful algorithmic feeds.
A Hard Deadline for Compliance
Under the latest enforcement phase of the UK Online Safety Act, the tech giants have been given until April 30 to prove they have tightened their safety protocols. Key demands include:
- Stricter Age Assurance: Implementing modern verification tools such as facial age estimation, open banking ID checks, or mobile operator validation to stop children under 13 from accessing services not designed for them.
- Algorithmic Safety: Configuring recommender systems to automatically filter out content related to suicide, self-harm, eating disorders, and pornography from children’s feeds.
- Communication Restrictions: Preventing unknown adults from contacting children and banning the testing of new, unvetted products on minor users.
“These online platforms are widely recognized in households, yet they are failing to prioritize children’s safety in the design of their products,” said Melanie Dawes, Chief Executive of Ofcom. “This situation cannot continue; if immediate changes are not implemented, Ofcom is ready to take strong enforcement action.”
Escalating Penalties
The warning comes as the UK weighs even tougher curbs, including a potential total ban on social media for under-16s, mirroring recent legislation passed in Australia. While a parliamentary bid to immediately bar under-16s was voted down this week, the government has launched a landmark consultation to explore such a ban, as well as mandatory overnight curfews and strict curbs on “addictive” design elements, such as infinite scrolling.
Regulators have made it clear they will not wait for new legislation to penalise non-compliance. Ofcom can fine companies up to 10% of their global annual revenue, while the ICO can levy fines of up to 4% of global turnover. The seriousness of these threats was underscored last month when the ICO fined Reddit nearly £14.5 million for failing to implement meaningful age checks.
Tech Sector Response
In response, a spokesperson for Meta stated the company already utilises AI-based age detection and suggested that age verification should be handled centrally at the app store level. A YouTube representative expressed “surprise” at Ofcom’s shift away from a risk-based approach, urging a focus on “high-risk services” instead.
As the April 30 deadline approaches, the pressure on Silicon Valley to secure the digital lives of British children has never been higher.