UK watchdogs press Meta, TikTok, Snap and YouTube to block children

Published by Global Banking & Finance Review

Posted on March 12, 2026 · Last updated: April 1, 2026

Regulators Demand Stronger Child Protection Measures from Social Media Platforms

By Sam Tabahriti

LONDON, March 12 (Reuters) - Britain's media and privacy regulators on Thursday demanded that major social media platforms do more to keep children off their services, warning that companies were failing to enforce their own minimum age rules.

Britain has been weighing tougher curbs on children's access to social media, with the government considering barring under-16s from such platforms - mirroring a move by Australia.

Concerns Over Algorithmic Feeds and Harmful Content

Ofcom and the Information Commissioner's Office said they had grown increasingly concerned about algorithmic feeds that expose children to harmful or addictive content.

"These online services are household names, but they're failing to put children's safety at the heart of their products," Melanie Dawes, Ofcom's chief executive, said.

"That must now change quickly, or Ofcom will act."

Regulatory Actions and Demands

Use of Modern Technology for Age Assurance

In the latest implementation phase of Britain's Online Safety Act, Ofcom told Facebook and Instagram - both owned by Meta - as well as Roblox, Snapchat, ByteDance's TikTok and Alphabet's YouTube to show by April 30 how they would tighten age checks, restrict strangers from contacting children, make feeds safer and stop testing new products on minors.

The ICO separately issued an open letter to the same platforms, calling on them to adopt "modern, viable" age-assurance tools to stop those under 13 accessing services not designed for them.

"There's now modern technology at your fingertips, so there is no excuse," Paul Arnold, ICO's chief executive, said.

Responses from Major Platforms

Meta

A Meta spokesperson said the company already uses AI-based age detection and age-estimation tools and places teens in accounts with built-in protections. The spokesperson added that age should be verified "centrally at the app store level" so families do not have to provide personal information multiple times.

YouTube

A YouTube spokesperson said the platform also offered age-appropriate experiences and was "surprised to see Ofcom move away from a risk-based approach", urging the regulator to focus on "high-risk services" that were failing to comply with the law.

Roblox

A Roblox spokesperson said the company had launched more than 140 new safety features in the past year, including mandatory age checks for chat, designed to prevent adults from communicating with children.

"While no system is ever perfect, we continue to strengthen protections designed to keep players safe," the spokesperson said.

Snapchat and TikTok

Snapchat did not respond to a request for comment. TikTok declined to comment.

Potential Penalties and Recent Enforcement

Ofcom can fine companies up to 10% of their qualifying global revenue, while the ICO can issue fines of up to 4% of a company's global annual turnover.

The privacy watchdog last month fined Reddit nearly 14.5 million pounds for failing to introduce meaningful age checks and for processing children's data unlawfully.

($1 = 0.7439 pounds)

(Reporting by Sam Tabahriti; Editing by Paul Sandle, Tomasz Janowski and Louise Heavens)

Key Takeaways

  • Ofcom and the ICO are pressing platforms including Meta, TikTok, Snap and YouTube to tighten age assurance, restrict contact from strangers, and make algorithmic feeds safer by April 30, with hefty fines for non-compliance (Ofcom: up to 10% of qualifying global revenue; ICO: up to 4% of global annual turnover).
  • The Online Safety Act requires “highly effective” age assurance technologies—such as open banking ID checks, facial estimation or mobile operator validation—and Ofcom guidance from January 2025 defines which methods qualify as robust enough.
  • The UK government is consulting (March 2026) on more aggressive rules—including potentially barring under‑16s from social platforms, banning addictive features like infinite scroll, and extending OSA duties to AI chatbots.

Frequently Asked Questions

Which UK regulators are demanding stricter age checks on social media?
Ofcom and the Information Commissioner's Office are urging major platforms to tighten age verification.
What could happen if social media companies fail to comply with UK regulations?
Ofcom can fine companies up to 10% of their qualifying global revenue, and the ICO can issue fines of up to 4% of global annual turnover.
Which social media platforms are targeted by the UK's new demands?
Meta (Facebook, Instagram), TikTok, Snap, YouTube, and Roblox have been given until April 30 to outline compliance.
What new requirements must social media firms meet under the Online Safety Act?
They must implement tighter age checks, restrict stranger contact with minors, make feeds safer, and stop testing new products on children.
Has any company already been fined under these privacy concerns?
Yes, Reddit was fined nearly 14.5 million pounds for failing to implement effective age checks and protect children's data.
