Media regulators ask technology companies to introduce stricter age verification for children
Written by Madeleine Clarke - 13 April 2026
Media regulator Ofcom and data watchdog the Information Commissioner's Office (ICO) have asked technology companies to bring in stricter age verification checks for under-13s in the UK.
They contacted major platforms including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox and X, asking them to show how their current age checks meet new expectations. This is part of the ICO's Children's Code Strategy, which aims to improve online safety and privacy for young people.
As it stands, many social media platforms and websites rely on users self-reporting their age. Ofcom's research indicates that 86% of children aged 10 to 12 have their own social media profiles, even though most platforms require users to be at least 13.
In February the ICO fined Reddit £14.47 million, and MediaLab (which owns Imgur) £247,590, for failing to implement proper age checks and for processing children's personal data unlawfully. The ICO has since opened an investigation into TikTok's handling of children's data, and in December 2025 it requested information from Meta about Instagram's recommender systems.
Ofcom would like media companies to use "highly effective age checks" to ensure under-13s cannot bypass minimum age rules. Currently, such checks are only legally required for providers of over-18s content, such as adult entertainment.
The ICO is focusing on the processing of under-13s' data. If a platform sets a minimum age requirement of 13, it generally has no legal basis to process the data of under-13s.
In its open letter to social media platforms, the ICO said companies must adopt up-to-date age verification, such as facial age estimation or digital ID checks, and should consider raising the digital age of consent and restricting addictive app features like "streaks" and "infinite scrolling".
YouTube responded to Ofcom's appeal by urging the regulator to focus instead on high-risk services that are failing to comply with the Online Safety Act, saying that it routinely updates Ofcom on its work to protect young people. Meta said it has already put many of Ofcom's suggestions in place, including using AI to infer a user's age from their activity and using facial age estimation technology. TikTok said it uses technology to detect and remove suspected underage accounts; between October 2024 and September 2025 it removed more than 90 million accounts suspected of being run by under-13s.
Snapchat is currently testing age verification tools, whilst Roblox said it has additional protections for under-13s and has introduced new mandatory age checks that must be completed before a user can access chat features.
Last month, MPs voted against a total social media ban for under-16s (see The Memo: Social media ban for under-16s - will it happen in the UK?, Chambers Student Guide), instead backing more flexible ministerial powers. The British government is currently consulting children and parents on social media use among young people.
Ofcom enforces the Online Safety Act, whose child safety duties came into force in July 2025 and mandate age verification for pornography and other harmful content, such as violent images and videos.
When the Online Safety Act's age checks were introduced, gamers quickly discovered that the verification systems used by Reddit and Discord could be bypassed with photo-realistic images of video game characters. With this in mind, it is not certain that stricter verification checks for under-13s would be effective.