London24NEWS

Social media giants given ‘final deadline’ to stop children accessing harmful content

Tech firms have been given a final deadline to introduce “robust” age checks to stop children accessing harmful content on their platforms.

Media regulator Ofcom has ordered online services to take urgent action to stop kids seeing content relating to pornography, suicide, self-harm and eating disorders. All social media platforms must introduce age assurances by July 2025 or risk punishment under the Online Safety Act. This applies to sites such as YouTube, Facebook, Instagram, TikTok and Twitter/X.

Under the measure, online sites will not have to introduce age checks for the whole of their platform but will have to ensure they are in place for harmful content. If they fail to do so, Ofcom has the power to fine them up to £18million or up to 10% of their global revenue, or to impose other business disruption measures, such as requiring payment providers or advertising services to withdraw from an online site.


Further measures will be published in April, including the Protection of Children Codes. These are expected to include guidance on age checks to ensure under-age young people aren’t accessing social media sites in the first place. Most sites do not allow under-13s on their platforms; however, evidence shows many kids aged 12 and younger lie about their age to create an account.

While the Online Safety Act became law in October 2023, Ofcom has not yet started using its new powers as it has been consulting on the new guidance. Technology Secretary Peter Kyle admitted at the weekend that the UK’s online safety laws are “very uneven [and] unsatisfactory”. He said MPs need to get into a better cycle of “updating” current laws given the extremely fast pace at which technology develops.

Dame Melanie Dawes, Ofcom’s chief executive, said: “For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid.

“That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”