London24NEWS

Tech giants must stop dangerous algorithms bombarding youngsters or face being blocked

Tech firms must stop their algorithms recommending harmful content relating to suicide, eating disorders and porn to children, under new Ofcom plans.

The media regulator will order social media sites to minimise children’s exposure to other serious harms, including violent, hateful or abusive material, online bullying, and content promoting dangerous challenges. Online firms will be forced to put in place robust age-checks and introduce simple ways for children to stop harmful content being recommended to them.

If tech companies fail to comply with the guidance, they could be fined up to 10% of their global turnover or have their services blocked in the UK. Ofcom on Wednesday published its draft Children’s Safety Codes of Practice, which sets out more than 40 practical measures tech giants must implement to meet new legal duties under the Online Safety Act.






Molly Russell's dad Ian said Ofcom's draft guidance needed to be 'more ambitious' (Daily Mirror)

But the dad of Molly Russell, who took her own life at age 14 in 2017 after being exposed to harmful content online, said the regulator’s draft guidance needed to be “more ambitious”. Ian Russell said: “Ofcom’s task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm. The regulator has proposed some important and welcome measures, but its overall set of proposals needs to be more ambitious to prevent children encountering harmful content like that which cost Molly her life.”

It comes after he told the Mirror on Tuesday “too little has changed” since Molly’s death and said urgent action is needed to stop the spread of online harms. “The cost of being slow in terms of online regulation is paid in young human lives,” he added.

Ofcom admitted its final code may not be published until summer 2025, after which tech firms will have three months to assess the guidance. Parliament will need to approve the code which could also take time.

The Online Safety Act was originally proposed by Theresa May in 2019 but it took years to become law due to political chaos and divisions over the scope of the bill. While it finally became law in October, Ofcom can’t use new powers to hold social media firms to account until the end of a lengthy consultation on updating its guidance.


James Bowen, assistant general secretary at school leaders’ union NAHT, said: “School leaders see first-hand the harm that can be caused to children by exposure to inappropriate material online. There are some sensible proposals in this draft code of practice, which could be a positive first step in the right direction. The key question will be whether or not they make a real difference in practice.”

Children’s Commissioner Dame Rachel de Souza said: “Protections in the Online Safety Act must be implemented swiftly, with effective age assurances, default safety settings and content moderation to prevent children from accessing platforms underage and keep them safe online as they explore, learn and play. I will continue to work with Ofcom, policy makers, government, schools and parents to ensure that children are kept safe online.”

Dame Melanie Dawes, Ofcom’s chief executive, said: “We want children to enjoy life online. But for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”

Technology Secretary Michelle Donelan admitted making the online world safe is a “complex journey”. “I want to assure parents that protecting children is our number one priority and these laws will help keep their families safe,” she said. “To platforms, my message is engage with us and prepare. Do not wait for enforcement and hefty fines – step up to meet your responsibilities and act now.”

‘Makes my blood freeze’

Dr Rebecca Whittington, online safety editor at Reach PLC, gives her verdict: As a parent, hearing the tragic story of what happened to Molly Russell makes my blood freeze. When Molly took her own life she was only a couple of years older than my children are now. The knowledge that thousands of teens die by suicide every year, many of whom will have likely been made more vulnerable by content they have viewed online, is terrifying and horrific.

As parents, we can monitor and negotiate, we can ban and track, we can confiscate devices or try to trust our children and the people they connect with online, but there are absolutely no right answers about how to stop our children being harmed by online content.

We and our children need help: that of course includes frameworks, guidelines and regulations.

But our government and Ofcom need to up the pace of change. If it’s going to be at least another 18 months before rules are ratified, then what might happen to all those young people and their guardians who need help now?

Ian Russell estimates, based on national figures, that four young people a week die by suicide. That’s more than 300 in the time between now and when Ofcom suggests its code could be made official. Not all of those young people will have been harmed by toxic online content. But, given how widespread access and exposure to harmful media online is, I imagine the majority will have experienced some online harm during the course of their short lives.

It is impossible to keep up with the frantic pace of big tech development as it is. If red tape continues to build barriers to legislative change and regulation how can we ever expect things to change and for online spaces to be safe spaces for young and vulnerable users?

What we actually need is for the platforms to take responsibility for the toxic content they allow on their services: content that is pushed relentlessly to users who, as teenagers and young people, may already face significant challenges around self-identity and self-esteem; thoughts and doubts that are ripe for toxic and harmful online content to latch onto and build into full belief; content that preys on fear and isolation and thrives in the echo chamber of platforms and the addictive nature of social media.

It is terrible that we are in a situation as a society where we have to regulate platforms as they alone seem completely unwilling to gatekeep what content they host. How has it come to this when, as so eloquently put by Ian Russell, social media giants are monetising misery and being “paid in human lives” and all we can do is apply a sluggish system to make rules that we can only hope will make a difference?