Social media firms should be forced almost immediately to stop children seeing harmful content, voters have demanded.
A Mirror poll has revealed the public want rapid action to prevent algorithms bombarding kids with material relating to suicide and eating disorders.
Six in 10 (60%) want this to happen by the end of July, while a further 22% said it should be before the end of this year. Just 2% were against it, according to the survey conducted by Whitestone Insight.
Most people are also in favour of forcing online giants to introduce robust age checks on social media, such as requiring photo ID. Some 43% want this by the summer, while 26% believe it should be in place by the end of this year. Only 8% say they’re happy to wait until some point next year. Just 8% were against the proposal.
The idea of banning under-16s from having smartphones is not widely supported, however. Some 43% are against, while 36% are in favour.
Ofcom has put forward proposals that would require tech firms to stop their algorithms recommending harmful content to children and introduce robust age checks. But the regulator has said its final code may not be published until summer 2025, after which companies will have three months to assess the guidance. Parliament will need to approve the code, which could also take time.
The dad of Molly Russell, who took her own life at age 14 in 2017 after being exposed to harmful content online, last night said the issue was a “matter of life or death” as he responded to the poll findings.
The Online Safety Act was originally proposed by Theresa May in 2019 but it took years to become law due to political chaos and divisions over the legislation. While it finally became law in October 2023, Ofcom can’t use its new powers to hold social media firms to account until the end of a lengthy consultation on updating its guidance.
Ofcom earlier this month published its draft Children’s Safety Codes of Practice, which set out more than 40 practical measures tech giants must implement to meet new legal duties. If tech companies fail to comply with the guidance when it is introduced, they could be fined up to 10% of their global turnover or have their services blocked in the UK.
The mother of murdered teen Brianna Ghey has warned that more young lives could be lost due to delays to the social media crackdown.
Earlier this month, Esther Ghey told the Mirror: “I totally understand Ofcom’s view that we need to cross the Ts and dot the Is, and everything needs to be perfect so social media companies can’t come back and take legal action. But during this time we’re potentially losing more children’s lives. Also, there are children who are struggling with their mental health, there are children who are self-harming and we do need to try to get this done as soon as possible.”
Whitestone Insight interviewed 2,024 adults in Britain online on May 15 and 16.
‘There is no time to waste,’ says dad who lost his daughter
The dad of a 14-year-old girl who took her own life after being exposed to harmful content online says it is a “matter of life or death” whether a clampdown on social media takes place urgently.
Molly Russell’s father Ian said there is “no time to waste” as he responded to our exclusive poll findings.
Molly, from Harrow, in north west London, saw more than 2,000 harmful posts about suicide, self-harm and depression in the last six months of her life.
Mr Russell said: “It’s only effective regulation that will finally make big tech address the preventable harm they cause to children and families. The cost of years of industry inaction should be measured in lost young lives, and that’s why getting world-leading regulation in force is nothing less than a matter of life or death.”
The campaigner added: “There is no time to waste to protect young people from harm. Social media companies have consistently put their bottom line before taming toxic algorithms and the results have been devastating. The next Government should commit to new legislation to strengthen the Online Safety Act regime as this will ensure we see the swiftest and most decisive change. Effort wasted on other, hastily conceived policies is unlikely to be as effective and may end up causing more harm than good.”
An inquest in 2022 concluded that Molly died from an act of self-harm while suffering depression and the negative effects of online content. Coroner Andrew Walker said the images of self-harm and suicide she viewed “shouldn’t have been available for a child to see”.
*If you’re struggling and need to talk, the Samaritans operate a free helpline open 24/7 on 116 123. Alternatively, you can email jo@samaritans.org or visit their site to find your local branch.