Dad of girl, 14, who took her own life after seeing harmful Instagram content reacts to Meta changes
The dad of a 14-year-old girl who took her own life after seeing harmful material online has hit out at Meta’s decision to scrap its longstanding fact-checking programme.
Ian Russell, whose daughter Molly died in 2017, said the move by the tech firm, which owns Facebook and Instagram, is a “major concern for safety online”. Meta has announced it will use a community notes system similar to that on Elon Musk’s social media platform X, formerly Twitter.
Meta boss Mark Zuckerberg’s policy signals a move towards a more conservative-leaning focus on free speech over fact-checking. Mr Musk ripped up X’s moderating rules when he took over the tech firm. In recent days the Tesla boss has been in a public row with Keir Starmer over the grooming gangs scandal – which has seen the PM hit back at “those who are spreading lies and misinformation”.
Currently Meta uses news organisations or other third-party groups to check content, but its community notes system will instead see users add notes to posts that might be false or misleading. Mr Russell, chair of the Molly Rose Foundation, a suicide prevention charity set up in his daughter’s name, pointed to Meta’s own data showing Facebook and Instagram removed almost 12 million pieces of suicide and self-injury content in the last quarter. Over 99% of this was found by its moderators.
Meta boss Mark Zuckerberg admitted the announced changes mean the social media giant is “going to catch less bad stuff”, but said they would also stop innocent people’s posts or accounts being accidentally taken down. The tech mogul said fact-checkers have been “too politically biased” and that he wanted to restore “our roots and focus” on free expression on his platforms.
The changes will affect Facebook and Instagram, the company’s two largest social media platforms which have billions of users, as well as its newer platform Threads. Meta said it plans to bring in the community notes function in the US over the next few months and will “continue to improve it” over the year. It will also stop demoting fact-checked posts and make labels indicating when something is misleading less “obtrusive”.
Mr Russell said: “Meta’s decision to roll back on content moderation is a major concern for safety online. We are dismayed that the company intends to stop proactive moderation of many forms of harmful content and to only act if and when a user complaint is received. Meta currently claims the overwhelming majority of harmful material they remove is found by themselves rather than reported by users.
“We are urgently clarifying the scope of these measures, including whether this will apply to suicide, self-harm and depressive content. These moves could have dire consequences for many children and young adults.”
Chris Morris, chief executive of Full Fact, said: “Meta’s decision to end its partnership with fact checkers in the US is disappointing and a backwards step that risks a chilling effect around the world. From safeguarding elections to protecting public health to dissipating potential unrest on the streets, fact checkers are first responders in the information environment.
“Our specialists are trained to work in a way that promotes credible evidence and prioritises tackling harmful information – we believe the public has a right to access our expertise. We absolutely refute Meta’s charge of bias – we are strictly impartial, fact check claims from all political stripes with equal rigour, and hold those in power to account through our commitment to truth.
“Like Meta, fact checkers are committed to promoting free speech based on good information without resorting to censorship. But locking fact checkers out of the conversation won’t help society to turn the tide on rapidly rising misinformation.”
Mr Zuckerberg said: “We’re going to get back to our roots and focus on reducing mistakes, simplifying our policies and restoring free expression on our platforms. More specifically, here’s what we’re going to do. First, we’re going to get rid of fact checkers and replace them with community notes similar to X, starting in the US. It means that we’re going to catch less bad stuff, but we’ll also reduce the number of innocent people’s posts and accounts that we accidentally take down.”
Mr Zuckerberg’s shift towards free speech on his platforms comes after he met Donald Trump in November, following Mr Trump’s US election victory. A community notes system is likely to please the president-elect, who had criticised Meta’s fact-checking feature for penalising conservative voices.
Meta donated 1 million dollars to support Mr Trump’s inauguration in December, and has since appointed several Trump allies to high-ranking positions at the firm. Nick Clegg, the former UK deputy prime minister, left the social media giant last week, where he had been president of global affairs. Mr Clegg has been replaced by Joel Kaplan, a prominent Republican and former senior adviser to George W Bush. Dana White, the head of the Ultimate Fighting Championship and a close ally of Mr Trump, was also appointed to Meta’s board.
In a statement, Mr Kaplan added that Meta’s moderation policies had “gone too far”. Referring to its incoming system, he said: “We’ve seen this approach work on X – where they empower their community to decide when posts are potentially misleading and need more context, and people across a diverse range of perspectives decide what sort of context is helpful for other users to see.”
Since Mr Musk bought X in 2022 it has faced heavy criticism over its approach to posts containing misinformation or hateful content.