Wildest things you can now say on Facebook and Instagram as specialists sound alarm
EXCLUSIVE: The move will lead to some tangible changes on the platform, which has around 3 billion users worldwide, not least giving people the freedom to use previously-banned words
Meta boss Mark Zuckerberg has peeled back restrictions on what can go on Facebook, Instagram and Threads. The move has been met with a spectrum of praise and fear, as the company removes fact-checking and targets a return to prioritising “free expression.”
Meta claims that its elaborate network of fact checkers had “gone too far”, saying: “As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes, frustrating our users and too often getting in the way of the free expression we set out to enable.”
The move will lead to some tangible changes on the platform, which has around 3 billion users worldwide, not least giving people the freedom to use previously-banned words.
Some alterations are likely to heavily impact particular communities. Among them is the removal of a rule which banned people from calling transgender or non-binary people “it”.
Elsewhere in its newly-updated rules on hateful conduct, Meta has said: “We do allow allegations of mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism and homosexuality and common non-serious usage of words such as ‘weird’.”
According to the Guardian, it has also deleted warnings relating to self-admission of racism, homophobia and Islamophobia, as well as warnings for expressions of hate including “c***”, “d***” and “a******”.
The move, experts fear, could open the door to lower standards online.
Tal-Or Cohen Montemayor is the founder and executive director of CyberWell, an independent nonprofit focused on combatting online antisemitism and Holocaust denial on social media.
She told the Daily Star: “Meta’s recent announcement is not just about the rollout of Community Notes, following their fellow platforms X and YouTube – it is a systematic lowering of the bar on how Meta intends to enforce their Community Standards against hate speech and harassment online.”
She noted that her organisation was “deeply concerned” by the news, adding: “This change means one thing… more hate speech, more politicised content, more silos and less effective responses from the platforms,” and that it “particularly undermines the safety of all marginalized communities”.
The change, it has been claimed, is linked to the change of US president on January 20.
“I think it’s really been driven by the change of government in the US, with Trump coming back into office,” Sebastian Ellis, 39, Managing Director of London-based digital marketing agency Ellis Digital, told the Star. “He’s been very out there with his opinions on how Meta tried to allegedly cause problems with his campaign. A lot of Covid things were reduced or suppressed on the Meta network, and some people are arguing on the line that [Elon Musk’s] X way of doing things is the right way.”
The fact-checking system has, some claim, never been perfect, but fears remain that it offered a better solution than the alternatives.
Pija Ona Indriunaite, Brand Manager and Social Media expert at Omnisend, noted: “Fact-checking has never been a perfect solution to fake news and conspiracies… Given the sheer volume of content produced and spread across social media, it’s nearly impossible to track and evaluate everything effectively”.
She too, however, feared what the long-term consequences of the move could be. “On the other hand, even if fact-checking was inefficient, it provided a sense of security – like a safety net. It symbolised that someone was watching, ready to intervene in cases of malicious intent. Removing it feels like a withdrawal of support from the platform, leaving users exposed and defenseless.”
Bob Hutchins, author of Our Digital Soul: Collective Anxiety, Media Trauma, and Path Toward Recovery, flagged that the move could push some users away from Facebook, noting: “the immediate consequences will likely be disruptive. Platforms will become noisier, less reliable, and emotionally draining. This could lead to fatigue and disengagement, pushing users toward smaller, curated spaces where trust is easier to establish.”
Perhaps most concerningly, Mark Abbott, a Senior Associate and Parliamentary Agent at law firm Bates Wells, explained: “The move feels like an abdication of responsibility by Meta. ‘Freedom of speech’ does not mean allowing damaging misinformation to go unchallenged. Zuckerberg’s rationale is that ‘even experts have biases’ and therefore misinformation is better tackled by requiring the public at large to decide what is misinformation and what is not. But something can be popular without being correct; and can be correct without being popular.”
The Daily Star has contacted Meta for comment.