Safety campaigners urge Ofcom over Roblox ‘paedophile hellscape’ fears

Child safety campaigners in Britain today urged Ofcom to act after a bombshell report branded online game platform Roblox an ‘X-rated paedophile hellscape’.

The popular game in which young players create or play in virtual universes has been accused of exposing children to ‘grooming, pornography and violent content’.

US investment firm Hindenburg Research issued damning findings after a lengthy investigation into the controversial platform, which has no set age restrictions.

Now, campaigners say the report shows UK communications watchdog Ofcom must make a ‘step change’ as it implements then enforces the Online Safety Act (OSA).

The new law is due to start coming fully into force next year, and will place new duties on social media sites for the first time. The largest and most popular, as well as those which count children among their users, are set to face the strictest rules.

Platforms must put in place and enforce safety measures to ensure that users, and in particular young people, do not encounter illegal or harmful content – and that, if they do, it is quickly removed – with those who break the rules facing large fines.

Among the campaign groups urging Ofcom to act is the Molly Rose Foundation – established by the parents of 14-year-old Molly Russell from Harrow, North West London, who took her own life in 2017 after viewing harmful online content.

Popular gaming platform Roblox has ‘digital strip clubs’, according to the Hindenburg study

Gaming platform Roblox has become popular with children and has 80 million daily users

Roblox was accused of exposing children to ‘grooming, pornography and violent content’

Among the campaign groups urging Ofcom to act is the Molly Rose Foundation – established by the parents of 14-year-old Molly Russell from Harrow, North West London, who died in 2017

The charity’s chief executive Andy Burrows told MailOnline today: ‘Parents will be understandably shocked that a platform aimed at younger people can be so reckless with the mental health and safety of child users.

Charities urge Government to target smaller websites under Online Safety Act 

A group of charities and online safety campaigners have written to the Prime Minister, urging him to ignore advice from Ofcom around which websites to categorise as the most dangerous under the Online Safety Act.

The group of campaigners said the regulator’s advice that smaller websites should not be designated Category 1 – the rating which gives Ofcom the greatest scope of powers for oversight and regulation of that platform – left a number of ‘the most dangerous online forums’ not fully in scope of the regulation.

In guidance to the previous Conservative government, published in March, Ofcom proposed setting the threshold for what should be considered a Category 1 service under the new rules as those which disseminate content easily, quickly and most widely – proposing, among other things, that it should apply to sites with more than seven million UK users.

But, in an open letter to the Prime Minister, the campaigners argue that this approach would leave a number of smaller, but dangerous ‘suicide forums’ free of the most stringent rules, and urged the Technology Secretary Peter Kyle to use powers that enable him to determine which sites should be placed in Category 1 ‘based on functionality and other characteristics alone rather than requiring that they also be of a certain size’.

‘This would allow a limited number of small but exceptionally dangerous forums to be regulated to the fullest extent possible,’ the letter says.

‘These include forums that are permissive of dangerous and hateful content as well as forums that explicitly share detailed or instructional information about methods of suicide or dangerous eating disorder content.

‘Given the cross-party support for such an approach to regulation of these platforms, we were dismayed to see that Ofcom, in its recently published advice to the previous Secretary of State on categorisation, explicitly recommended not using this power to address these extremely dangerous sites.’

The open letter has been signed by a number of leaders from charities including Samaritans, Mind, the Mental Health Foundation, the Molly Rose Foundation and online safety groups such as the Centre for Countering Digital Hate and bereaved families.

The letter highlights a report which links one such forum to ‘at least 50 UK deaths’, adding ‘we understand that the National Crime Agency is investigating 97 deaths in the UK thought to be related’ to the site in question.

The group argues that this ‘highly dangerous suicide forum’ should be regulated ‘at the same level as sites like Facebook and Instagram’ in order to make them ‘accountable’ for the content they allow to appear on their platform.

The letter also notes that there are similar issues around sites hosting antisemitic and Islamophobic content, as well as smaller platforms being used to ‘stoke this summer’s racist riots’.

‘We would argue that the events of the summer, in tandem with the ongoing human cost of a growing number of suicides, are sufficient evidence in themselves to justify the Secretary of State deciding to divert from Ofcom’s advice and set the categorisation thresholds for the regime in the most robust and expansive way the Act allows,’ the letter says.

‘Ofcom’s current recommendations, which involve services having content recommendation systems, and having the functionality for users to forward or re-share content, in addition to having a large size, would do nothing at all to address the services we are concerned about.

‘We hope that you will be able to take action on addressing this major oversight in the advice that the government has been given by Ofcom.’

The Online Safety Act is due to start coming fully into force next year and will place new duties on social media sites for the first time, with the largest and most popular, as well as those which count children among their users, set to face the strictest rules.

Platforms will be required to put in place and enforce safety measures to ensure that users, and in particular young people, do not encounter illegal or harmful content, and that, if they do, it is quickly removed, with those who do not adhere to the rules facing large fines.

An Ofcom spokesperson said: ‘There should be no doubt that these sorts of harmful websites will be tightly regulated.

‘From next year, any sites that don’t comply with their illegal content and child safety duties will be in breach of our regulations, and we will use the full extent of our powers to take action against them.

‘Additional duties such as producing transparency reports will be a powerful tool in making larger platforms safer. But they would do little to tackle the harm done by smaller, riskier sites – and could even attract attention to them.’

A Government spokesperson said: ‘Too many people are affected by the tragedy of suicide, which is so often preventable.

‘The Secretary of State is working steadfastly to deliver the Online Safety Act, which will stop children seeing material that promotes self-harm and suicide.

‘He recently wrote to Ofcom to request an update on how it intends to monitor such services, using the full force of their enforcement powers.’

‘Ofcom will rightly be judged by whether it takes swift and comprehensive action against platforms that seem asleep at the wheel on online safety risks.

‘This report underscores the growing evidence that child safety shortcomings aren’t a glitch but rather a systemic failure in how online platforms are designed and run.

‘The Online Safety Act remains the most effective route to keep children safe, but such preventable safety lapses will only be addressed if Ofcom delivers a step change in its ambition and determination to act.’

Roblox has become popular with young children – offering a series of different games and interactive tools, as well as the ability to create your own challenges.

But the Hindenburg study raised concerns about inappropriate links which are at a child’s fingertips when accessing the site.

The report states: ‘Core to the problem is that Roblox’s social media features allow paedophiles to efficiently target hundreds of children, with no up-front screening to prevent them from joining the platform.’

It also says there are ‘digital strip clubs, red light districts, sex parties and child predators lurking on Roblox’, which has 80 million daily users.

And Roblox was accused of being ‘an X-rated paedophile hellscape, exposing children to grooming, pornography, violent content and extremely abusive speech’.

Hindenburg also said it found multiple accounts named using variations of the name of disgraced paedophile financier Jeffrey Epstein.

Roblox rejected the allegations in Hindenburg’s report, insisting safety was ‘foundational’ to the company.

And Hindenburg has stated that it is trying to profit from a fall in Roblox’s value by taking out a ‘short’ position on the company’s share price.

Online safety campaigner Beeban Kidron said the OSA should ‘significantly up the game’ on making sure there are in-built safety measures on tech platforms.

She told The Guardian: ‘Roblox is a consumer-facing product – and, in order to trade, it has to be safe for children. And it has to have by-design mechanisms that mean it does not enable predators to convene or search for children.

‘We need political will and leadership to strengthen the provisions of the OSA and a regulator willing to implement them.’

In response, an Ofcom spokesman told MailOnline: ‘The Online Safety Act will have a significant impact in creating a safer life online in the UK.

‘When the new duties come into force, platforms – such as Roblox – will be required to protect children from pornography and violence, take action to prevent grooming, remove child abuse images, and introduce robust age-checks.

‘We have set out clear recommended measures for how they can comply with these requirements in our draft codes. If platforms fail to comply when the time comes, we’ll have a broad range of enforcement powers to hold them accountable for the safety of their users.’

It comes after Colin Stitt, head of the Safer Schools campaign at INEQE Safeguarding Group, the largest independent safeguarding organisation in the UK and Ireland, urged parents to beware of potential dangers.

He told MailOnline last week: ‘The report is a stark reminder that we can’t simply assume a platform is safe for children just because it looks child-friendly.

‘Parents and carers need to be proactive and educate themselves about the potential risks their children may face online, including exposure to inappropriate content, online predators, and harmful social interactions.

‘It’s crucial for parents and carers to have open and honest conversations with their children about online safety, empowering them to navigate digital spaces responsibly.’

A Roblox spokesperson said in a statement in response to the Hindenburg study: ‘We totally reject the claims made in the report.

‘Safety and civility have been foundational to Roblox since our inception nearly two decades ago, and we have invested heavily throughout our history in our Trust and Safety efforts.

‘Every day, tens of millions of users of all ages have safe and positive experiences on Roblox and abide by the company’s community standards. However, any safety incident is horrible.

‘We take any content or behaviour on the platform that doesn’t abide by our standards extremely seriously and we have a robust set of proactive and preventative safety measures designed to catch and prevent malicious or harmful activity on our platform.’

Roblox also said that the company ‘fully’ intended to comply with the OSA.

The spokesperson added: ‘Our internal teams have been assessing the obligations and have been engaging in the various consultations and calls for evidence Ofcom have published. We look forward to seeing Ofcom’s final codes of practice.’

Hindenburg Research teams investigating Roblox attempted to create accounts using names of infamous child molesters and criminals.

The Hindenburg study also claimed there are ‘sex parties’ on online gaming platform Roblox 

Researchers in the US looking into Roblox found 600 hits when searching ‘Diddy’ games

Roblox was accused by Hindenburg Research of being ‘an X-rated paedophile hellscape’

They tried to set one up under the name of Earl Brian Bradley – a paedophile convicted of molesting 103 children – but found the username was already taken, along with a host of variants, including earlbrianbradley69.

Children ‘doom scrolling’ on phones for hours a day ‘causing widespread harm’

Children who are ‘doom scrolling’ for hours a day on smartphones are at risk of widespread harm, an MP has warned.

The equivalent of ‘seatbelt’ legislation is needed for children and their social media use to help them manage addictive content, according to former teacher Josh MacAlister.

Tomorrow, the Labour MP for Whitehaven and Workington will introduce a Private Member’s Bill (PMB) in Parliament on protecting children from harms caused by excessive screen time.

The Bill, which aims to empower families and teachers to cut down on children’s daily smartphone screen time, will call for a legal requirement to be introduced so all schools in England are mobile-free zones.

In February, schools in England were given guidance under the former Conservative government intended to stop the use of mobile phones during the school day, but it is currently non-statutory.

The Bill is also expected to call for the age at which companies can get data consent from children without parental permission to be raised from 13 to 16 to make smartphones less addictive.

Other proposals include strengthening watchdog Ofcom’s powers to protect children from apps that are designed to be addictive, and committing the Government to review further regulation if needed of the design, supply, marketing and use of mobile phones by children under the age of 16.

Mr MacAlister, who led an independent review into children’s social care for the former government, said: ‘The evidence is mounting that children doom scrolling for hours a day is causing widespread harm. We need the equivalent of the ‘seatbelt’ legislation for social media use for children.’

There were 900 variations of the name ‘Jeffrey Epstein’, plus 600 hits when searching ‘Diddy’ games.

Some of these games were titled ‘Diddy Party’, ‘Run from Diddy Simulator’ and, separately, ‘Escape to Epstein Island.’

Youngsters on ‘child accounts’ – that is, under the age of 13 – were also able to search ‘adult’ to find inappropriate content.

According to Roblox, 21 per cent of users are under the age of nine.

Researchers have now said they were easily able to track down games and groups trading child pornography and soliciting sexual favours.

Roblox self-reported more than 13,000 incidents of child exploitation to the US National Center for Missing and Exploited Children last year.

The new research refers to various high-profile cases of paedophiles using Roblox to groom victims.

In Wales in 2018, a 29-year-old man was arrested for grooming 150 children via the game.

Another man was charged in New Jersey after kidnapping an 11-year-old girl he met while playing the game – he later pleaded guilty to one count of coercion and enticement of a minor.

Police told how Darius Matylewich took the girl from her home town after chatting on multiple gaming platforms, including Roblox.

And Arnold Castillo, 23 and also from New Jersey, was sentenced to 15 years in federal prison in 2023 after admitting transportation of a minor with intent to engage in criminal sexual activity and coercion and enticement of a minor.

He paid for an Uber to pick the 15-year-old girl up and bring her across state lines to his home, before sexually abusing her and trying to purchase ‘Plan B’ to ensure she did not fall pregnant.

Roblox has previously faced criticism from celebrities including ex-Brookside star Claire Sweeney, who told social media how her son was almost scammed online.

The actress revealed on X in December 2022 that seven-year-old Jaxon was playing on Roblox when she saw he had been sent a message instructing him to reveal her credit card numbers.

She posted: ‘Parents out there, anyone’s kids play Roblox? Yesterday my son was playing, and I saw on the chat in the corner, someone was asking him to go to my purse, take out my credit card and read the numbers!!

‘@Roblox seems a dangerous place for kids. Thankfully he told me!’

Roblox said in response to the criticisms: ‘We totally reject the claims made in the report’

An 11-year-old girl from Wayne in New Jersey was kidnapped and taken across state lines by 27-year-old Darius Matylewich (pictured) to Bear in Delaware, after they chatted on Roblox

Hindenburg found multiple accounts named using variations of the name of disgraced paedophile financier Jeffrey Epstein (pictured with Ghislaine Maxwell in New York in 2005)

She was flooded with messages from other worried parents, while broadcaster Jeremy Vine replied: ‘Did a piece on this @BBCRadio2! Horrible stories.

‘One 13-year-old encouraged to run away from home to meet ‘a teenage girl’ who turned out to be a large man in the Czech Republic.’

Earlier that same year, Kim Kardashian threatened to sue Roblox after her son Saint, then aged six, saw an ad for her ‘unreleased’ sex tape.

The reality TV star was left shocked as she saw Saint laughing at his iPad while playing Roblox, only to see a picture of her crying face on the screen captioned with the words ‘Kim’s New Sex Tape’.

She said at the time: ‘Thank God he can’t f***ing read yet.

‘And it’s like, over my dead body is this s**t going to happen to me again. I just want it gone. This is not gonna f*** with me. It’s not, so I just want it gone.’

Roblox said it removed the room and banned its creator – and no sex tape was ever available.