Parents across the UK to get new Instagram notifications about their children’s app use
Instagram will introduce new notifications for parents so they get an alert if their child repeatedly tries to search for suicide or self-harm terms within a short period of time
Parents will start being notified if their teen repeatedly tries to search for suicide or self-harm terms within a short period of time.
Instagram introduced Teen Accounts for users under 16 in 2024 to limit who can contact teens and to restrict the content they see.
In a further update to the accounts, Instagram will start notifying parents about their teens’ search activity in the coming weeks if they sign up to a new supervision tool.
But an online safety campaigner hit out at the “clumsy” plans and warned that the “flimsy” notifications risk leaving parents panicked. They warned the buck must not be passed on to parents, and that tech firms like Instagram should instead do more to address the risks in the first place.
Parents will receive an alert if a teen searches for phrases promoting suicide or self-harm, phrases suggesting they want to harm themselves, or terms like ‘suicide’ or ‘self-harm’. Instagram already blocks attempts to search for this content, but the notifications are intended to make a parent aware if a child is repeatedly trying to find it.
Instagram said it will also provide parents with information they may need to support their teen, including how to approach sensitive conversations.
The notifications will be sent to parents via email, text, or WhatsApp, depending on the contact information available, as well as through an in-app notification.
Alerts will roll out next week to parents who use Instagram’s parental supervision tools in the US, UK, Australia, and Canada, and will become available in other regions later this year.
The UK government has ramped up action in recent weeks to tackle the country’s online harms crisis. Children are already banned from seeing harmful content such as suicide and self-harm material under the Online Safety Act.
But campaigners have long warned that the legislation has holes in it and that further action is needed to ensure children have healthier experiences online.
The Government will next month launch the children’s digital wellbeing consultation to gather evidence on the best solution to the online harms crisis. Under the plans, children could face a social media ban and restrictions on addictive apps to boost their safety online.
The three-month consultation will be guided by what parents and children say they need now, not in several years’ time. Technology Secretary Liz Kendall has vowed to act swiftly once the consultation is finished, while Keir Starmer has said he will act in “months, not years” to protect young people from addictive social media.
Andy Burrows, chief executive of Molly Rose Foundation, said: “This clumsy announcement is fraught with risk and we are concerned that forced disclosures could do more harm than good. Every parent would want to know if their child is struggling, but these flimsy notifications will leave parents panicked and ill-prepared to have the sensitive and difficult conversations that will follow.
“Our research shows Instagram’s algorithm still actively recommends harmful depression, suicide and self-harm material to vulnerable young people and the onus should be on addressing these risks rather than making yet another cynically timed announcement that passes the buck to parents.”
