Instagram on Thursday announced that it will begin notifying parents about aspects of their children's behaviour, including the searches they make on the platform.
The Meta-owned social media platform said in an announcement that Instagram will notify parents if their teenager repeatedly searches for terms related to suicide or self-harm within a short period, as pressure grows on governments to follow Australia's ban on social media use for under-16s.
Parents who are signed up to its optional supervision setting will now get an alert if their children try to access suicide or self-harm content. "These alerts build on our existing work to help protect teens from potentially harmful content on Instagram," the platform said in a statement. "We have strict policies against content that promotes or glorifies suicide or self-harm."
Instagram's existing policy is to block such searches and redirect users to support resources. The platform added that the alerts would begin next week for users signed up in the United States, Britain, Australia and Canada.
Governments are looking for ways to protect teenagers from harm online, particularly after concerns over the AI chatbot Grok, which has generated non-consensual sexualised images.
Instagram's "teen accounts" for under-16s need a parent's permission to change settings, while parents can select an extra layer of monitoring with the agreement of their teenager.