Credited from: BBC
Meta-owned Instagram announced that it will begin notifying parents if their teenage children repeatedly search for terms related to suicide or self-harm. The initiative expands Instagram's parental supervision tools, giving parents the information they need to support their children during mental health crises, according to CBS News and BBC.
The alert system will roll out next week to parents using Instagram's supervision tools in the US, UK, Australia, and Canada. It aims to close the gap between teenagers' online activity and parents' ability to intervene. Alerts will also include expert resources to guide parents in discussing these sensitive topics with their children, as reported by India Times and Reuters.
However, responses to these measures have been mixed. The Molly Rose Foundation criticized the notifications, arguing that they could induce panic among parents without preparing them for the difficult conversations that follow. Despite this critique, Meta maintains that it is providing parents and teens the support they need to manage online safety, as highlighted by LA Times and AA.
The alert system will work by detecting repeated searches for harmful terms and notifying parents through email, text, or in-app alerts. Instagram's existing policies already block these searches and redirect users to support resources. The safety feature is part of a broader effort to protect teens on social media amid increasing governmental scrutiny worldwide, particularly following Australia's recent social media restrictions for users under 16, according to BBC, Reuters, and LA Times.