Source: BBC
Meta's Instagram announced a new safety feature that will alert parents if their teenage children repeatedly search for terms related to suicide or self-harm within a short timeframe. This initiative is part of a broader effort to mitigate concerns surrounding the impact of social media on mental health among young users, particularly as pressure mounts on companies to enhance child safety online. The alerts will begin rolling out next week in the United States, the United Kingdom, Australia, and Canada, according to CBS News, BBC, and Reuters.
Parents using Instagram's child supervision tools will receive notifications through several channels, including email, text message, and the Instagram app. Unlike previous measures, which only blocked searches and redirected users to external resources, this proactive alert system is designed to keep parents informed when their child seeks out dangerous content. Meta noted that the alerts include expert resources to help parents navigate sensitive conversations about mental health, according to BBC and Reuters.
The announcement comes as governments intensify scrutiny of social media platforms amid rising concerns over young people's safety online. After Australia moved to ban social media access for those under 16, several countries, including the UK, Spain, and Greece, are weighing similar regulations to strengthen children's online protections. These developments unfold amid ongoing debate about social media companies' responsibilities toward younger users, as illustrated by recent legal scrutiny of Meta's leadership, according to CBS News and Reuters.