Instagram announced that teen accounts will be limited to PG-13-style content by default, with the changes taking effect immediately. Teenagers using these accounts will see content comparable to what is permissible in a PG-13 movie, excluding material involving sex, drugs, or dangerous stunts. The intent behind the update is to create a safer online experience for adolescents, according to SFGATE and SCMP.
To enforce these standards, Meta has introduced several measures, including hiding posts that contain strong language or depict risky stunts and restricting content that may promote harmful behavior, such as posts showing marijuana paraphernalia. The updates reflect Meta's response to persistent criticism that it has failed to adequately protect younger users from inappropriate content. A spokesperson said the adjustments are meant to reassure parents as the company works to show teens content that is safe and appropriate for their age, according to Reuters and CBS News.
Instagram's new policies will automatically place teen accounts under the PG-13 setting, and opting out will require parental consent. Teens will also be barred from interacting with accounts that frequently share content deemed inappropriate for their age, such as those linked to OnlyFans. Teens who already follow such accounts will no longer be able to view or engage with their content or send them direct messages, according to Africa News and Reuters.
In addition to these restrictions, Meta already blocks search terms related to sensitive topics such as suicide and eating disorders, and it will expand that list to include terms like "alcohol" and "gore." The updated policy will also apply to artificial intelligence features within Instagram, ensuring that AI chatbots do not give teens age-inappropriate responses. These measures respond to earlier criticism that content safety protections for young users were inadequate, according to CBS News and Africa News.
The rollout of these new content restrictions is expected to focus initially on the U.S., UK, Canada, and Australia, with plans for further expansion. This update follows similar initiatives by other platforms, aiming to enhance protections for young users against potentially harmful content on social media, as noted by SCMP and Africa News.