Credited from: DAWN
Several European nations are pushing for tighter limits on children's access to social media, citing the harmful effects of online content. The growing prevalence of dangerous diet tips, cyberbullying, and disinformation has prompted demands for stricter regulation. Greece, together with France and Spain, is spearheading a proposal to establish a common age of digital adulthood across the European Union, below which minors would need parental consent to access social media platforms. The proposal is set to be discussed by EU officials in Luxembourg soon, according to The Local, Bangkok Post, and Dawn.
The proposal would establish a digital adulthood threshold within the EU, barring children below it from accessing social media without parental approval. Some member states already back specific limits: France supports an under-15 cutoff for social media access, while Spain advocates a similar restriction for users under 16. The push for such measures stems from concerns about the potential negative impact of social media on children's mental health and development, The Local, Bangkok Post, and Dawn report.
France has led the way by passing a law that requires social media platforms to obtain parental consent for users under 15, though the measure still awaits broader European Union approval. There are also ongoing efforts to regulate adult content sites, which would likewise have to verify users' ages to prevent underage access. Against this backdrop, TikTok recently moved to ban harmful hashtag trends, reflecting a broader pattern of platforms responding to governmental pressure, according to The Local, Bangkok Post, and Dawn.
Beyond the proposed restrictions, the initiative calls for an EU-wide application to enforce age verification and support parental controls on devices, and suggests that smartphones come equipped with built-in age verification tools. The European Commission has also signaled the launch of a new age-verification app designed to respect user privacy, aiming to address growing concerns about children's exposure to harmful online content. The move follows the publication of draft guidelines intended to safeguard minors, which are awaiting public consultation, according to The Local, Bangkok Post, and Dawn.
The EU is currently investigating major platforms, including Meta's Facebook and Instagram as well as TikTok, under its Digital Services Act, amid fears that they are failing to adequately filter content that could harm children. A separate investigation into four adult content platforms addresses suspicions that they have not done enough to keep minors away from inappropriate material. The EU's ongoing discussions on child sexual abuse material further complicate the landscape, with data access proposals raising privacy concerns, according to The Local, Bangkok Post, and Dawn.