TikTok will roll out new age-verification technology across the European Union in the coming weeks. The move coincides with growing calls in several countries, including the UK, for stricter limits on social media use by minors, following Australia's recent ban on accounts for users under 16.
The age-verification system, which has been in a pilot phase across the EU for nearly a year, uses algorithms to analyze user profiles, video content, and behavioral patterns in order to identify accounts likely to belong to children under 13. TikTok says flagged accounts will be reviewed by specialized moderators rather than banned automatically, allowing for a more nuanced assessment of each case. In earlier tests in the UK, the system led to the removal of thousands of accounts.
In late 2024, Australia legislated a ban on social media access for users under 16, which took effect on December 10, 2025. Since then, the country's eSafety Commissioner has reported the removal of more than 4.7 million accounts across ten platforms, including TikTok, Instagram, and Facebook. Such measures have fueled debate in Europe over the adequacy of age-verification processes as authorities seek stronger compliance with data protection regulations.
Increasing Scrutiny of Social Media Use Among Young People
Recent comments from UK Prime Minister Keir Starmer indicate a shift in his stance on social media use by young people. Speaking in Parliament, Starmer expressed concern about the amount of time children and teenagers spend on smartphones, citing reports of five-year-olds spending hours on screens daily and voicing growing worry about the negative effects of social media on users under 16.
Previously, Starmer had opposed outright bans, arguing that such measures would be difficult to enforce and could inadvertently drive teens towards more dangerous online spaces, such as the dark web. His change in tone reflects a broader societal concern about the safety of young users online.
Calls for enhanced parental rights regarding access to deceased children’s social media accounts have also emerged in the wake of tragic incidents. Notably, Ellen Roome, whose 14-year-old son Jools Sweeney died following an online challenge, has advocated for more transparency and rights for parents in such circumstances.
Regulatory Efforts Across Europe
The European Parliament is actively pursuing legislation aimed at establishing age limits for social media platforms. In particular, Denmark has proposed a ban on social media usage for individuals under 15. TikTok has indicated that its new age-verification technology has been specifically designed to align with the EU’s regulatory framework. The company has collaborated with Ireland’s Data Protection Commission, the lead EU privacy regulator, in developing this system.
In a related investigation, a 2023 report from The Guardian raised questions about TikTok's existing moderation practices, revealing that moderators had been instructed to let users under 13 remain on the platform if they claimed a parent was overseeing their account. The finding underscores the ongoing scrutiny social media platforms face over keeping younger users safe.
As TikTok moves forward with its enhanced age-verification measures, the platform’s response will be closely monitored by regulators and advocates alike, highlighting the critical importance of safeguarding children in the digital age.