A controversial move is underway as TikTok prepares to roll out advanced age-verification technology across the EU. The development comes amid growing calls for an Australia-style social media ban for under-16s, a debate that has intensified in countries such as the UK.
ByteDance-owned TikTok, along with other major platforms popular among young users, such as YouTube, is facing mounting pressure to enhance its ability to identify and remove accounts belonging to children.
The new system, which has been quietly tested in the EU over the past year, analyzes profile information, posted videos, and behavioral patterns to predict whether an account might belong to a user under the age of 13.
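TikTok has not published how its model works, so the following is purely an illustrative sketch of the general idea described above: combining profile, content, and behavioral signals into a single under-13 likelihood score. Every feature name, weight, and threshold here is invented for illustration and does not reflect TikTok's actual system.

```python
import math


def under_13_score(features: dict) -> float:
    """Combine account signals into a probability-like score in [0, 1].

    A real system would learn its weights from labeled data; these
    hypothetical weights simply show how profile, video, and behavioral
    signals could be blended into one score.
    """
    weights = {
        "bio_mentions_school_grade": 2.0,    # profile-information signal
        "video_audio_child_voice": 1.5,      # posted-video signal
        "active_during_school_hours": 0.8,   # behavioral signal
        "follows_mostly_kid_creators": 1.2,  # behavioral signal
    }
    bias = -3.0  # most accounts belong to adults, so start the score low
    z = bias + sum(weights[name] * float(features.get(name, 0)) for name in weights)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing into [0, 1]


def flag_for_review(features: dict, threshold: float = 0.5) -> bool:
    """Flag an account for human review; never remove it automatically."""
    return under_13_score(features) >= threshold
```

Note that the sketch only flags accounts rather than removing them, mirroring the article's point that flagged accounts go to specialist moderators before any removal decision.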
TikTok has clarified that accounts flagged by the system will be reviewed by specialist moderators, and removed only after that review. A UK pilot, for instance, led to the removal of thousands of accounts.
Meta, the parent company of Facebook and Instagram, has taken similar steps, using the age-verification company Yoti to check users' ages on Facebook.
In December, Australia implemented a social media ban for people under the age of 16. The country's eSafety commissioner revealed that since the ban's implementation on December 10, over 4.7 million accounts have been removed across ten platforms, including YouTube, TikTok, Instagram, Snap, and Facebook.
The rollout of TikTok's new system coincides with European authorities' increased scrutiny of how platforms verify users' ages under data protection rules.
Keir Starmer, the UK's prime minister, recently expressed openness to a social media ban for young people in the UK. The shift in stance was driven by concerns over the amount of time children and teenagers spend on their smartphones and the potential harm social media can inflict on under-16s.
Starmer had previously opposed such a ban, arguing that it would be difficult to enforce and could drive teenagers towards the dark web, which adds a further layer to the debate.
Ellen Roome, whose 14-year-old son Jools Sweeney died after an online challenge went awry, has called for more rights for parents to access their children's social media accounts in the event of their death.
The European Parliament is advocating for age limits on social media, while Denmark is pushing for a ban on social media for those under 15.
TikTok has emphasized that the new technology is specifically designed to meet the EU's regulatory requirements. The company has collaborated with Ireland's Data Protection Commission, its lead EU privacy regulator, during the system's development.
In 2023, a Guardian investigation revealed that moderators were instructed to allow under-13s to remain on the platform if they claimed their parents were overseeing their accounts.
The ongoing debate raises difficult questions about how to balance online safety against privacy, especially for young users, and whether stricter age limits on social media platforms are the best way to protect them.