The European Union has opened a formal investigation into TikTok over allegations that it has failed to protect minors online, under a groundbreaking new law governing digital content. The investigation is the second inquiry into a major online platform since the Digital Services Act (DSA) came into force, after Elon Musk’s X became the first target in December.
Brussels’ primary concern is that TikTok, owned by China’s ByteDance, may not be doing enough to address the harm its service can cause young users. Of particular worry is the “rabbit hole” effect, in which algorithmically generated recommendations expose users to increasingly risky content.
The European Commission’s concerns extend to TikTok’s age verification mechanisms, which it says may not be reasonable, proportionate, or effective. Formal proceedings have been opened to assess whether the platform has breached the DSA in areas including advertising transparency and data access for researchers.
The action follows the Commission’s examination of a risk assessment report submitted by TikTok and of the platform’s responses to Brussels’ inquiries about its measures against illegal content, the protection of minors, and data access.
Regulators will continue gathering evidence, and the commission has stressed its authority to take further enforcement action if necessary. Internal Market Commissioner Thierry Breton pointed to TikTok’s significant reach among children and teenagers, with over 142 million monthly users across the EU, up from 125 million a year earlier, and underlined the platform’s obligation to comply fully with the DSA and its pivotal role in safeguarding minors online. Breton reiterated the commission’s commitment to the physical and emotional well-being of young Europeans and said no effort would be spared to protect children.