EU opens investigation into TikTok over possible breaches of the Digital Services Act

The DSA, which took effect on February 17 for all platforms operating in the EU, governs how online services handle illegal and harmful content. The European Commission has now opened formal proceedings against TikTok over potential violations in several areas, including child protection.

This follows a preliminary investigation prompted by a risk assessment report TikTok submitted to the Commission, as well as formal Requests for Information the institution issued on issues such as illegal content, the protection of minors, and data access.

Thierry Breton, the European Commissioner for Internal Market, said in a post on X that the investigation will focus on transparency and the safeguarding of minors, including addictive design, screen-time limits, age verification, and default privacy settings.

According to a report by Reuters, Breton said, “The protection of minors is a top enforcement priority for the DSA. As a platform that reaches millions of children and teenagers, TikTok must fully comply with the DSA and has a particular role to play in the protection of minors online.”

Under the DSA, platforms with more than 45 million monthly active users in the EU are automatically designated as very large online platforms, which obliges them to assess and mitigate systemic risks, including risks to child safety, and to put appropriate content moderation measures in place.

The Commission also stressed that DSA compliance requires platforms to assess and mitigate systemic risks. However, it raised concerns that existing measures, including TikTok’s age verification tools, may not be sufficiently reasonable, proportionate, or effective.

Furthermore, the Commission highlighted the need for platforms to have appropriate measures to protect minors, including suitable default privacy settings for minors as part of the design and functioning of their recommender systems. Compliance also means maintaining a searchable and reliable repository of advertisements shown on TikTok.

The enquiry into TikTok’s platform transparency efforts focuses on potential shortcomings in providing researchers access to publicly available data, as mandated by the DSA.

Concerning addictive design, the concern is that platforms encourage compulsive use through features such as the "rabbit hole effect", in which recommendation algorithms serve users ever more of the same type of content based on their interactions, potentially leading to excessive screen time.
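To make that mechanism concrete, the sketch below shows, in simplified Python, how a purely engagement-driven recommender can narrow a feed over time: each interaction boosts the weight of the watched category, so later recommendations skew further toward it. The categories, weights, and update rule are illustrative assumptions for a toy simulation, not a description of TikTok's actual algorithm.

```python
import random

# Illustrative content categories; not a real platform taxonomy.
CATEGORIES = ["sports", "cooking", "dance", "news", "gaming"]


def recommend(weights, rng):
    """Pick the next video's category in proportion to learned weights."""
    return rng.choices(CATEGORIES, weights=[weights[c] for c in CATEGORIES])[0]


def simulate_session(n_videos=50, boost=1.5, seed=42):
    """Simulate a user who watches every recommended clip to the end.

    Each watch multiplies the weight of that category ('boost'), so the
    feed drifts toward whatever the user engaged with first -- a toy
    version of the "rabbit hole effect".
    """
    rng = random.Random(seed)
    weights = {c: 1.0 for c in CATEGORIES}  # start from a neutral feed
    history = []
    for _ in range(n_videos):
        category = recommend(weights, rng)
        history.append(category)
        weights[category] *= boost  # engagement feedback loop
    return history


if __name__ == "__main__":
    feed = simulate_session()
    shares = {c: feed.count(c) / len(feed) for c in CATEGORIES}
    print("Share of feed by category after 50 videos:", shares)
```

Running the simulation typically shows one or two categories dominating the feed after a few dozen interactions, which is the narrowing dynamic regulators describe when discussing addictive design.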

The Commission will now conduct an in-depth investigation to gather evidence of potential DSA violations by TikTok. This could result in enforcement actions such as interim measures or a non-compliance decision, or TikTok may propose remedies. However, there is no fixed deadline for concluding such an investigation.

 
