TikTok fined in Italy after ‘French scar’ challenge led to consumer safety probe

Italy’s competition and consumer authority, the AGCM, has fined TikTok €10 million (almost $11 million) following a probe into algorithmic safety concerns.

The authority opened an investigation last year into a “French scar” challenge in which users of the platform were reported to have shared videos of marks on their faces made by pinching their skin.

In a press release Thursday, the AGCM said three regional companies in the ByteDance group had been sanctioned for what it summarized as an “unfair commercial practice”: Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited and TikTok Italy Srl.

“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals. Moreover, this content is systematically re-proposed to users as a result of their algorithmic profiling, stimulating an ever-increasing use of the social network,” the AGCM wrote.

The authority said its investigation confirmed TikTok’s responsibility in disseminating content “likely to threaten the psycho-physical safety of users, especially if minor and vulnerable,” such as videos related to the “French scar” challenge. It also found the platform did not take adequate measures to prevent the spread of such content and said it failed to fully comply with its own platform guidelines.

The AGCM also criticized how TikTok applies the guidelines — which it says are applied “without adequately accounting for the specific vulnerability of adolescents.” It pointed out, for example, that teens’ brains are still developing and young people may be especially at risk as they can be prone to peer pressure to emulate group behavior to try to fit in socially.

The authority’s remarks particularly highlight the role of TikTok’s recommendation system in spreading “potentially dangerous” content, pointing out the platform’s incentive to drive engagement and increase user interactions and time spent on the service to boost ad revenue. The system powers TikTok’s “For You” and “Following” feeds and is, by default, based on algorithmic profiling of users, tracking their digital activity to determine what content to show them.

“This causes undue conditioning of users who are stimulated to increasingly use the platform,” the AGCM suggested in another remark that’s notable for being critical of engagement driven by profiling-based content feeds.

We’ve reached out to the authority with questions. But its negative assessment of the risks of algorithmic profiling looks interesting in light of renewed calls by some lawmakers in Europe for profiling-based content feeds to be off by default.

Civil society groups, such as the Irish Council for Civil Liberties (ICCL), also argue this would shut off the outrage tap that ad-funded social media platforms monetize through engagement-focused recommender systems, which have the secondary effect of amplifying division and undermining societal cohesion for profit.

TikTok disputes the AGCM’s decision to issue a penalty.

In a statement, the platform sought to play down the authority’s assessment of the algorithmic risks posed to minors and vulnerable individuals by framing the intervention as related to a single controversial but small-scale challenge. Here’s what TikTok told us:

We disagree with this decision. The so-called “French Scar” content averaged just 100 daily searches in Italy prior to the AGCM’s announcement last year, and we long ago restricted visibility of this content to U18s, and also made it ineligible for the For You feed.

While the Italian enforcement is limited to one EU member state, the European Commission is responsible for overseeing TikTok’s compliance with algorithmic accountability and transparency provisions in the pan-EU Digital Services Act (DSA), where penalties for noncompliance can scale up to 6% of global annual turnover. TikTok was designated as a very large online platform under the DSA back in April last year, with compliance expected by late summer.

One notable change as a result of the DSA is that TikTok now offers users non-profiling-based feeds. However, these alternative feeds are off by default, meaning users remain subject to AI-based tracking and profiling unless they take action themselves to switch them on.

Last month the EU opened a formal investigation of TikTok, citing addictive design and harmful content and the protection of minors as among its areas of focus. That procedure remains ongoing.

TikTok has said it looks forward to the opportunity to provide the Commission with a detailed explanation of its approach to safeguarding minors.

However, the company has had a number of earlier run-ins with regional enforcers concerned about child safety in recent years, including a child safeguarding intervention by the Italian data protection authority; a fine of €345 million last fall over data protection failures also related to minors; and long-running complaints from consumer protection groups that are worried about minor safety and profiling.

TikTok also faces the possibility of increasing regulation by member state–level agencies applying the bloc’s Audiovisual Media Services Directive. One such agency is Ireland’s Coimisiún na Meán, which has been considering applying rules to video-sharing platforms that would require profiling-based recommender algorithms to be turned off by default.

The picture is no brighter for the platform in the U.S., either. Lawmakers there have just proposed a bill to ban TikTok unless it cuts ties with Chinese parent ByteDance, citing national security concerns and the potential for the platform’s tracking and profiling of users to give a foreign government a route to manipulate Americans.