Study reveals how TikTok recommends self-harm videos to young people

Eating disorders, suicide, harassment… In its latest report, the Center for Countering Digital Hate (CCDH) revealed that the social network TikTok uses a recommendation algorithm that promotes the spread of self-destructive content to young adolescents.

The CCDH, an American organization that fights against online hate, published a study this week calling into question TikTok's video-recommendation system. Its researchers created eight fake new accounts in different countries on the platform, which has become particularly popular with young people.

The study revealed that TikTok recommended suicide content within 2.6 minutes. Within 8 minutes, the social network served content related to eating disorders, and every 39 seconds TikTok recommended videos on body image and mental health to teens. According to the NGO, young people’s feeds are bombarded with harmful content that can have a significant cumulative impact on their understanding of the world around them and on their physical and mental health.

Researchers note that TikTok’s recommendation algorithm builds an infinitely scrolling personalized “For You” feed that “is apparently based on the user’s likes, follows, viewing time and interests”. The study showed that users searching for eating-disorder content often choose usernames containing associated language, frequently including the term “lose weight”. In this way, TikTok identifies a user’s vulnerability and exploits it.

“Vulnerable accounts in our study received 12 times more self-harm and suicide video recommendations than standard accounts. Young people engaging with this content face a staggering onslaught of recommended videos growing in their feeds”, explained Imran Ahmed, CEO of the CCDH.

The analysis found that TikTok hashtags hosting eating-disorder content have been viewed more than 13.2 billion times. Some pro-eating-disorder content escapes moderation by using coded hashtags, even co-opting the name of singer Ed Sheeran. The report demonstrates that a new TikTok account created by a 13-year-old user who watches and likes content about body image and mental health will have such content recommended every 39 seconds. Experts have warned that this type of content can have a detrimental effect on teen mental health, even when it does not explicitly promote eating disorders.

Regarding content relating to self-harm, the CCDH collected data from the phone of Molly Russell, a teenage girl whose death was linked to harmful social media content about self-harm. The data shows that she created a new Twitter account under the username “Idfc_nomore”, which the report interprets as meaning “I don’t care about anything anymore”, in order to consume this content. According to the organization, a coroner’s inquest in the United Kingdom concluded, for the first time, that social networks contributed to the 14-year-old girl’s suicide.

The girl had liked, shared or saved 2,100 posts related to suicide, self-harm or depression on Instagram in the six months before her death. The inquest into Molly’s death showed that the negligence of these platforms has real consequences for the lives of those affected, and that comprehensive regulation is needed to protect children online.
