TikTok Algorithms Reportedly Promote Videos About Eating Disorders and Self-Harm to Teens

TikTok says it takes a protective approach toward its young users. The platform does not allow individuals younger than 13 to sign up for the video streaming service, and it prohibits videos that encourage suicide or eating disorders.

A report published on Wednesday, however, found that TikTok's algorithms nonetheless promote clips concerning eating disorders and self-harm to vulnerable users such as teenagers.

Researchers Made TikTok Accounts to Check Algorithms 

The report highlights many concerns about TikTok and its potentially devastating effects on teenagers' mental health. Researchers at the Center for Countering Digital Hate created accounts on the platform for fictional teens in Canada, Australia, the United Kingdom, and the United States. Using these accounts, they liked videos about eating disorders and self-harm to see how the platform's algorithms would respond.

Soon afterward, the platform began recommending videos about weight loss and self-harm, including clips that discussed suicide, displayed razor blades, and featured pictures of models with idealized body types.

Researchers at the non-profit organization, which has branches in the United States and the United Kingdom, also signed up for accounts with user names suggesting a particular vulnerability to eating disorders, adding words like "weight loss" to the names. Those accounts were fed even more harmful content.

According to Imran Ahmed, CEO of the Center for Countering Digital Hate, the experience is like being stuck in a hall of distorted mirrors, where users are constantly told negative things about themselves.

Telling a teenager that they are not smart enough, that they are ugly, that they cannot do anything, or that maybe they should go away or kill themselves conveys the most dangerous possible messages to young people, the CEO said.

Algorithms Can Promote Harmful Content

TikTok's algorithms identify a user's topics and interests and serve up more of the same to increase their time on the platform. According to social media critics, the same algorithms that promote videos about a particular hobby, dance, or sports team can also send users a stream of harmful videos.

Fairplay's executive director, Josh Golin, said, "Content concerning self-harm and eating disorders is a specific problem for children and teenagers, who tend to spend hours online. Such users could be vulnerable to peer pressure, bullying, or posts about suicide or eating disorders."

Fairplay is a non-profit organization that supports online protections for teens and children. Golin said TikTok, like other social media platforms, has failed to protect teens from aggressive data collection and harmful content. "These harms are linked to the business model, regardless of which social media platform it is," Golin added.

TikTok Disputes the Findings 

TikTok disputed the findings, noting that the researchers did not use the platform the way typical users do. The company said that a user name should not affect the kind of content an account receives. It reiterated that TikTok does not allow users younger than 13, and said that US users who search for content about eating disorders are shown a prompt directing them to mental health resources.