According to a new report released Wednesday by the Center for Countering Digital Hate (CCDH), TikTok is recommending self-harm and eating disorder content to some users within minutes of joining the platform.
In the study, researchers set up TikTok accounts posing as 13-year-old users interested in content about body image and mental health. TikTok’s algorithm recommended suicide-related content within 2.6 minutes of an account joining the app, the report found, and eating disorder content within 8 minutes.
Over the course of this study, researchers found 56 TikTok hashtags hosting eating disorder videos with over 13.2 billion views.
“The new report from the Center for Countering Digital Hate underscores why it’s high time for TikTok to take action against the platform’s dangerous algorithmic reinforcement,” said James P. Steyer, founder and CEO of Common Sense Media, which was not involved in the study. “TikTok’s algorithm bombards teens with harmful content that promotes suicide, eating disorders and body image issues, fueling the teen mental health crisis.”
Launched globally in 2017 by Chinese company ByteDance, TikTok, which surfaces content through algorithms built on personal data such as likes, follows, watch time and a user’s interests, has become the world’s fastest-growing social media app, reaching one billion monthly active users by 2021.
The CCDH report describes how TikTok’s algorithms refine the videos shown to users as the app gathers more information about their likes and interests. The algorithmic suggestions in the “For You” feed are, as the app puts it, “central to the TikTok experience.” However, the new research shows that the video platform can also push harmful content to vulnerable users as it tries to keep them engaged.
To test the algorithm, CCDH researchers registered as users in the United States, United Kingdom, Canada and Australia and created “standard” and “vulnerable” accounts on TikTok. A total of eight accounts were created, and data was collected from each account for its first 30 minutes of use. According to the CCDH, the short window was designed to show how quickly the video platform can learn about each user and serve potentially harmful content.
For the report, each researcher, posing as a 13-year-old (the minimum age TikTok allows to sign up for its service), created two accounts in their assigned country. One account was given a generic female username. The other was given a username hinting at body image concerns: the name included the phrase “loseweight.” Across all accounts, researchers paused briefly on videos about body image and mental health and “liked” them, as a teenager interested in this content might.
When comparing the “loseweight” accounts to the standard accounts, the researchers found that the “loseweight” accounts were served three times more harmful content overall and 12 times more self-harm and suicide-related videos than the standard accounts.
“TikTok is able to detect user vulnerabilities and tries to exploit them,” said Imran Ahmed, CEO of the CCDH, which is advocating in Washington, D.C. for the Kids Online Safety Act (KOSA), legislation that would introduce guardrails to protect minors online. “That’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing our kids’ psychology and adapting to keep them online.”
Among the content served to the vulnerable accounts was a video with the caption: “Making every think your [sic] good so you can try it in private.”
The video, which alludes to a suicide attempt, garnered 386,900 likes. The report also cited a video of a teenage girl crying, with text on the screen reading: “You’re not thinking about taking your own life, are you?” followed by a reference to Sarah Lynn, a character in the Netflix animated series Bojack Horseman who dies of a drug overdose. That video received 327,900 likes. Another video linked to PrettyScale.com, a site where users upload pictures of their body and face to have their attractiveness rated by a “mathematical formula”; it had 17,300 likes.
When asked for comment, a TikTok spokesperson questioned the study’s methodology.
“We regularly consult with health professionals, eliminate violations of our policies, and provide access to supportive resources for all those in need,” the representative said.
The TikTok spokesperson went on to say the video platform is “mindful that triggering content is unique to each individual” and that the social platform “remain[s] focused on creating a safe and comfortable space for everyone.”
As 60 Minutes reported on Sunday, the study comes as more than 1,200 families are pursuing lawsuits against social media companies, including TikTok. The lawsuits allege that content on social media platforms profoundly affected their children’s mental health and, in some cases, contributed to their children’s deaths. More than 150 of these lawsuits are expected to move forward next year.
If you or someone you know is experiencing emotional distress or a suicidal crisis, call the National Suicide Prevention Hotline at 1-800-273-TALK (8255).
For more information on mental health care resources and support, call the National Alliance on Mental Illness (NAMI) HelpLine Monday through Friday, 10:00 a.m. – 6:00 p.m. ET at 1-800-950-NAMI (6264) or by email to [email protected]