The Dark Side of TikTok – Here’s Why Eating Disorder Content Can Often Go Unchecked: Research

Warning: This article deals with potentially triggering content around eating disorders. If you or someone you love is struggling with an eating disorder, call the National Eating Disorder Information Centre (NEDIC) at the toll-free helpline at 1-866-633-4220.

While TikTok can bring you plenty of useful tips and tricks – from skincare routines to sleep hacks – there’s also a darker side to the app that can often go unchecked, especially when it comes to eating disorders. A new study from Within Health explores how harmful eating disorder content finds its way into users’ feeds.

Last year, the Wall Street Journal (WSJ) published a series of articles examining the TikTok algorithm in an attempt to understand how and why it serves harmful content to minors. After reading the articles, a team of researchers at Within Health, made up of healthcare providers who work with patients struggling with disordered eating, decided to look at the TikTok algorithm themselves “to understand how harmful content is still finding its way into our feeds while at the same time TikTok is telling the world that they are solving these issues.”

In response to the WSJ articles, TikTok implemented a series of measures to make harmful eating disorder-related content less prevalent, but new research shows the ways that content creators have found loopholes in the algorithm to still promote such content.

Related: Charli D’Amelio discusses how TikTok contributed to her eating disorder.

Three things are at the centre of TikTok’s algorithm issues

According to the research, there are three things at the heart of TikTok’s algorithm issues, which allow harmful content to make its way onto users’ “For You” pages. These three things are: “the unhealthy content itself,” “limitations with TikTok’s current content search and suggestion algorithm, which allows the company to serve up content to its users” and “problems with the filters TikTok employs to curb harmful content from getting to its users.”

In particular, the study focused on the latter two issues, as it is difficult to pinpoint exactly why some content creators choose to post potentially harmful or triggering videos.


What is filter evasion?

One way content creators and consumers get around TikTok’s current filters is by using “misspellings and misused keywords.”

TikTok currently has many (but definitely not all) terms around disordered eating on a “block” list, but content creators are able to bypass these filters by intentionally misspelling words. Users also work around the block list by using homoglyphs: characters that look similar to normal letters and fool the filters.

According to researchers, something that’s particularly harmful is how “TikTok’s search engine will often correctly match incorrect words with their correctly spelled counterparts.” So, if a user searches for the keyword “anorexic,” which is not a blocked term, the results will still feature videos that include blocked tags, such as “anorexia.”
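The researchers don’t publish TikTok’s internals, but the basic weakness they describe, an exact-match blocklist defeated by lookalike characters, can be sketched in a few lines. Everything here is invented for illustration: the toy blocklist, the small lookalike table, and the `normalize` approach are assumptions, not TikTok’s actual implementation.

```python
import unicodedata

# Hypothetical blocklist (illustrative only, not TikTok's actual list)
BLOCKLIST = {"anorexia", "thinspo"}

# A handful of lookalike substitutions used in homoglyph obfuscation;
# a production system would draw on a full Unicode "confusables" table.
LOOKALIKES = str.maketrans({"0": "o", "1": "l", "3": "e", "@": "a", "$": "s"})

def normalize(term: str) -> str:
    """Fold stylized Unicode variants and digit/symbol lookalikes to plain ASCII."""
    folded = unicodedata.normalize("NFKD", term)  # e.g. fullwidth letters -> ASCII
    ascii_only = folded.encode("ascii", "ignore").decode("ascii")
    return ascii_only.lower().translate(LOOKALIKES)

def is_blocked(term: str) -> bool:
    """Check a search term against the blocklist after normalization."""
    return normalize(term) in BLOCKLIST

print(is_blocked("an0rexia"))  # True: the digit zero folds back to "o"
print(is_blocked("diet"))      # False: not on this toy blocklist
```

A filter that compares raw strings instead of normalized ones would miss every one of these variants, which is exactly the loophole the study describes.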

“When TikTok’s algorithm is so advanced in serving content ‘inspired by’ its users’ interests, it seems reasonable to expect more can be done to protect users against harmful content,” the study details. “Whether those users are stumbling on this content accidentally or intentionally using misspelled search terms to get around the filters, there is plenty of disordered eating content still being served up.”

Users also exploit hashtags by searching outside the TikTok app to get around the blocks. For example, blocked hashtags that can’t be found in-app still lead users to videos through Google and other web searches.

“The keyword ‘anorexic’ is a prime example of this,” the study reveals. “The videos using this hashtag have more than 20 million views collectively, which is both staggering and frightening.”

Additionally, many obvious eating disorder-related terms actually haven’t made their way to TikTok’s block list, such as “orthorexia” (an eating disorder related to an obsession with healthy eating).


Related: What is an ‘Almond Mom’? TikTokers oppose generational diet culture after old ‘RHOBH’ clip goes viral.

How can TikTok help stop filter evasion and the exploitation of hashtags?

The researchers suggest a simple, practical solution in response to filter evasion: “updating block list filtering with a complete list of keywords directly related to disordered eating that are still live on-site.”

In fact, the researchers gathered a wide selection of non-blocked hashtags related to eating disorders on the app. From misspelled to unblocked keywords, these hashtags have accumulated more than 1.3 billion views to date.

To avoid the exploitation of hashtags, TikTok could also “apply blocklist filtering evenly across all search options, including video, tag, and user searches, as well as between web and app searches.”

In response to misspellings, researchers suggest creating “a crowdsourced list of additional keywords to add to the TikTok blocklist based on iterations of misspelling techniques and homoglyph obfuscation.”
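What generating those keyword iterations might look like can be sketched as follows. The lookalike table, the seed keyword, and the brute-force enumeration are all assumptions for illustration, not the researchers’ actual method.

```python
from itertools import product

# Illustrative lookalike table; real obfuscation draws on many more characters.
SUBS = {"a": ["a", "@", "4"], "o": ["o", "0"], "e": ["e", "3"]}

def variants(word: str) -> set[str]:
    """Enumerate every spelling reachable by swapping in lookalike characters."""
    choices = [SUBS.get(ch, [ch]) for ch in word.lower()]
    return {"".join(combo) for combo in product(*choices)}

# Expanding one seed keyword into candidate blocklist entries
candidates = variants("anorexia")
print(len(candidates))           # 36 combinations from this small table
print("an0rexia" in candidates)  # True
```

Even this tiny three-character table turns one keyword into dozens of candidates, which is why the researchers suggest crowdsourcing the list rather than enumerating it by hand.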

What are some of the other flaws in the TikTok algorithm when it comes to disordered eating?

One other blatant issue researchers discovered within the TikTok algorithm is the autocomplete feature. While autocomplete can be helpful, allowing users to find what they’re looking for without having to type out a full key phrase, it can be very harmful when it comes to eating disorder content.

The autocomplete feature can actually help users uncover keywords that evade filtering and the block list.

“Popular hashtags like ‘diet’ can also introduce users to related keywords that could be considered harmful to ED sufferers,” the researchers add. “For example, within our test account, the keyword ‘diet’ surfaced a suggestion for the key phrase ‘diet hacks to lose a lot of weight.’”

This ultimately makes discovering the most dangerous hashtags far easier and more accessible. Autocomplete can also serve users triggering content through association.
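The gap the researchers describe, where harmful phrases still appear as suggestions, amounts to filtering search results but not the suggestion pipeline itself. A minimal sketch of that difference, with an invented query log and blocked phrase:

```python
# Invented query log and blocklist, for illustration only
QUERY_LOG = ["diet recipes", "diet hacks to lose a lot of weight", "dinner ideas"]
BLOCKED_PHRASES = {"diet hacks to lose a lot of weight"}

def suggest(prefix: str, filter_suggestions: bool = True) -> list[str]:
    """Prefix-match autocomplete over past queries, optionally filtered."""
    hits = [q for q in QUERY_LOG if q.startswith(prefix)]
    if filter_suggestions:
        hits = [q for q in hits if q not in BLOCKED_PHRASES]
    return hits

# An unfiltered suggester leaks the harmful phrase from a single keyword:
print(suggest("diet", filter_suggestions=False))
# Applying the blocklist to suggestions, not just results, closes the gap:
print(suggest("diet"))
```

The fix is structural: the same blocklist check has to run wherever text is surfaced to the user, including the suggestion path.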


See also: Are you addicted to TikTok? A new study identifies the signs.

“On a test account that had previously searched for filtered eating disorder keywords, we found that autocomplete sometimes provided suggestions for problematic keywords with as little as a single letter. This is likely a result of having performed similar searches in the past. Repeated suggestions by TikTok’s autocomplete of troublesome past searches can reinforce old habits or expose users to content they are trying to avoid,” they said.

Shockingly, TikTok’s algorithm does not yet discern eating disorder recovery content from content that reinforces disordered eating. Thus, users trying to change or work through old, dangerous habits can often be served up potentially triggering content.

How can you help?

Currently, Within Health is working on a crowd-sourced block list that anyone can contribute to. If you have a keyword that you believe should be blocked, add it to the list here “in an effort to provide social media with guidance on trending, obfuscated and dangerous keywords.”

You may also like: Don’t cook your chicken in NyQuil, even if you see it on TikTok: FDA warns.
