The report finds that the algorithm is unusually powerful, and that its endless-feed design — in which users never reach the end of the content — makes the cycle very hard to break. In the investigation, researchers spoke with two young Australian women who had pro-eating disorder content repeatedly appear in their feeds. As one 19-year-old told them, “Before TikTok, calorie counting had never crossed my path.”
Another 22-year-old who had been in and out of hospital due to an eating disorder over the past five years explained, “As I got sicker and I got more obsessive, all I could do was just flick through my phone and look at this footage.” She added, “I spent hours on it and just fixated on it.”
Although TikTok says it has mechanisms in place to stop the spread of this kind of content, it appears there are ways around them. Currently, searching for terms related to eating disorders doesn’t return any videos, but instead links to the Butterfly Foundation’s helpline. “Our teams consult with NGOs and other partners to continuously update the list of keywords on which we intervene,” a TikTok spokesperson told the ABC. The app also bans “content depicting, promoting, normalising, or glorifying activities that could lead to suicide, self-harm, or eating disorders.”
Still, simply by using deliberate misspellings and coded language, users are able to navigate around these safety measures, allowing the harmful content to keep appearing in people’s For You feeds. It’s estimated that 4 per cent of Australians — or roughly one million people — are affected by eating disorders. Of these, almost two thirds (63 per cent) are thought to be female. Teenagers with eating disorders are more likely to experience poor mental health and impaired social functioning, and research has repeatedly shown that social media can exacerbate these issues, flooding teens with filtered images of “ideal” body types that are often unattainable.
Interestingly, the flip side of the algorithm is that while the app fails to moderate this harmful content, it can easily suppress content that constructively discusses issues like racism and disability. Perth TikToker Unice Wani told the publication that despite having over 595,000 followers, her videos did not perform well when she spoke about race and racism as a Black woman. “You tend to get a lot of shadow bans for speaking up about stuff such as racism,” she told Four Corners. “I guess they focus more on the white girls dancing and stuff like that.” It comes after a number of Black creators in the US went on strike earlier this year, protesting that white TikTokers were becoming hugely successful off their dance moves, while the same success rarely extended to the choreographers themselves.
To read the full investigation, visit the story over at the ABC’s official website here.
If you need support, give Butterfly Foundation a call on 1800 33 4673 or chat online.
If you are in distress, please call Lifeline on 13 11 14 or chat online.
Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.