How TikTok’s algorithm works is probably the company’s most closely guarded secret. But the WSJ thinks it was able to work it out using a bunch of bot accounts – and the results were both fascinating and disturbing.
A former Google engineer who worked on YouTube’s algorithm says that YouTube takes essentially the same approach, but in a less extreme form …
Officially, TikTok says that it uses four signals: what you watch, share, like, and follow. But the analysis suggested that one of the four is by far the most important …
The WSJ created a 13-minute video to share its findings.
We found out that TikTok only needs one of these to figure you out: How long you linger over a piece of content.
Every second you hesitate or rewatch, the app is tracking you. Through this one powerful signal, TikTok learns your most hidden interests and emotions, and drives you deep into rabbit holes of content that are hard to escape.
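TikTok has not published its ranking code, but to make the idea concrete, here is a minimal hypothetical sketch of how a dwell-time signal alone could build up a per-topic interest profile. Every name, weight, and formula below is an assumption for illustration, not TikTok’s actual system.

```python
# Illustrative only: TikTok's real ranking system is not public.
# This sketch shows how watch time ("linger") and rewatches alone could
# build a per-topic interest profile; all names and weights are assumptions.

from collections import defaultdict

def dwell_score(watch_seconds: float, video_length: float, rewatches: int) -> float:
    """Score a single viewing: fraction of the video watched, boosted by rewatches."""
    completion = min(watch_seconds / max(video_length, 1.0), 1.0)
    return completion * (1 + rewatches)

def update_profile(profile: dict, topics: list, watch_seconds: float,
                   video_length: float, rewatches: int, decay: float = 0.95) -> dict:
    """Slightly decay existing topic weights, then credit the topics of the video just watched."""
    score = dwell_score(watch_seconds, video_length, rewatches)
    updated = defaultdict(float, {t: w * decay for t, w in profile.items()})
    for topic in topics:
        updated[topic] += score
    return dict(updated)

# A user pauses on a 35-second #sad video and watches it twice:
profile = update_profile({}, ["sad", "breakup"], watch_seconds=35,
                         video_length=35, rewatches=1)
print(profile)  # {'sad': 2.0, 'breakup': 2.0} -- no like, share, or follow required
```

In a feedback loop like this, a topic the user merely hesitates over can quickly come to dominate the profile, which is the dynamic the WSJ set out to test.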
The TikTok experience starts the same way for everyone. Open the app and you’ll immediately see an endless string of videos in your For You feed. TikTok starts by serving a new account a selection of very popular videos vetted by app moderators.
From these, it works to identify your interests.
The WSJ programmed each bot with an age, a location, and a set of particular interests. Those interests were never entered into the app; they were used only as the basis for choosing which videos the bot watched. Each bot checked every video in its feed for hashtags or AI-identified images relating to its interests, stopped scrolling to watch those videos, and rewatched some of them.
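The WSJ has not released its bot code, but the behaviour it describes is simple enough to sketch. Everything below (the class name, thresholds, and rewatch probability) is a hypothetical illustration of that setup, not the paper’s actual tooling.

```python
# Hypothetical sketch of the bot behaviour the WSJ describes: the programmed
# interests are never entered into the app; they only determine which videos
# the bot lingers on. Names and thresholds here are assumptions.

import random
from dataclasses import dataclass, field

@dataclass
class Bot:
    age: int
    location: str
    interests: set = field(default_factory=set)  # never typed into the app

    def is_relevant(self, hashtags: set, image_labels: set) -> bool:
        """Compare the video's hashtags and AI-identified image labels to the bot's interests."""
        return bool(self.interests & (hashtags | image_labels))

    def decide(self, hashtags: set, image_labels: set) -> dict:
        """Linger on (and sometimes rewatch) matching videos; scroll quickly past everything else."""
        if self.is_relevant(hashtags, image_labels):
            return {"watch_fraction": 1.0, "rewatch": random.random() < 0.5}
        return {"watch_fraction": 0.1, "rewatch": False}

bot = Bot(age=24, location="Kentucky", interests={"sad", "depression"})
print(bot.decide(hashtags={"sad", "fyp"}, image_labels=set()))
```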
The paper found that both the video selection and the view counts contracted: from popular videos on any topic to ones tightly focused on the interests TikTok had inferred.
The results were analysed by data scientist and algorithm expert Guillaume Chaslot, a former Google engineer who worked on YouTube’s algorithm.
He’s now an advocate for algorithm transparency. He says TikTok is different from other social media platforms.
“The algorithm on TikTok can get much more powerful and it can be able to learn your vulnerabilities much faster.”
In fact, TikTok fully learned many of our accounts’ interests in less than two hours. Some it figured out in less than 40 minutes.
One bot was programmed with sadness and depression as “interests.”
Less than three minutes into using TikTok, at its 15th video, [bot] kentucky_96 pauses on this [sad video about losing people from your life]. Kentucky_96 watches the 35-second video twice. Here TikTok gets its first inkling that perhaps the new user is feeling down lately.
The information contained in this single video provided the app with important clues. The author of the video, the audio track, the video description, the hashtags. After kentucky_96’s first sad video, TikTok serves another one 23 videos later – or after about four more minutes of watching.
This one is a breakup video with the hashtag #sad. TikTok’s still trying to suss out this new user, with more high view count videos [but] at video 57, kentucky_96 keeps watching a video about heartbreak and hurt feelings. And then at video 60, watches one about emotional pain.
Based on the videos we watched so far, TikTok thinks that maybe this user wants to see more about love, breakups and dating. So at about 80 videos and 15 minutes in, the app starts serving more about relationships. But kentucky_96 isn’t interested.
The user instead pauses on one about mental health, then quickly swipes past videos about missing an ex, advice about moving on, and how to hold a lover’s interest. But kentucky_96 lingers over this video containing the hashtag #depression, and these videos about suffering from anxiety.
Some 224 videos into the bot’s overall journey, or about 36 minutes of total watch time, TikTok’s understanding of kentucky_96 takes shape. Videos about depression and mental health struggles outnumber those about relationships and breakups. From here on, kentucky_96’s feed is a deluge of depressive content: 93% of videos shown to the account are about sadness or depression.
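A figure like that 93% can be reproduced with a very simple measure of feed concentration. The sketch below assumes the videos have already been labelled by topic (for example from their hashtags) and is not the WSJ’s own analysis code.

```python
# Hypothetical feed-concentration measure: the share of recent videos whose
# labels overlap a chosen set of topics. Topic labelling is assumed to have
# been done elsewhere (e.g. from hashtags); this is not the WSJ's code.

def topic_share(feed_labels: list, topics: set, window: int = 100) -> float:
    """Fraction of the last `window` videos whose labels intersect `topics`."""
    recent = feed_labels[-window:]
    if not recent:
        return 0.0
    return sum(1 for labels in recent if labels & topics) / len(recent)

# topic_share(history, {"sad", "depression", "mentalhealth"}) returning 0.93
# would correspond to the 93% reported for kentucky_96's feed.
```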
TikTok says the bot behavior isn’t representative of real users, but even bots programmed with a diverse set of interests very quickly saw their feeds narrow dramatically.
Given the massive popularity of TikTok among teenagers in particular, this raises obvious concerns. Someone in a depressive state could easily be made even more depressed by watching a stream of such content. A conspiracy theorist could end up with the impression that such views are mainstream. Such an algorithm is also likely to push those with extremist views to increasingly extreme content.
TikTok appears to be prioritising engagement over mental health, and Chaslot says that YouTube does something similar, but in a less extreme way. That certainly matches our experience, where watching a few videos in a row on the same topic often seems to convince YouTube that this is now our sole interest in life – though it does at least reset.