A new report highlights how TikTok's algorithm serves up explicit content to minors.
The Wall Street Journal created several fresh TikTok accounts with fake personal information to examine whether younger users were being protected. These automated accounts were registered as users between the ages of 13 and 15, then turned loose to browse TikTok's highly personalized 'For You' feed. What the Journal discovered was that TikTok's algorithm can become an endless funnel for content featuring sex and drugs.
One account in the Journal's experiment was served 569 videos about drug use. Some contained references to cocaine and meth addiction, while others hawked drug paraphernalia for sale. TikTok also showed these teen accounts ads for paid pornography and sex shops.
After the investigation, the WSJ showed TikTok a total of 974 videos featuring drugs, pornography, or other adult content. Little action was taken to scrub it.
Only 169 of those videos were removed, and it couldn't be verified whether TikTok or the uploaders had removed them. In total, only 255 were removed after being brought to TikTok's attention, mostly for portraying 'caregiver relationships' — a term for adults entering relationships with people who pretend to be children.
Then, the shocker. A TikTok spokesperson flatly admitted to the Journal that the app doesn't differentiate between videos served to adults and those served to minors. However, the spokesperson said TikTok is looking to create a tool to filter content for young users. TikTok tracks engagement by how long a user watches a video, hesitates near it, or re-watches it.
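Those three signals — watch time, hesitation, and re-watching — can be sketched as a toy scoring function. This is an invented illustration of how engagement-based ranking works in general, not TikTok's actual code; the weights and names are made up:

```python
# Toy model of engagement-based ranking, using the three signals the
# report describes. All weights are invented for illustration.
def engagement_score(watch_seconds, video_seconds, hesitations, rewatches):
    """Higher score = stronger inferred interest in this video's topic."""
    watch_fraction = min(watch_seconds / max(video_seconds, 1), 1.0)
    return watch_fraction + 0.3 * hesitations + 0.7 * rewatches

class ToyFeed:
    """Keeps a per-topic interest weight; always serves the top topic next."""
    def __init__(self, topics):
        self.weights = {t: 1.0 for t in topics}

    def record(self, topic, score):
        # Engagement feeds directly back into what gets served — the loop
        # that makes these systems self-reinforcing.
        self.weights[topic] += score

    def next_topic(self):
        return max(self.weights, key=self.weights.get)
```

In a model like this, a single fully-watched, re-watched video is enough to push its topic to the top of the feed — which is the dynamic the Journal's bots exploited.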
“All the problems we have seen on YouTube are due to engagement-based algorithms, and on TikTok it’s exactly the same – but it’s worse,” says Guillaume Chaslot, a former YouTube engineer. Chaslot says TikTok’s algorithm learns what content a user finds eye-catching much faster.
The Wall Street Journal also created fake teen profiles whose interests were built up purely by watching videos. The bots were programmed to linger on videos that mentioned drugs, sex, or other adult themes. The report states that around a dozen of the 31 accounts set up ended up being dominated by the theme they were programmed to linger over.
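The feedback loop behind that result — linger on a theme, get served more of it, linger more — can be shown with a small simulation. This is a made-up model, not the Journal's bot code or TikTok's recommender; the watch-time values and update rule are assumptions:

```python
# Toy simulation of a bot that lingers on one programmed theme.
# The feed serves each theme in proportion to its weight; watching a
# theme's videos increases that weight, closing the loop.
def simulate_bot(rounds, themes, programmed_theme,
                 linger_watch=1.0, skip_watch=0.1):
    weights = {t: 1.0 for t in themes}  # start with no preference
    for _ in range(rounds):
        total = sum(weights.values())
        for t in themes:
            served = weights[t] / total  # share of feed showing theme t
            watch = linger_watch if t == programmed_theme else skip_watch
            weights[t] += served * watch  # engagement feeds back in
    total = sum(weights.values())
    return {t: w / total for t, w in weights.items()}  # final feed shares
```

Even starting from a neutral feed, the lingered-on theme's share compounds round over round and quickly dominates — consistent with the Journal's finding that a number of accounts ended up dominated by their programmed theme.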
As if this scene couldn’t get worse, the Journal also found that many of the videos viewed by minor accounts were directing people to OnlyFans. TikTok relies on algorithms and a force of 10,000 human moderators to police its growing volume of content, and says it removed 89 million videos in the second half of 2020. TikTok has around 100 million users in the United States, up from 25 million in 2019.
Don’t worry guys, Biden’s on it.
Any parent who lets their kid use TikTok basically doesn’t care about their kid.