TikTok Allegedly Leads Child Accounts to Pornographic Content Within a Few Clicks
According to a new study, the widely used social media app steers minors' profiles toward adult videos within a small number of clicks.
Research Methodology
A campaign organization set up test accounts using a minor's date of birth and activated the platform's content restriction feature, which is meant to limit exposure to adult-oriented material.
Researchers found that TikTok proposed sexually charged search terms to multiple test profiles that were established on unused smartphones with no previous activity.
Troubling Search Prompts
Keywords recommended under the "suggested searches" feature included "extremely revealing clothing" and "explicit content featuring women", which then escalated to terms such as "hardcore pawn [sic] clips".
For three of the accounts, the sexualized suggestions appeared right away.
Quick Path to Pornography
With minimal interaction, the investigators found adult videos ranging from revealing content to graphic sexual acts.
The organization reported that the content appeared designed to evade moderation, typically by embedding the clip within a harmless-looking image or video.
In one instance, reaching such material took two clicks after signing in: one tap on the search feature and a second on the proposed query.
Compliance Requirements
The research entity, whose scope includes investigating digital platforms' effect on human rights, stated it carried out two batches of tests.
One set was conducted before safeguarding regulations under the British online safety legislation came into force on 25 July, and the other after the regulations took effect.
Concerning Discoveries
The organization stated that several pieces of content featured someone who appeared to be a minor, and that this material had been reported to the child protection organization that oversees exploitative content.
The research organization asserted that TikTok was in violation of the Online Safety Act, which requires digital platforms to prevent children from viewing harmful material such as pornography.
Regulatory Response
A spokesperson for the UK communications regulator, which is responsible for enforcing the legislation, said: "We value the work behind this research and will analyze its results."
Official guidance on complying with the act specifies that online services posing a significant risk of showing harmful content must "modify their programming" to exclude such material from minors' feeds.
The platform's rules prohibit adult videos.
TikTok's Statement
The video platform said that after being alerted by the research group, it had removed the problematic material and made changes to its recommendation system.
"Immediately after notification of these assertions, we responded quickly to investigate them, delete material that contravened our rules, and launch improvements to our recommendation system," stated a company representative.