The Chinese social media giant TikTok should be banned, after an investigation found that paedophiles caught sexually grooming children on the site were ‘let off’ with one-week suspensions, and details of their illegal activities were rarely passed on to police.
TikTok has become a ‘hunting ground’ for paedophiles as it emerged as a viral hit with children, who use the app to post short videos set to music, dance challenges and prank clips.
TikTok – one of the world’s most downloaded apps – is exposing children as young as five to leering sex pests, cruel taunts and the glorifying of anorexia and self-harm.
The problem is quite simple. TikTok’s minimum age is 13, but it does nothing during the sign-up process to check – meaning huge numbers of children join below that age.
It also has the ability to live-stream, making it even more of a threat.
If a child sets their account to public, they are inviting strangers to follow and message them.
TikTok’s settings even allow your videos to be promoted “to users interested in accounts like yours”. That’s where the danger begins.
Paedophiles look for videos of children, likely while pretending to be children themselves.
They will target someone, follow them and set about trying to lure and trap the innocent. The results can be devastating.
That exposes youngsters to real-time suggestions from strangers, leaving them just seconds to decide on a response and risking a spiral of abuse and exploitation.
Leaked documents show the tech company’s policy for users caught messaging children in a sexualised way was to lock down their account for seven days for a first offence, a month for the second, and to impose an automatic permanent ban only when they had been reported a third time.
Former moderators at TikTok, which has exploded in popularity and, according to Ofcom, is now used by over half of eight to 15-year-olds, said this meant they saw paedophiles allowed back onto the site with the same accounts, free to continue their grooming attempts.
TikTok said the week-long suspension policy was a ‘holding pattern’ to buy time while its dedicated child protection teams in China, and later Dublin, investigated suspected grooming, with any confirmed cases resulting in an immediate permanent ban.
Former moderators, who joined the company’s London-based offices as it rapidly grew in popularity in the UK in 2018 and 2019, described how they were told to prioritise monitoring up to 1,000 videos a day each, over regularly checking private messages that had been reported.
One ex-moderator said that while they had to view any flagged videos within 15 minutes, reported private messages would sometimes not be seen by a human for days, with the backlog of unread messages sometimes reaching over 1,000. One ex-employee described the private message queue as being seen as the ‘least important’ of the more than 15 video queues moderators had to work through every day.
TikTok says it now has the ability to ban users at device level, but even when paedophiles were banned permanently, former moderators said they saw them return to the site by setting up new accounts.
Meanwhile, moderators said they were told to prioritise monitoring videos, including for minor infractions of the company’s guidelines, such as whether users were showing excessive cleavage or swinging their hips too much while dancing.
After working for TikTok, one former moderator said they had warned relatives not to let their children use the app.
They said: “I wouldn’t feel safe letting someone go on there and trusting the report functions if they felt threatened, because the punishments don’t work.”
Another former moderator added: “Parents should forbid their children from using it. If they do use it, they should be really careful and check everything, every day.”