TikTok removed 81.5m videos in Q2 for violating guidelines
18 October 2021
More than 81.5 million videos were removed from TikTok between April and June for violating its community guidelines or terms of service.
This is according to TikTok’s Q2 community guidelines enforcement report, which said the removed videos accounted for less than 1% of all videos uploaded on the platform.
Of those videos, TikTok said it identified and removed 93.0% within 24 hours of being posted, and 94.1% before a user reported them. It also said 87.5% of removed content had zero views, which is an improvement since TikTok’s last report (81.8%).
The report also highlighted improvements TikTok has made in its detection of hateful behaviour, bullying, and harassment. It said 73.3% of harassment and bullying videos were removed before any reports were made, compared to 66.2% in the first quarter of this year, while 72.9% of hateful behaviour videos were removed before any reports, compared to 67.3% from January to March. It attributed this progress to improvements to its systems that proactively flag hate symbols, words, and other abuse signals for further review by its safety teams.
Harassment and hate speech can be challenging to detect and moderate, the company said. For instance, reappropriation of a term is not a violation of TikTok’s policies, but using that reappropriated term to attack or abuse another person would violate its hateful behaviour policy. Bullying can also be highly personal and require offline context that isn’t always available.
To better enforce its policies, TikTok said it regularly trains and guides its team on how to differentiate between reappropriation and slurs, or satire and bullying. It has also hired policy experts in civil rights, equity, and inclusion, and encourages people to report accounts or content that may violate its Community Guidelines.
TikTok also added prompts to encourage people to consider the impact of their words before posting a potentially unkind or violative comment. It said this mechanism prompted nearly four in ten people to withdraw and edit their comments.
The platform is also set to improve its mute settings for comments and questions during livestreams, allowing a host to mute a viewer for a few seconds, a few minutes, or for the duration of the livestream.