TikTok will begin automating video removals for nudity and more in the US

TikTok is attempting to speed up its moderation process by removing human reviewers from areas where its automated systems can do the job on their own.

In the United States and Canada, TikTok will soon begin using automated screening systems to filter out videos containing nudity, sex, violence, graphic content, illegal activity, and violations of the company's child safety policy, among other things. When the system detects a video that falls into one of those categories, it will be removed automatically, and the creator will be given the option of appealing the decision to a human moderator.

According to a TikTok spokesperson, until now all videos removed in the United States were first reviewed by human moderators. A major goal of the change is to reduce the number of distressing videos moderators are forced to watch, freeing them to spend more time on harder material, such as misinformation, that requires context to review properly. Moderators at other companies, including Facebook, have reported developing symptoms of post-traumatic stress disorder (PTSD) as a result of the videos they were required to view. The announcement is also reportedly part of an effort to increase transparency around moderation, according to Axios, which was the first to break the story.

The issue TikTok will have to contend with is that automated systems are never completely reliable, and certain communities may be hit especially hard by mistaken automatic takedowns. Discriminatory moderation has been a problem for the service before: it was recently criticized for twice removing the intersex hashtag from the app. TikTok says it is only deploying automation in areas where it is most reliable; the company has been testing the technology in other countries, including Brazil and Pakistan, and says that only 5 percent of the videos removed by its automated systems should have been allowed to remain online.

However, given the sheer volume of videos TikTok removes — 8,540,088 in the United States during the first three months of 2021 alone — even that error rate means tens of thousands or even hundreds of thousands of videos could be removed in error. Most removals fall into the categories where TikTok is adopting automated moderation, as seen in the chart below. That said, human moderators are not going away entirely: according to a spokesperson, they will continue to review community reports, appeals, and other videos flagged by the company's automated systems.

TikTok says the automated review system will roll out over the next several weeks.
