Google announced it has removed 8.3 million videos from YouTube in its Community Guidelines enforcement report. If that report sounds unfamiliar, that's natural: this is the first time the search giant has published one.
As the biggest video hosting site in the world, keeping tabs on every video posted to YouTube is no mean feat. Unsurprisingly, there is never a shortage of content that someone takes issue with. That makes policing the site a huge task, though it is good to see Google rising to the challenge.
Google also said the operation of watching over all those videos isn't entirely handled by humans. Instead, it uses automated systems to ensure videos comply with the YouTube community guidelines. These help ensure that videos depicting sexually explicit content, graphic violence, spam, misleading content and the like don't find a home on the video-sharing site.
To show that the bots are doing a fine job, Google said 6.7 million of the 8.3 million videos it removed were flagged by the automated systems themselves. Better still, most of those were taken down before they received a single view. The figures above cover the period from October to December 2017.
Apart from the bots, Google also relies on a dedicated human workforce tasked with filtering out videos that violate its content publishing guidelines. That is a sensible move, as humans can perhaps never be entirely replaced by bots at spotting inappropriate videos, but it also makes for an immensely stressful job given the disturbing content reviewers must watch.
Meanwhile, not all of Google's efforts at establishing a clean environment on YouTube have gone over well. The company has often been criticized for wrongfully removing a video, or for taking too long to remove a heavily flagged one. There have also been instances of its search suggestions surfacing videos they ought not to.