YouTube is continuing its crackdown on videos it deems harmful to children by expanding its content-moderation workforce to more than 10,000 people.
In recent weeks, YouTube has been taking down videos and channels that critics say place children in inappropriate situations and environments.
On Monday, Google confirmed to The Guardian that it plans to grow the number of people reviewing content that could violate YouTube’s policies and guidelines to more than 10,000 in 2018.
In a blog post announcing the plan, YouTube CEO Susan Wojcicki said: “Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content.”
In the same post, Wojcicki reflected on YouTube’s evolution into an open platform for activism, organizing, and exposing war crimes. However, she admitted she has also “seen up-close” the “more troubling side of YouTube’s openness.”
Beyond filtering content that endangers children, Wojcicki reiterated the company’s goal of combating the spread of extremist and violent content on YouTube. “Our goal is to stay one step ahead of bad actors, making it harder for policy-violating content to surface or remain on YouTube,” she said.
Meanwhile, although Google is hiring thousands of human moderators, Wojcicki’s post suggested the company ultimately plans to lean more heavily on automated systems to screen YouTube content.
According to the YouTube chief, since the company added more staff in June, reviewers have screened as many as two million videos for “violent extremist content.”
While human moderators were credited with that work, Wojcicki also noted that the decisions those reviewers make are being used to “train our machine-learning technology to identify similar videos in the future.”
“98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms,” Wojcicki added.