YouTube expands human moderation teams and machine learning to curb inappropriate content


YouTube CEO Susan Wojcicki announced in a blog post today that the company is expanding the teams that moderate content on the platform, as it believes human judgment is critical to making contextualized decisions about content.

In the blog post, Wojcicki stresses that since June the trust and safety teams have manually reviewed nearly 2 million videos for violent extremist content and have also helped train machine-learning technology to identify similar videos in the future. The company is now adding moderators for comments as well and, in some cases, shutting down comments altogether.

The company says the teams work closely with NCMEC, the IWF, and other child safety organizations around the world to report predatory behavior and accounts to the appropriate law enforcement agencies. Over the past few weeks, YouTube says it has terminated hundreds of accounts and shut down hundreds of thousands of comments it deems inappropriate for the community.

YouTube will use its cutting-edge machine learning technology to quickly identify and remove content that violates its guidelines. Since June, the company has removed over 150,000 videos for violent extremism; machine learning is helping human reviewers remove nearly five times as many videos as before, and 98% of the videos removed are flagged by the machine learning technology.

The company says that starting in 2018 it will publish a regular report providing more aggregate data about the flags it receives and the actions taken to remove videos and comments that violate YouTube's content policies.

On the subject of advertising, Wojcicki said in the blog post:

We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising. We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will also help vetted creators see more stability around their revenue. It’s important we get this right for both advertisers and creators, and over the next few weeks, we’ll be speaking with both to hone this approach.