YouTube is announcing a new step in its campaign to rid the site of offensive material. In addition to banning videos whose content violates YouTube’s user agreement, the site will also place some videos in a “limited state” designed to prevent them from receiving attention. YouTube announced the change on its official blog today:
We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter.
It seems the goal is to have most of this controversial content identified by “machine learning,” guided by YouTube’s expert partners:
With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we’ve removed for violent extremism, as well as the rate at which we’ve taken this kind of content down…
Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists.
Which videos will be put in a “limited state,” and what guidelines render something “controversial”? YouTube doesn’t say. Based on its partnership with the No Hate Speech Movement, the answer could be literally anything anyone finds offensive. Don’t be surprised to discover that, under the new rules, lots of conservative content which is neither violent nor hateful in tone will suddenly be judged controversial and disappeared.