YouTube has announced a move to reduce the amount of hateful content it hosts. In a statement released today, the online video platform giant said it is ‘specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion.’
YouTube has long been reluctant to remove racist, sexist or homophobic content, preferring to limit the spread of such videos by not recommending them and not allowing advertisements on them. Until now, objectionable videos would only be removed if they explicitly promoted violence.
The move also aims to stop the spread of conspiracy theories and denialism, which have been widespread in the aftermath of many recent violent events, such as the Christchurch terror attack.
“We will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place,” the statement said. “We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future.”
However, many have seen the move by YouTube as self-serving and inconsistent. The company claims it will take several months for its systems to implement the anti-hate changes, which seems at odds with YouTube's ability to remove videos that breach copyright almost immediately.
The announcement also comes during a week in which YouTube found that alt-right vlogger Steven Crowder did not violate its terms by harassing Vox host Carlos Maza with racist and homophobic slurs in his recent videos. YouTube said that while it found Crowder’s language “clearly hurtful,” it did not violate its policies.
"Our teams spent the last few days conducting an in-depth review of the videos flagged to us, and while we found language that was clearly hurtful, the videos as posted don’t violate our policies," YouTube said in a statement on Wednesday.