YouTube today expanded its hate-speech policy to ban more white supremacist videos, such as those that promote Nazi ideology. The site is also banning hoax videos that deny that the Holocaust and other well-documented violent events took place.
The move will likely result in bans for many white supremacist YouTubers and other people spreading hateful ideologies.
“Today, we’re taking another step in our hate-speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation, or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation, or veteran status,” YouTube’s announcement said. “This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.”
YouTube, which is owned by Alphabet subsidiary Google and has been under pressure to ban more offensive content, said it will begin enforcing the updated policy today. “However, it will take time for our systems to fully ramp up, and we’ll be gradually expanding coverage over the next several months,” YouTube said.
Some of the videos targeted by YouTube’s new policy do have “value to researchers and NGOs looking to understand hate in order to combat it,” the company said. Because of that, YouTube said it is “exploring options” to make banned videos available to researchers and NGOs in the future.
“[A]s always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events,” YouTube said.
“YouTube did not name any specific channels or videos that would be banned,” The New York Times noted. “But on Wednesday, numerous far-right creators began complaining that their videos had been deleted or had been stripped of ads, presumably a result of the new policy.” Thousands of videos are expected to be removed.
YouTube said it will also try to reduce the spread of what it calls “borderline content.” In January, YouTube “piloted an update of our systems in the US to limit recommendations of borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness or claiming the Earth is flat,” the company said. YouTube said today that it plans to deploy that system in more countries later this year.
YouTube last year started displaying Wikipedia links and other information alongside videos that spread conspiracy theories. The effort to recommend more accurate information will expand, too, YouTube said today.
“[I]f a user is watching a video that comes close to violating our policies, our systems may include more videos from authoritative sources (like top news channels) in the ‘watch next’ panel,” YouTube said.
YouTube employees pushed execs to take action
A Bloomberg report in April said that in recent years, “scores of people inside YouTube and Google… raised concerns about the mass of false, incendiary, and toxic content that the world’s largest video site surfaced and spread.” Employees who talked to Bloomberg on condition of anonymity were frustrated that executives didn’t do more to rid YouTube of false and hateful content.
“The company spent years chasing one business goal above others: ‘Engagement,’ a measure of the views, time spent and interactions with online videos,” Bloomberg wrote. “Conversations with over 20 people who work at, or recently left, YouTube reveal a corporate leadership unable or unwilling to act on these internal alarms for fear of throttling engagement.”
YouTube’s algorithms for deciding which content to promote or de-emphasize have repeatedly caused problems. YouTube lost some advertisers in February because of ads appearing alongside videos shared by pedophiles. YouTube responded by banning many channels and accounts and by disabling comments on tens of millions of videos.
“However, non-predatory videos got caught up in the wave, sparking outrage from large creators and ‘mom and dad’ vloggers who primarily post videos including their children,” we wrote at the time.
YouTube also lost some advertising in 2017 because of the spread of extremist content on the site.
YouTube faced additional criticism in recent days after refusing to ban Steven Crowder, a YouTuber who has repeatedly made homophobic jokes about Vox writer Carlos Maza.
“I’ve been called an anchor baby, a lispy queer, a Mexican, etc.,” Maza wrote on Twitter. “These videos get millions of views on YouTube. Every time one gets posted, I wake up to a wall of homophobic/racist abuse on Instagram and Twitter.”
YouTube defended its decision, telling USA Today this week that “Crowder has not instructed his viewers to harass Maza on YouTube or any other platform.”
“Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site,” YouTube said in a tweet yesterday.
YouTube partially reversed course today, saying it has “suspended this channel’s monetization… because a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies.” The Steven Crowder channel remains on YouTube, however, with 3.8 million subscribers.
Maza accused YouTube of “arming the monsters that we’ve spent our lives trying to get away from.”
In its announcement today about its updated hate-speech policy, YouTube said it is trying to maintain a balance between having an open platform and preventing hatred and harassment.
“The openness of YouTube’s platform has helped creativity and access to information thrive,” YouTube said. “It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination, and violence.”