Last week, Facebook invited several media outlets to an event to hear what the company plans to do about misinformation disseminated on its platform.
But many journalists, including CNN’s Oliver Darcy, were left dissatisfied with Facebook’s response.
Facebook invited me to an event today where the company aimed to tout its commitment to fighting fake news and misinformation.
I asked them why InfoWars is still allowed on the platform.
I didn’t get a good answer. https://t.co/WwLgqa6vQ4
— Oliver Darcy (@oliverdarcy) July 12, 2018
So why won’t Facebook ban sites that peddle obviously false information, like InfoWars?
In a Wednesday interview with Recode’s Kara Swisher, CEO Mark Zuckerberg said that Facebook draws a distinction between information that is objectively false and words that are meant to incite physical violence or “attack individuals” verbally.
“There are really two core principles at play here,” he said. “There’s giving people a voice, so that people can express their opinions. Then there’s keeping the community safe, which I think is really important. We’re not gonna let people plan violence or attack each other or do bad things. Within this, those principles have real trade-offs and real tug on each other. In this case, we feel like our responsibility is to prevent hoaxes from going viral and being widely distributed.”
Zuckerberg went on to explain that Facebook would demote content flagged as a “potential hoax,” limiting its spread across the site rather than removing it outright.
“Look, as abhorrent as some of this content can be, I do think that it gets down to this principle of giving people a voice,” he continued.
“Even if it’s a hoax?” Swisher asked.
“Yeah,” Zuckerberg said. “I mean, at some level, it’s hard to always have a clear line between… I’m not defending any specific content here. I think a lot of the content that’s at play is terrible. I think when you get into discussions around free speech, you’re often talking at the margins of content that is terrible and what should… but defending people’s right to say things even if they can be bad.”
He added that Facebook is “moving toward the policy of misinformation that is aimed at or going to induce violence, we are going to take down because that’s basically… The principles that we have on what we remove from the service are, if it’s going to result in real harm, real physical harm, or if you’re attacking individuals, then that content shouldn’t be on the platform. There’s a lot of categories of that that we can get into, but then there’s broad debate.”
But Swisher challenged Zuckerberg by noting that InfoWars’ repeated claims that the Sandy Hook shooting was staged are lies and “not a debate.” Zuckerberg agreed that “it is false.” He added, however, that if a Facebook user attacks a Sandy Hook victim or a victim’s family member, “that is harassment, and we actually will take that down.”
Zuckerberg said that Facebook would even allow Holocaust deniers “because I think there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong.” He seemed to suggest that people who hold such viewpoints are somehow innocently misguided.
“It’s hard to impugn intent and to understand the intent,” Zuckerberg continued. “I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly. I’m sure you do. I’m sure a lot of leaders and public figures we respect do, too, and I just don’t think that it is the right thing to say, ‘We’re going to take someone off the platform if they get things wrong, even multiple times.’ What we will do is we’ll say, ‘OK, you have your page, and if you’re not trying to organize harm against someone or attacking someone, then you can put up that content on your page, even if people might disagree with it or find it offensive.’ But that doesn’t mean that we have a responsibility to make it widely distributed in News Feed.”
Swisher did not ask Zuckerberg what he thinks about Facebook profiting via ads from websites that traffic in obviously fake news.