
Facebook exec's defense of the 'drunk Pelosi' video doesn't add up


All the noise around the "drunk Pelosi" video has made something clear: Facebook wants to have it both ways.

Some at the company would have us all believe that great strides have been made in the platform's ongoing fight against misleading "fake news" content. But Facebook also won't take the step of removing material like the doctored Pelosi clip, preferring instead to let users make their own choice about what to believe.

That's the takeaway from comments made by Monika Bickert, Head of Global Policy Management at Facebook, in an interview with CNN's Anderson Cooper.

"We think it's important for people to make their own, informed choice about what to believe," she said. Facebook works with independent fact-checking organizations to identify misleading content and flag it accordingly. So the company knows what's fake. It just won't take the step of removing such content.

That's what happened with a video of House Speaker Nancy Pelosi that was doctored to make it appear as if she was drunk or in some other way impaired. Fact-checkers flagged it as "false," a designation that earns the video a captioned warning and a reduced presence in News Feeds.

The video won't be deleted entirely because it doesn't violate community standards. Facebook does actively remove content that incites or promotes violence, since that material does break the rules. The company has also shown a willingness to ban fake accounts and to de-platform problematic figures who repeatedly violate the site's rules.

"We think it's important for people to make their own, informed choice about what to believe."

In the CNN interview, Cooper asks Bickert again and again why Facebook wouldn't just delete content that's been flagged as false. She returns again and again to that idea of letting users decide on their own what to trust. Misleading content is flagged as such, and that's supposed to be enough.


Is it, though? There's evidence that even flagged and downranked content can enjoy considerable reach on Facebook. Perhaps that's because certain individuals have worked hard to promote the message that the media as a whole is an enemy that shouldn't be trusted. People believe what they want to believe in the current climate, to the point that a "misleading content" tag could be read as a positive among certain readers and belief systems.

Cooper tries to address that in his chat with Bickert, pointing out that "the video is more powerful than whatever you're putting under the video." Bickert deflects, suggesting that the video is OK to keep around because the conversation around it has shifted to questions like the one Cooper posed.

"Well actually what we're seeing is that the conversation on Facebook, on Twitter, offline as well, is about this video having been manipulated," she said. "As evidenced by my appearance today, this is the conversation."

Later in the segment, Cooper presses Bickert on Facebook's responsibility to accuracy as a provider of news. She pushes back, pointing out that the company is a social media business, not a news business. When Cooper presses her yet again -- "you're sharing news ... because you make money from it," he argues -- Bickert draws a line between rules-violating violent content and political discourse.

SEE ALSO: Whistleblower says Facebook's algorithms generate extremist videos

"If it's misinformation that's related to safety, we can and we do remove it. And we work with safety groups to do that. But when we're talking about political discourse and the misinformation around that, we think the right approach is to let people make an informed choice," she said.

What a stunning point to make when we're barely a month removed from the release of the almost 500-page Mueller report, roughly half of which focuses on Russian efforts to influence the 2016 U.S. presidential election. As we now know, a big portion of those efforts involved exploiting social media platforms like Facebook.

It is by now a proven fact that political misinformation can have harmful effects. Proven beyond a shadow of a doubt. It's great to see Facebook taking action against the kinds of fake accounts that help spread the bad stuff around, but that's only a half-measure. Plenty of real people are taken in by misinformation across the internet, which they then share on social media.

This isn't the first time Facebook has leaned on policy to defend the presence of inappropriate content on the platform. But it's an increasingly hard pill to swallow as suspicions about the negative impact of disinformation on political discourse are proven correct again and again. Maybe it's time for Facebook to change the company line.

