Steve Kovach December 08, 2016 at 07:41AM
Sheryl Sandberg became the latest tech executive to play down the fake news problem on Thursday.
The Facebook COO was asked on Today whether fake news stories spread on the social network had influenced the US election. Her answer was the same as her boss Mark Zuckerberg's: Nope.
But it's also the wrong question.
Of course executives at Facebook and Google are going to say they don't believe fake news on their platforms had any influence on the election, even as they promise to work on the problem. (That claim has been proven false, of course, as everyone from Pizzagate truthers to the president-elect has fallen for fake news stories.)
The better question for tech execs like Sandberg, Zuckerberg, and the rest is this: Do you think large distributors of news media, whether the content is user-generated or not, have a responsibility to vet that content for the truth?
It's a responsibility that the tech community doesn't appear to understand. I spoke with one high-level tech executive this week who told me the vast scale of content being posted online makes it nearly impossible to police for accuracy.
But while that argument makes sense on the surface, it falls flat when you consider that companies like Facebook and Google are able to filter out plenty of other types of content like porn and copyrighted materials from their platforms. They don't have to block people from posting conspiracy theories, but they should have the capability to make sure that content doesn't bubble to the surface and go viral.
It benefits these platforms to allow as much content as possible and deliver it to the people who want to see it. Otherwise, they risk alienating huge swaths of their audience. As CNN's Brian Stelter put it Wednesday at Business Insider's IGNITION conference, if people can't find the content that makes them feel good on Facebook, then Facebook risks losing them to some other site that will peddle that content.
So it's not a question of whether fake news can be tamed. It's a question of whether tech companies want to tame it. Whether they admit it or not, distributing news comes with editorial decisions about what best serves the public.
With such a massive scale comes an equally massive responsibility. And I think we can all agree that that responsibility is to distribute the truth.
The opinions expressed in this article are those of the author.
From Business Insider: "We're asking tech executives the wrong question about fake news" (FB, GOOG), by Steve Kovach