After more than a week of accusations that the spread of fake news on Facebook may have affected the outcome of the presidential election, Mark Zuckerberg published a detailed post Friday night describing ways the company is considering dealing with the problem.
Zuckerberg, Facebook’s chairman and chief executive, broadly outlined some of the options he said the company’s News Feed team was looking into, including third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content.
“The problems here are complex, both technically and philosophically,” Zuckerberg wrote. “We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible.”
The post was perhaps the most detailed glimpse into Zuckerberg’s thinking on the issue since Donald J Trump’s stunning defeat of Hillary Clinton in the November 8 election. Within hours of his victory being declared, Facebook was accused of affecting the election’s outcome by failing to stop bogus news stories, many of them favourable to Trump, from proliferating on the social network. Executives and employees at all levels of the company have since been debating its role and responsibilities.
Facebook initially tried to minimise concerns about the issue, with Zuckerberg calling the notion that the company swayed the election “a pretty crazy idea” at a technology conference last week. In a follow-up Facebook post, he said that less than one percent of news posted to Facebook was false.
But questions continued from outside the company, with some complaining that it was being too dismissive of its capacity to affect public opinion. In a news conference in Berlin on Thursday, President Obama decried the spread of misinformation on Facebook and other platforms.
Zuckerberg came to no conclusions in his post Friday, instead providing a list of possible solutions the company is exploring. One option, he said, could be attaching warnings to news articles shared on Facebook that have been flagged as false by reputable third parties or by Facebook users. Another could be making it more difficult for websites to make money from spreading misinformation on Facebook, he said.
Zuckerberg made it clear that Facebook would take care to avoid looking or acting like a media company, a label it has frequently resisted.
“We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content,” Zuckerberg wrote. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.”
© 2016 The New York Times News Service