Facebook founder Mark
Zuckerberg addresses the audience during a meeting of the APEC (Asia-Pacific
Economic Cooperation) CEO Summit in Lima, Peru, on Nov. 19, 2016. Photo by
Mariana Baz/Reuters
After a week of accusations that fake news posts influenced
the outcome of the presidential race, Facebook founder Mark Zuckerberg
maintained Friday that “the percentage of misinformation is relatively small,”
but outlined how he is working to mitigate it.
In a Facebook post, Zuckerberg mentioned prospects of stronger
detection, options for users to flag potentially fake stories and also a
warning system that would label them, as well as the possibility of raising the
standards for articles that are suggested as related after each post. He described
the issue as complex because it puts Facebook in the position of arbitrating free
speech.
“We need to be careful not to discourage sharing of opinions
or to mistakenly restrict accurate content. We do not want to be arbiters of
truth ourselves, but instead rely on our community and trusted third parties,” he wrote.
“While the percentage of misinformation is relatively small, we have much more
work ahead on our roadmap.”
His statement came after media outlets reported on allegations by critics that
fake news helped President-elect Donald Trump win the election. Initially,
Zuckerberg dismissed the concept as a “pretty crazy
idea.”
Then he said in a longer statement that more than 99 percent
of what people see is authentic, and that the other 1 percent is not always a
political hoax.
“Overall, this makes it extremely unlikely hoaxes changed the
outcome of this election in one direction or the other,” he wrote
in a Facebook post on Nov. 12.
But Facebook’s top executives have still been questioning
their role in the outcome.
And the power of fake news on Facebook is hard to gauge.
A Pew Research Center survey estimated that 44 percent
of U.S. adults get news from Facebook, which the survey also found is
by far the largest social networking site, reaching 67 percent of U.S. adults.
And an analysis by BuzzFeed found that false election stories outperformed
real news in engagement during the final three months before the
election.
President Barack Obama said in Germany on Thursday that “If
we are not serious about facts and what’s true and what’s not, and particularly
in an age of social media, where so many people are getting their information
in sound bites and snippets off their phones, if we can’t discriminate between
serious arguments and propaganda, then we have problems.”
A prolific fake news writer, who once made up a story that Trump protesters
were paid by saboteurs, shared his perspective with the Washington Post.
“His campaign manager posted my story about a protester
getting paid $3,500 as fact. Like, I made that up. I posted a fake ad on
Craigslist,” he said. “I think
Trump is in the White House because of me.”
Zeynep Tufekci, an associate professor at the University of
North Carolina who studies the social impact of technology, also told the New York
Times that there is no denying the influence of fake news.
“A fake story claiming Pope Francis — actually a refugee
advocate — endorsed Mr. Trump was shared almost a million times, likely visible
to tens of millions,” Tufekci said of a post on Facebook. “Its correction was barely heard. Of course
Facebook had significant influence in this last election’s outcome.”
Zuckerberg’s statement on Friday was the first to outline
potential steps toward addressing the problem. He published it after landing in
Lima, Peru, for an Asia-Pacific Economic Cooperation summit. He gave a keynote
address there on Saturday, where he mostly talked about expanding access to the
internet. But he also touched on the fake news issue.
“We can work to give people a voice, but we also need to do
our part to stop the spread of hate, and violence, and misinformation,” he said.
You can read Zuckerberg’s full statement below.
“A lot of you have asked what we're doing about
misinformation, so I wanted to give an update.
The bottom line is: we take misinformation seriously. Our goal
is to connect people with the stories they find most meaningful, and we know
people want accurate information. We've been working on this problem for a long
time and we take this responsibility seriously. We've made significant
progress, but there is more work to be done.
Historically, we have relied on our community to help us
understand what is fake and what is not. Anyone on Facebook can report any link
as false, and we use signals from those reports along with a number of others
-- like people sharing links to myth-busting sites such as Snopes -- to
understand which stories we can confidently classify as misinformation. Similar
to clickbait, spam and scams, we penalize this content in News Feed so it's
much less likely to spread.
The problems here are complex, both technically and
philosophically. We believe in giving people a voice, which means erring on the
side of letting people share what they want whenever possible. We need to be
careful not to discourage sharing of opinions or to mistakenly restrict
accurate content. We do not want to be arbiters of truth ourselves, but instead
rely on our community and trusted third parties.
While the percentage of misinformation is relatively small, we
have much more work ahead on our roadmap. Normally we wouldn't share specifics
about our work in progress, but given the importance of these issues and the
amount of interest in this topic, I want to outline some of the projects we
already have underway:
- Stronger detection. The most important thing we can do is
improve our ability to classify misinformation. This means better technical
systems to detect what people will flag as false before they do it themselves.
- Easy reporting. Making it much easier for people to report
stories as fake will help us catch more misinformation faster.
- Third party verification. There are many respected fact
checking organizations and, while we have reached out to some, we plan to learn
from many more.
- Warnings. We are exploring labeling stories that have been
flagged as false by third parties or our community, and showing warnings when
people read or share them.
- Related articles quality. We are raising the bar for stories
that appear in related articles under links in News Feed.
- Disrupting fake news economics. A lot of misinformation is
driven by financially motivated spam. We're looking into disrupting the
economics with ads policies like the one we announced earlier this week, and
better ad farm detection.
- Listening. We will continue to work with journalists and
others in the news industry to get their input, in particular, to better
understand their fact checking systems and learn from them.
Some of these ideas will work well, and some will not. But I
want you to know that we have always taken this seriously, we understand how
important the issue is for our community and we are committed to getting this
right.”