Defying pollsters and political pundits across the country, Donald Trump handily won the 2016 U.S. Presidential election and will be sworn in as the 45th President of the United States in just about two months. Trump’s surprising victory over Hillary Clinton was met with utter shock and disbelief in liberal circles, prompting many to wonder how a candidate with zero political experience and a penchant for making inflammatory remarks managed to inspire and capture the attention of so many Americans.
In the aftermath, an increasingly popular narrative holds that many Americans were swayed, if not outright duped, into voting for Trump by fake stories on Facebook. As the argument goes, Facebook has done a poor job of identifying and removing fake stories from its News Feed (especially anti-Hillary Clinton stories), thereby exposing the site’s vast user base to what ultimately amounts to propaganda.
When this issue was first thrust into the spotlight last week, Facebook initially took a diplomatic if not conciliatory tone, admitting that it needs to do a better job of removing fake news stories. Since then, however, Facebook CEO Mark Zuckerberg has been much more defensive about Facebook’s role in the election, going so far as to say that the notion that Facebook influenced the election in favor of Trump was “crazy.”
Over the weekend, Zuckerberg posted a lengthy Facebook note on the topic. While admitting that he doesn’t want Facebook to house internet hoaxes masquerading as legitimate news stories, the Facebook CEO said that more than 99% of all news stories users see on the site are “authentic.” What’s more, Zuckerberg explained that the distribution of fake news stories is about even across partisan lines.
Zuckerberg’s post can be read in its entirety below:
I want to share some thoughts on Facebook and the election.
Our goal is to give every person a voice. We believe deeply in people. Assuming that people understand what is important in their lives and that they can express those views has driven not only our community, but democracy overall. Sometimes when people use their voice though, they say things that seem wrong and they support people you disagree with.
After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right. I want to do my best to explain what we know here.
Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.
That said, we don’t want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.
This is an area where I believe we must proceed very carefully though. Identifying the “truth” is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
As we continue our research, we are committed to always updating you on how News Feed evolves. We hope to have more to share soon, although this work often takes longer than we’d like in order to confirm changes we make won’t introduce unintended side effects or bias into the system. If you’re interested in following our updates, I encourage you to follow our News Feed FYI here: http://bit.ly/2frNWo2.
Overall, I am proud of our role giving people a voice in this election. We helped more than 2 million people register to vote, and based on our estimates we got a similar number of people to vote who might have stayed home otherwise. We helped millions of people connect with candidates so they could hear from them directly and be better informed. Most importantly, we gave tens of millions of people tools to share billions of posts and reactions about this election. A lot of that dialog may not have happened without Facebook.
This has been a historic election and it has been very painful for many people. Still, I think it’s important to try to understand the perspective of people on the other side. In my experience, people are good, and even if you may not feel that way today, believing in people leads to better results over the long term.
It’s an interesting take, but there’s more to Zuckerberg’s claims than meets the eye. In a rebuttal post, Mike Caulfield argues that cold hard data doesn’t support Zuckerberg’s position. If you’re at all interested in this ongoing debate, the rebuttal post can be read over here.
On a related note, it stands to reason that most individuals prone to believing a hyperbolic news story pitched at an extreme partisan position already have their minds made up. Arguably, Facebook in this instance isn’t so much influencing the voting patterns of Americans as it is surfacing a prime manifestation of confirmation bias.