
Facebook admits that in a world with Donald Trump, it needs to get better at fake news

Published Nov 10th, 2016 5:34PM EST


In a scenario that was nothing more than a punchline even a few months ago, Donald Trump this January will be sworn in as the 45th President of the United States. Yes, the same Donald Trump who hosted a reality show on NBC and stepped into a WWE ring with Stone Cold Steve Austin will soon be the most powerful man in America.


Trump’s surprising victory has elicited a whole lot of outrage on the left, with many trying to take a step back and assess what went wrong. How is it possible, many have been wondering, that a man with no previous political or military experience, a man who is prone to insult any person or thing that stands in his way, was able to come out of nowhere and win the 2016 U.S. Presidential election?

Undoubtedly, political scientists will be studying this election for decades to come, but for now, many on the left appear to be casting blame on Facebook for Trump’s unseemly and sobering rise to power.

In a widely circulated post from NY Mag, Max Read posits that Facebook in particular, and social networks in general, effectively handed the election over to Trump.

The most obvious way in which Facebook enabled a Trump victory has been its inability (or refusal) to address the problem of hoax or fake news. Fake news is not a problem unique to Facebook, but Facebook’s enormous audience, and the mechanisms of distribution on which the site relies — i.e., the emotionally charged activity of sharing, and the show-me-more-like-this feedback loop of the news feed algorithm — makes it the only site to support a genuinely lucrative market in which shady publishers arbitrage traffic by enticing people off of Facebook and onto ad-festooned websites, using stories that are alternately made up, incorrect, exaggerated beyond all relationship to truth, or all three.

It’s an interesting, if simplistic, theory to ponder, particularly because Facebook’s power over which stories in the news cycle pick up steam and which are quickly snuffed out is truly without equal.

Notably, Facebook’s fake news problem became much more pronounced once the company got rid of its human editors and relied upon ill-calibrated algorithms to determine which stories showed up in its feed of trending news stories.

All that said, Facebook at the very least recognizes the unique position it’s in and the power it wields and has promised to do a better job of ferreting out misleading and patently false stories from its feed.

In a statement provided to TechCrunch, the social networking giant said:

We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation. In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.

Of course, how Facebook opts to address these concerns remains to be seen. From what we can gather thus far, the company has no plans to bring back human curators.

In a broader sense, I don’t think it’s clear-cut that fake stories had anything to do with Trump’s election victory. It’s just as possible that the feedback loop of information Facebook inherently provides its users only served to strengthen the resolve of voters who were already leaning in a particular political direction.

As Kim-Mai Cutler astutely points out,

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching improv shows in Chicago, playing soccer, and cultivating new TV show addictions.