Facebook denies filtering conservative news stories

Engadget 10/05/2016 Timothy J. Seppala

Even if your Facebook News Feed is full of family members dropping racist memes or links to factually inaccurate articles, you might not see that kind of material in the "trending news" portion of the social network's landing page. There's a reason for that: workers "routinely suppressed" news stories of interest to conservative users from the section, according to a report from Gizmodo. Those stories reportedly included anything about the Conservative Political Action Conference, two-time Republican presidential hopeful Mitt Romney and posts from conservative news outlet The Drudge Report.

More than that, it appears Facebook wouldn't curate a story from a conservative outlet (Breitbart, for example) unless it was picked up by The New York Times or the BBC first. Facebook's company line is that it "takes allegations of bias very seriously" in light of the Gizmodo report, claiming "rigorous guidelines" to ensure consistency and neutrality, and that those guidelines don't "permit the suppression of political perspectives." It's worth noting, though, that the sources for these allegations were contract workers -- not full-time employees. These contractors worked for Facebook from the middle of 2014 until December 2015.

What appears in the Trending News module isn't determined exclusively by an algorithm tracking what users are actively sharing; it's curated much the way an editorial newsroom operates. One of Gizmodo's sources -- who leans politically conservative -- says that what populated the list was largely determined by who was working at the time. If that person happened not to subscribe to conservative points of view, a story could be blacklisted. And if a particular story was trending on Twitter but not Facebook? It was "injected" into the Trending News section. Specific instances of that include the Black Lives Matter conversation and the ongoing conflict in Syria.

This isn't the first time Facebook has come under fire for manipulating what users see. In 2014, the company admitted that it had controversially, and experimentally, altered the News Feed to measure users' emotional responses.
