Social media giant set to roll out new system that will bury more news content
Posts from a private Facebook forum obtained by Infowars feature a Facebook employee assuring another user that Infowars and Breitbart won’t be able to influence the company’s new ‘trust’ algorithm to make their content appear more regularly in people’s Facebook feed.
Facebook recently announced plans to revert to a system in which users are more likely to see content posted by their close friends and family instead of breaking news.
“The site is also requesting data from users to discover which news sources people trust and want to see posts from,” reported the Metro.
As part of the process of deciding which sources are trustworthy, Facebook appears keen to ensure that the likes of Infowars and Breitbart don't perform well, if chat logs from a closed Facebook group are anything to go by.
The conversation, which was sent to Infowars by a member of the private "News, Media & Publishing on Facebook" group, shows one user asking a Facebook employee how the company will prevent followers of Infowars and Breitbart from organizing to have the two sites treated by Facebook as credible sources.
“Let’s assume those players launch a campaign to mark their sites as credible, Facebook should easily be able to see if these voters are highly partisan already,” writes Matthew Karolian. “They should be able to see the amount of time that user spends on sites like Infowars, Breitbart, RT, etc, by using Facebook pixel data. It seems like a fairly trivial task to rank those votes accordingly,” he adds, suggesting that people who declare Infowars and Breitbart to be credible sources should be punished by the algorithm by having their votes count for less.
Facebook employee Jason White responded: “If we had a URL sitting out there where people could vote, then that would be very easy to game indeed. But because we’re randomly sampling people on an ongoing basis, the risks of gaming go down. Also, the people most likely to be influenced by an infowars campaign to mark their site as trustworthy (e.g. their fans on facebook or twitter or any other place), probably already would mark infowars as credible. It’s neither easy nor cheap to change public opinion broadly.”
Whether Facebook truly is sampling random people to determine the credibility of news sources, or whether it is in fact targeting people with left-wing sympathies who are more likely to downvote sites like Infowars and Breitbart, remains to be seen.
Given that Facebook was caught gaming its own system in order to keep conservative news content from appearing in the site’s ‘trending’ section, it would be somewhat naive to believe that the new system will be a level playing field.
In 2016, it was revealed that Facebook “routinely” suppressed conservative news content, limiting its ability to go viral.
Earlier this month, Project Veritas revealed, via secret recordings of current and former employees, that Twitter was also deliberately targeting conservatives for censorship.