Facebook will now rank news sources based on how 'trustworthy' readers think they are

Facebook is going to start ranking news sources by how “trustworthy” its users think they are — a huge change as the social media giant continues to come under fire over the spread of misinformation on its platform.

Make no mistake, they are going to control the narrative, snuff out Conservatives, and that will be that. Our sister site Project Republic currently has ZERO reach when any post is made on Facebook. Tell me again how much you support net neutrality when you pull this?

On Friday, the California giant announced in a blog post that it is modifying its News Feed ranking algorithm to prioritize three factors: whether the news is considered “trustworthy,” whether it is “informative,” and whether it is “relevant to people’s local community.”

Facebook won’t be judging the trustworthiness of news outlets itself. Instead, News Feed head Adam Mosseri said that users are being polled on which outlets they consider trustworthy, and that data will be used to rank outlets.

“We surveyed a diverse and representative sample of people using Facebook across the US to gauge their familiarity with, and trust in, various different sources of news. This data will help to inform ranking in News Feed,” he wrote.

But there are already concerns that this could prioritize partisan sources of information. A right-wing user polled might consider CNN extremely untrustworthy but rate a right-wing blog far more highly — even if CNN is, in reality, a more accurate source of information about current affairs.

Earlier in 2018, Facebook announced it would be making major changes to the News Feed to prioritize updates from friends and family, while de-emphasizing news and brands, to try to foster what CEO Mark Zuckerberg described as “meaningful interaction.”

The changes came after Facebook — and the broader tech industry — came under a barrage of criticism over its impact on society, from its role in spreading Russian propaganda and misinformation during the 2016 US presidential election to its impact on children’s mental health.

“We feel a responsibility to make sure our services aren’t just fun to use, but also good for people’s well-being,” Zuckerberg wrote in a blog post earlier in January 2018.

On Friday, the chief exec also published a post outlining the change to prioritize “trustworthiness,” and the rationale behind it. Facebook said it isn’t “comfortable” assessing the trustworthiness of news outlets itself, and asking outside experts wouldn’t be “objective.” So it views community feedback as the most suitable method.

“The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking,” he wrote.

“We decided that having the community determine which sources are broadly trusted would be most objective.”

And to think that hundreds of conservative news sites — including this one — spent money on Facebook advertisements.

