Facebook will use publisher consensus as metric for ‘non-fake’ news
Thu 26 Jan 2017

In the light of intense criticism over Facebook’s propensity to popularise ‘fake news’ – seen by many as a contributory factor to the outcome of the recent U.S. election – the social network has announced that it will soon begin to consider broad interest from major publishers as a critical index of whether a news story is valid or not – and whether it rises to the top in users’ feeds.
Will Cathcart, Facebook’s VP of Product Management, has posted that those all-important stories ‘bubbling up’ to the top of news will henceforth also carry an indication of where the linked story was published – a practice Google News has always adhered to – and, most importantly, that stories receiving wide coverage across a group of major news outlets will be favoured.
According to Cathcart, the attribution feature was the most requested in the wake of last August’s revisions to the Facebook news algorithms. Only three days after removing its human-led curation team last summer, Facebook was accused, not least by the Washington Post, of automatically propagating fake news stories as a result of using crowd-driven metrics to define news topics of interest. It seemed that the price of curbing clickbait was…well, anything resembling the truth.
‘The headline that appears,’ writes Cathcart, ‘is automatically selected based on a combination of factors including the engagement around the article on Facebook, the engagement around the publisher overall, and whether other articles are linking to it.’ Effectively this means the number of likes and shares around the specific post; how well-followed the publisher is on Facebook; and – the factor least changed from established practice – the classic SEO metric of inbound links.
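Facebook hasn’t published the actual formula, but a rough sketch of how those three signals might be blended could look something like the following – the field names, weights and log-scaling here are purely illustrative assumptions, not Facebook’s implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class Article:
    likes: int                # engagement on the specific Facebook post
    shares: int
    publisher_followers: int  # how well-followed the publisher is on Facebook
    inbound_links: int        # the classic SEO metric: other articles linking in

def headline_score(a: Article,
                   w_engagement: float = 0.5,
                   w_publisher: float = 0.3,
                   w_links: float = 0.2) -> float:
    """Toy weighted blend of the three signals Cathcart lists.
    Weights and log-scaling are illustrative guesses only."""
    engagement = math.log1p(a.likes + a.shares)
    publisher = math.log1p(a.publisher_followers)
    links = math.log1p(a.inbound_links)
    return w_engagement * engagement + w_publisher * publisher + w_links * links

# Of all the articles on a topic, the highest-scoring one would supply
# the headline (and now the publisher attribution) shown to users.
```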
In December of 2016, in the face of severe criticism over fake news during the election campaign, Facebook announced a partnership with third-party fact-checkers such as Snopes and PolitiFact, among other measures to remedy the problem. But returning to a system that evaluates veracity on the basis of major publisher interest suggests a step back towards rather more 20th-century methods of establishing truth in the media.
Commenting on the new (or is it restored?) value placed on publisher consensus of interest in a story, Cathcart writes:
‘Previously, topics may have trended due to high engagement on Facebook around a single post or article. With today’s update, we will now look at the number of publishers that are posting articles on Facebook about the same topic, and the engagement around that group of articles. This should surface trending topics quicker, be more effective at capturing a broader range of news and events from around the world and also help ensure that trending topics reflect real world events being covered by multiple news outlets.’
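Again, Facebook hasn’t disclosed the mechanics, but the shift Cathcart describes – from the engagement on a single viral post to the breadth of publishers covering a topic – could be sketched roughly as below; the data shape and scoring blend are invented for illustration.

```python
import math
from collections import defaultdict
from typing import Dict, Iterable

def trending_scores(articles: Iterable[dict]) -> Dict[str, float]:
    """Toy version of the 'publisher consensus' idea: group articles by
    topic, count the distinct publishers covering each topic, and weigh
    the engagement of the whole group rather than any single post.
    Field names and the scoring blend are assumptions, not Facebook's code."""
    by_topic = defaultdict(lambda: {"publishers": set(), "engagement": 0})
    for art in articles:
        bucket = by_topic[art["topic"]]
        bucket["publishers"].add(art["publisher"])
        bucket["engagement"] += art["likes"] + art["shares"]

    scores = {}
    for topic, b in by_topic.items():
        # Breadth of coverage dominates: a topic covered by many outlets
        # outranks one driven by a single highly-engaged post.
        scores[topic] = len(b["publishers"]) * math.log1p(b["engagement"])
    return scores
```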
So, basically, a return to the pre-internet status quo of presuming authority and a lack of bias among a distinct and limited set of highly organised and well-funded news outlets, whose demonstrable common interest in a story will, for Facebook, outweigh the potential influence of their own agendas and political philosophies.
It’s also a massive demotion for citizen-led journalism, based on the bad behaviour of irresponsible and cynical opportunists riding a promising, decentralising trend. But I guess this is why we can’t have nice things.