Facebook staff call to keep 'news censorship' measures in place

Facebook employees want to keep in place news censorship measures to tackle the spread of ‘misinformation’ following the US election – even if they make the platform more boring

  • Facebook added new ‘news censorship measures’ at the end of the US election 
  • They were designed to slow the spread of misinformation and ‘fake news’
  • They involved adding weight to stories published by mainstream publications 
  • Facebook staff have called for these measures to remain in place going forward 

Facebook staff want to keep in place the news censorship measures that were brought in to stop the spread of ‘misinformation’ following the US election – even if they make the social network less engaging, the New York Times reports.

In the immediate aftermath of the contested US election, Facebook implemented a change to its news feed algorithm that favoured large general news publishers.

This meant that sites like CNN, the BBC, the New York Times and NPR saw a spike in views, while more partisan sites like Breitbart and Occupy Democrats took a hit.

The post-election changes were designed to be temporary, but in a series of ‘heated internal debates’ seen by the New York Times, staff argued for the measures to remain in place – even if they resulted in people spending less time on the site. 

In the days following the election, Facebook staff presented CEO Mark Zuckerberg with evidence that misinformation was going viral on his website.

To tackle this, the team made a change to the news feed algorithm that gave more weight to sites ranked highly by its own internal ‘news ecosystem quality’ (NEQ) score.  

Very little is known about this internal ranking, beyond the fact that it is assigned to all news publishers based on signals about the quality of their journalism.

According to the New York Times report, this score plays a major role in what users see on their news feed – and its influence increased after the changes Facebook made in the aftermath of the election.
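
Facebook has not explained exactly how the NEQ score feeds into ranking. The reported behaviour, where turning up a quality weighting boosts trusted publishers relative to purely engagement-driven ranking, can be illustrated with a purely hypothetical Python sketch. The field names, example scores and the neq_weight dial below are assumptions for illustration, not details of Facebook’s actual system.

```python
# Illustrative sketch only: Facebook has not published how NEQ affects
# ranking. All field names, scores and weights here are assumptions.

def rank_posts(posts, neq_weight=1.0):
    """Order candidate posts for a hypothetical news feed.

    Each post has a predicted 'engagement' value (0-1) and the publisher's
    'neq' quality score (0-1). Raising neq_weight boosts high-quality
    publishers relative to a purely engagement-driven ordering, which is
    the kind of dial the post-election change reportedly turned up.
    """
    def score(post):
        return post["engagement"] * (1.0 + neq_weight * post["neq"])

    return sorted(posts, key=score, reverse=True)


posts = [
    {"id": "partisan-clip", "engagement": 0.9, "neq": 0.2},
    {"id": "wire-report", "engagement": 0.6, "neq": 0.9},
]

# With a small weight the engaging partisan clip ranks first; with a
# larger weight the higher-quality wire report overtakes it.
print([p["id"] for p in rank_posts(posts, neq_weight=0.5)])  # partisan-clip first
print([p["id"] for p in rank_posts(posts, neq_weight=3.0)])  # wire-report first
```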

The NEQ weighting was supposed to revert to normal following the ‘tense election period’, but staff reportedly said on internal discussion boards that the change had made for a ‘nicer news feed’.

Guy Rosen, the Facebook executive in charge of cleaning up the platform, told reporters during a call that there were never plans to make the changes permanent.

The company said it was an ‘experiment’ that would be rolled back to the way things were before, but that it would study and learn from the findings. No date has been given for the rollback.

‘There are tensions in virtually every product decision we make,’ Facebook spokesman Joe Osborne told the New York Times.

The incident highlights some of the ongoing battles between staff and executives, who have clashed over whether the platform should put quality of content over growth.

Osborne said that Facebook had ‘developed a companywide framework called “Better Decisions” to ensure we make our decisions accurately, and that our goals are directly connected to delivering the best possible experiences for people.’

Facebook employees have been taking to internal discussion forums while working from home to express anger and discontent at some of the decisions by executives.

This includes Zuckerberg’s decision that a post by President Donald Trump, saying ‘when the looting starts, the shooting starts’ during the Black Lives Matter protests, didn’t break the rules.

It was one of a number of spats between staff and executives, the New York Times discovered, including over how to tackle misinformation.

Staff reported that there was often a trade-off between tackling the issue and not angering powerful partisans or damaging Facebook’s growth.  

In one experiment, data scientists and engineers at the social media giant developed a system that ranked posts as ‘good for the world’ or ‘bad for the world’, and found that a majority of the posts with the highest reach were deemed ‘bad for the world’.

They used AI to automatically demote posts in the news feed deemed ‘bad for the world’, but it led to a drop in the number of times users opened Facebook. 

‘The results were good except that it led to a decrease in sessions, which motivated us to try a different approach,’ according to a review seen by the New York Times. 

Eventually executives approved a change that used the ‘bad for the world’ system at a lower sensitivity setting, so that fewer posts were demoted and more remained visible.
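
Neither the classifier nor its settings are public, but the trade-off described, where a lower sensitivity means fewer posts are demoted, can be sketched with a few hypothetical lines of Python. The bad_score field, the threshold and the demotion factor below are illustrative assumptions, not details from Facebook.

```python
# Illustrative sketch only: the 'bad for the world' classifier and its
# settings have not been published. All values here are assumptions.
import copy

def demote_bad_for_world(posts, threshold=0.5, demotion_factor=0.5):
    """Down-rank posts that a classifier flags as 'bad for the world'.

    Each post carries a hypothetical 'bad_score' between 0 and 1. Posts at
    or above the threshold have their ranking score cut by demotion_factor.
    Raising the threshold (a lower sensitivity, as reportedly approved)
    means fewer posts are demoted.
    """
    ranked = copy.deepcopy(posts)  # avoid mutating the caller's data
    for post in ranked:
        if post["bad_score"] >= threshold:
            post["rank_score"] *= demotion_factor
    return sorted(ranked, key=lambda p: p["rank_score"], reverse=True)


feed = [
    {"id": "outrage-bait", "rank_score": 0.9, "bad_score": 0.7},
    {"id": "local-news", "rank_score": 0.6, "bad_score": 0.1},
]

# At the original sensitivity the flagged post is demoted below the other;
# with a higher threshold ('lower sensitivity') it keeps its top spot.
print([p["id"] for p in demote_bad_for_world(feed, threshold=0.5)])
print([p["id"] for p in demote_bad_for_world(feed, threshold=0.8)])
```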

Another tool, designed to label a story as ‘fake news’ if branded as such by third-party fact-checking services, was rejected by executives because it would primarily have affected right-wing publishers, according to sources familiar with the decision.

A Facebook executive said the people who spoke to the New York Times had no decision-making authority, and that the company regularly reviews its options.

‘The question is, what have they learned from this election that should inform their policies in the future?’ Vanita Gupta, chief executive of the civil rights group the Leadership Conference on Civil and Human Rights, told the New York Times.

‘My worry is that they’ll revert all of these changes despite the fact that the conditions that brought them forward are still with us.’    
