Facebook blocks dozens of accounts set up to influence elections

Facebook blocks dozens of fake accounts set up to influence elections around the world by Israeli company that also spent over $800,000 on adverts

  • The Tel Aviv-based Archimedes Group provided election influencing as a service
  • Facebook blocked 65 accounts, 161 pages and 23 groups from their platform
  • The firm had also spent thousands of dollars buying Facebook advertisements
  • They targeted votes in Sub-Saharan Africa, Latin America and Southeast Asia

Facebook has terminated dozens of fake accounts, pages and groups secretly run by an Israeli firm that was coordinating efforts to influence elections across the globe. 

The Tel Aviv-based Archimedes Group used the accounts to spread fake news on behalf of its clients, mainly targeting political groups in Sub-Saharan Africa.

In addition, the company had reportedly spent around £634,000 ($812,000) on Facebook advertisements since late 2012.


Facebook’s head of cybersecurity policy, Nathaniel Gleicher, announced the firm’s detection of the influence campaign during a press conference on May 16, 2019.

The accounts were traced back by Facebook’s internal investigation team to a company known as the Archimedes Group.

In total, Facebook removed 65 accounts, 161 pages and 23 groups operated by Archimedes from its platform for violating its misrepresentation policies.

In addition, four associated accounts were also banned from the Facebook-owned Instagram platform.

Mr Gleicher said that the Archimedes Group had also purchased around £634,000 ($812,000) in Facebook advertisements between December 2012 and April 2019, paid for in a combination of Brazilian reais, Israeli shekels and US dollars.

‘People behind the network used groups of fake accounts to run pages, disseminate content and to artificially increase engagement,’ Mr Gleicher told the Jerusalem Post. 

According to Mr Gleicher, the Tel Aviv-based firm principally posed as political candidates, using Facebook to disseminate fake news concerning elections.

The Archimedes website claims that the group are ‘leaders in large scale campaigns worldwide.’

Using ‘state-of-the-art technologies and innovative methods’, it continues, the company has taken ‘significant roles in many political and public campaigns, among them presidential elections and other social media projects all over the world.’

Archimedes’ stated aim is to ‘change reality according to our client’s wishes.’ 

The majority of the elections targeted by the Archimedes Group were in countries in Sub-Saharan Africa, he said, although there were additional efforts aimed at influencing votes in Latin America and Southeast Asia.

Mr Gleicher said that the Archimedes Group ‘represented themselves as locals.’

This included attempts to pass themselves off as various local news organisations, which published supposedly leaked information concerning political targets.

‘Pages would frequently post about political news, including topics like elections, candidate views and criticisms of candidates’ opponents,’ Mr Gleicher added.

Before they were deleted by Facebook, the pages had reportedly accumulated around 2.8 million followers and racked up hundreds of thousands of views. 

‘Our team has assessed that this group is primarily organised to conduct this kind of deceptive behaviour,’ Mr Gleicher said.

Facebook has removed Archimedes’ presence and is ‘blocking them from coming back,’ he added.

‘That type of business doesn’t have a place on our platform.’


The overall intent behind the Archimedes Group’s widespread influence campaign is not entirely clear.

However, Mr Gleicher said, the company was commercially engaged, ‘appeared to work on behalf of public figures and political figures, working to push positive narratives about them and to push criticism of their political opponents.’

Facebook has been facing increasing pressure to transparently address misinformation campaigns ever since it was revealed that the Russian authorities used the social media platform to influence the 2016 US presidential election.

The announcement of the identification of Archimedes’ influence campaigns comes after a difficult week for Facebook, following revelations that the firm’s WhatsApp messaging service contained a major vulnerability that left users open to external hacking.


Following the shock November 2016 US election results, Mark Zuckerberg claimed: ‘Of all the content on Facebook, more than 99 per cent of what people see is authentic’.

He also cautioned that the company should not rush into fact-checking. 

But Zuckerberg soon came under fire after it emerged fake news had helped sway the election results.

In response, the company rolled out a ‘Disputed’ flagging system that it announced in a December 2016 post. 

The system meant users were responsible for flagging items that they believed were fake, rather than the company.

In April 2017, Facebook suggested the system had been a success. 

It said that ‘overall false news has decreased on Facebook’ – but did not provide any proof.

‘It’s hard for us to measure because we can’t read everything that gets posted’, it said. 

But it soon emerged that Facebook was not providing the full story. 

In July 2017, Oxford researchers found that ‘computational propaganda is one of the most powerful tools against democracy,’ and Facebook was playing a major role in spreading fake news.

In response, Facebook said in August 2017 that it would ban pages that post hoax stories from advertising on the platform.

In September 2017, Facebook finally admitted during congressional questioning that a Russian propaganda mill had placed adverts on Facebook to sway voters during the 2016 campaign.

In December 2017, Facebook admitted that its flagging system for fake news was a failure.

Since then, it has used third-party fact-checkers to identify hoaxes, and then given such stories less prominence in the Facebook News Feed when people share links to them.

In January 2018, Zuckerberg said Facebook would prioritise ‘trustworthy’ news by using member surveys to identify high-quality outlets.

Facebook has now quietly begun ‘fact-checking’ photos and videos to reduce fake news stories. However, the details of how it is doing this remain unclear. 

