Internet giants grilled by MPs after they delete 30 million posts

Facebook, Google and Twitter are forced to remove 30 million illegal and offensive posts in just three months – including 20 million images of adult nudity

  • Facebook said it removed three million posts under hate speech rules 
  • MPs questioned how the firm would react to fake news during elections
  • Twitter deleted 1 million accounts which had posted terrorist content since 2015
Facebook, Google and Twitter have been forced to delete more than 30 million posts in just three months following reports of illegal and offensive content.

The trio were called before the Science and Technology Committee to show how they were tackling online pornography, fake news, hate speech and terrorist content.

UK Public Policy Manager for Facebook, Karim Palant, admitted the firm removed 20 million images of adult nudity in the first three months of the year.

He said three million posts were deleted under hate speech rules between January and March – 500,000 more than the previous quarter.

Facebook said it was increasing the number of moderators and staff working on safety, and promised to have 20,000 employees monitoring content globally by the end of the year.


Social media sites have had to up their game when it comes to the safety and security of their users (stock image)

Mr Palant said: ‘We wouldn’t be here if we weren’t concerned about the harms. There is clear illegal content that can be circulated and that is a huge priority: child sexual exploitation and terrorist material are a huge priority.

‘There is then a huge range of other kinds of upsetting and not necessarily clearly illegal content, such as bullying and harassment, hate speech. We have hugely increased the number of people we have employed to review that content.

‘We have algorithms to identify that content proactively. We removed three million in the first quarter of hate speech.’


Twitter’s Vice-President of Public Policy and Communications, Sinead McSweeney, also told the committee that the site had removed around one million accounts which had posted terrorist content since 2015, including 275,000 in the last six months of 2017.

While acknowledging the responsibility of these sites, Ms McSweeney said parents must also play their part in keeping youngsters away from damaging material.

‘We put our hands up and said we hadn’t done enough in the early days. We have done a huge amount of work to improve our product and policies to ensure it is a safer place for people.

WHAT HAVE OTHERS SAID ABOUT FACEBOOK’S NEGATIVE IMPACTS ON ITS USERS?

Ex-Google and Facebook workers are campaigning to raise awareness of the negative effects of using products made by their former employers.

Among their concerns are addiction to technology and its impact on individuals, particularly children and younger users, as well as society as a whole.

Tristan Harris, a former in-house ethicist at Google is spearheading the new group, called the Center for Humane Technology.

The newly-launched initiative, which is working with the nonprofit media watchdog group Common Sense Media, is planning to lobby the United States government over tech addiction. 

It is also undertaking an advertising campaign aimed at 55,000 public schools in the US, to raise awareness with parents, students and teachers over its concerns.

These include the mental health effects of overuse of social media, including depression, stress, anxiety, self-image and self-worth, according to the group’s website.

The campaign, called The Truth About Tech, also seeks to address more wide-ranging problems caused by technology, including its power to influence our relationships and even our political beliefs.

Speaking to the New York Times Mr Harris, said: ‘We were on the inside. We know what the companies measure. We know how they talk, and we know how the engineering works.

‘The largest supercomputers in the world are inside of two companies — Google and Facebook — and where are we pointing them?

‘We’re pointing them at people’s brains, at children.’ 

In December 2017, former Facebook executive Chamath Palihapitiya also spoke out against the social network he helped to create, saying it is ‘ripping society apart’.

Mr Palihapitiya, who joined Facebook in 2007 and became its vice president for user growth, said he feels ‘tremendous guilt’ for the influence Facebook has had and its ability to manipulate users.

He also suggested users take a break from using social media altogether.

‘We will play our part by putting age limits and big signs saying this is not appropriate, but if people are allowing their children to sit in bedrooms with devices with nobody knocking on doors occasionally to say what’s going on, then…’

Elsewhere, Claire Lilley, head of Child Safety at Google UK, which owns YouTube, said that 7.7 million videos were removed from the video streaming site between April and June.

This figure was up from five million six months previously.


YouTube removed 7.7 million videos from its streaming site between April and June while it highlighted that it was set to take on more moderators

The Telegraph reported that Miss Lilley highlighted how YouTube uses machine learning algorithms to spot illegal and offensive content so it can be taken down quickly.

‘We receive hundreds of thousands of reports of inappropriate content every day and in quarter two we removed 7.7 million videos from YouTube.

‘We will have 10,000 moderators by the end of the year.

‘Of course sometimes things slip through our safeguards and they are exposed to content that is not for their consumption. We have very strict community guidelines in areas like the availability of sexual content, hateful and abusive content, and violent content. We don’t allow porn at all. Everything is scanned to make sure it is not child sexual abuse.

‘A lot of child sexual abuse imagery is created by young people themselves. Obviously it is incumbent to remove that content, but it is also about user behaviour and education of young people is so important.’

MPs also addressed whether the sites would be quick enough to remove ‘fake news’ videos in the event of an upcoming election.

Vicky Fox MP said: ‘If, come the next election, we have deep fake videos impersonating each of us popping up on YouTube, will you be able to get them down before they go up?’

Miss Lilley said that there are particular areas on YouTube where people need credible, authoritative information, such as news, current affairs and politics.

‘We have a breaking news shelf where we prioritise content from an authoritative source and make sure people are getting recommended very good quality content from the BBC, The New York Times, The Daily Telegraph, whoever it might be to make sure people are not getting spammed content.’
