Facebook launches new rules aimed at stopping extremist content

Facebook announced new rules for its live-streaming platform that it says would have stopped a deranged gunman from using the service to broadcast a mass shooting at two New Zealand mosques in March.

The Menlo Park, Calif.-based social network said Wednesday that it is instituting a new “one strike” policy aimed at combating extremist content on Facebook Live by barring rule-breakers for extended periods of time.

The rule change comes two months after a gunman shot dead 50 people and wounded dozens more at two mosques in Christchurch, New Zealand, broadcasting the massacre on Facebook Live. Facebook was criticized for taking too long to remove the video.

The social media company says the shooter would not have been able to stream his attack on its platform had its new one-strike policy been in effect at the time.

But it stopped short of explaining why.

Facebook also largely failed to elaborate on what kinds of conduct might constitute a violation of its policies, with the exception of one example: sharing a link to a terrorist group’s statement “with no context.”

“For instance, someone who shares a link to a statement from a terrorist group with no context will now be immediately blocked from using Live for a set period of time,” Guy Rosen, Facebook’s vice president of integrity, said in a blog post.

What is known is that anyone who violates Facebook’s “most serious policies” will be banned from using Live “for set periods of time — for example 30 days — starting on their first offense,” according to Rosen’s post.

“Following the horrific terrorist attacks in New Zealand, we’ve been reviewing what more we can do to limit our services from being used to cause harm or spread hate,” Rosen said.

“We plan on extending these restrictions to other areas over the coming weeks, beginning with preventing those same people from creating ads on Facebook,” Rosen said.

Facebook will also invest $7.5 million in research “designed to improve image and video analysis technology” to more easily scrub offensive content from its platform, he said.

In the first 24 hours after the New Zealand mosque shootings, Facebook said it yanked at least 1.5 million clips of the carnage from its website. But the video had already spread to other corners of the internet, forcing sites like YouTube and Reddit to scramble to keep footage of the bloodbath off their sites.