Facebook bans ‘deepfake’ videos as 2020 election looms

Facebook has banned “deepfake” videos manipulated by artificial intelligence amid worries that they could throw a wrench into the 2020 presidential election.

Under a new policy announced late Monday, the social media giant will remove authentic-looking videos that have been manipulated with AI to make viewers think the subject said something they didn’t actually say.

“While these videos are still rare on the internet, they present a significant challenge for our industry and society as their use increases,” Monika Bickert, Facebook’s vice president of global policy management, wrote in a blog post outlining the change.

But Facebook said the new rules won’t apply to satire or parody videos, or those that have been tweaked only to “omit or change the order of words.”

Facebook came under fire last year for refusing to remove a doctored video of House Speaker Nancy Pelosi that made her sound drunk by distorting her speech. Instagram has similarly left up a fake video of Facebook CEO Mark Zuckerberg bragging about his “total control” of “stolen data.”

Sketchy videos that don’t meet Facebook’s standards for removal can still be flagged as false or partially false by one of Facebook’s outside fact-checkers. That designation will limit the content’s distribution in the News Feed and prevent it from being run as an advertisement, according to Bickert.

Cybersecurity experts have raised concerns that deepfakes could sway voters in this year’s presidential election as the technology used to make them has advanced quickly.

Twitter also proposed restrictions on deepfake videos in November. The company sought feedback on a plan to label tweets that share “synthetic or manipulated media” and to remove such tweets if they are misleading and could endanger physical safety.

With Post wires
