Report warns AI can’t offer protection from ‘deepfakes’

Artificial intelligence-based solutions may not be able to save us from deceptively altered videos, known as deepfakes, according to a new report from the research institute Data & Society.

In the report, authors Britt Paris and Joan Donovan place deepfakes on a long continuum of media manipulation and argue that addressing them requires social as well as technical fixes.

“The panic around deepfakes justifies quick technical solutions that don’t address structural inequality,” Paris told The Verge. “It’s a massive project, but we need to find solutions that are social as well as political so people without power aren’t left out of the equation.”

“The relationship between media and truth has never been stable,” the report reads.

The authors cite the actions of media companies during the Gulf War, saying they misrepresented events on the ground by selectively editing images from evening news broadcasts.

“These images were real images,” the report says. “What was manipulative was how they were contextualized, interpreted and broadcast around the clock on cable television.”

Fear about the potential for deepfakes to spread misinformation has increased as the technology itself has advanced. Some worry it could wreak havoc during the 2020 elections.

Most media attention about deepfakes has focused on prominent public figures and lawmakers, such as House Speaker Nancy Pelosi (D-Calif.), but the authors write that private citizens will eventually be harmed by the technology.

Researchers are pursuing technological countermeasures, and Facebook recently released a dataset for testing new deepfake-detection models, but the Data & Society report warns against relying solely on Big Tech.

“More encompassing solutions might be to enact federal measures on corporations to encourage them to more meaningfully address the fallout from their massive gains in the past 15 years,” the authors write in the report’s conclusion.

The entire report is available from Data & Society.
