A viral deepfake video of Saturday Night Live's Bill Hader flawlessly morphing into Tom Cruise has brought concerns around the weaponisation of deepfakes back to the fore.
The video shows Hader in conversation with David Letterman, and every time he does an impression of Cruise – or later Seth Rogen – his face subtly changes into the actor's face.
This is not the first deepfake video to go viral. Last year, for example, several deepfake porn videos emerged online, appearing to show celebrities such as Emma Watson, Gal Gadot and Taylor Swift in explicit situations.
Deepfakes have also been used to depict high-profile figures such as Donald Trump, Barack Obama and Facebook's Mark Zuckerberg making inflammatory statements.
However, the Bill Hader deepfake is the first one to show a face "shapeshifting" within a continuous clip, highlighting how sophisticated this technology has become.
Commenters on the video have expressed disquiet at the potential uses of this kind of AI tech.
"I'm equal parts impressed and equal parts terrified of this technology," wrote one commenter.
"I don't think anything has ever caught me more off guard… I shuddered," wrote another.
"Ctrl Shift ABORT!! This is beyond terrifying. Amazing. But terrifying," wrote another.
"Ok, so video evidence in a court of law just lost all credibility," wrote another.
"The world will not use this technology responsibly," wrote another.
The video's Slovakian creator has created and shared around 20 deepfake videos on his YouTube channel, including one of Jim Carrey morphing into Jack Nicholson's character in The Shining.
He claims his principal aim is to entertain, but that the videos are also intended to raise awareness in the age of fake news.
"I always mention that it's a deepfake in the title and description – I don’t want to mislead anyone," he told The Guardian .
However, some experts claim that the technology could easily be weaponised to spread fake news and propaganda, or discredit political rivals.
"Imagine when this is all properly weaponised on top of already fractured and extreme online ecosystems and people stop believing their eyes and ears," tweeted tech commentator Gavin Sheridan.
Meanwhile, speech synthesis scientist Dr Matthew Aylett claims that work is already underway to develop deepfake audio, in which a voice is replicated or recreated by a computer to say anything the creator wants.
"What you see is less important than what you hear," said Dr Aylett, Chief Scientific Officer at Text-to-Speech specialist CereProc.
"A deepfake video, where misinformation is the aim of criminals or other bad actors, isn't harmful unless there is a message and it is believable. This isn't visual in most cases, it's auditory.
"Through weaponising this technology, you can manufacture conversations or statements that never happened.
"The possible criminal or fraudulent applications for are incredibly wide – anyone possessing the skills can spread fake news, influence staff by passing on fake commands from senior leadership, create false evidence to change the outcome of court cases, even blackmail innocent people."