An artificial intelligence-powered ‘nudifying’ app, which ‘undresses’ women in photos, has exploded in popularity and drawn widespread criticism.
DeepSukebe, a website which promises to ‘reveal the truth hidden under clothes’, went live last year, and is reportedly receiving millions of visits each month.
But politicians have called the legality of digitally generated nude images into question, with MP Maria Miller calling for a parliamentary debate on whether the images should be banned.
The tool, described by its developers as an ‘AI-leveraged nudifier’, has been used to generate fake nude images of multiple female celebrities, and the site’s developers say they are looking to improve the technology further.
It’s not the first time AI has been used for illicit purposes – another nudifying tool, called DeepNude, was launched in 2019, but quickly withdrawn after the potential for abuse became clear.
However, tools using some of the DeepNude code are still in circulation.
A related technology, ‘deepfake’ imagery, which uses AI to superimpose victims’ faces onto existing nude bodies, has also been widely criticised, with a British government review this year calling for it to be banned.
‘Parliament needs to have the opportunity to debate whether nude and sexually explicit images generated digitally without consent should be outlawed,’ MP Maria Miller told the BBC.
‘I believe if this were to happen the law would change.’
The law currently does not reflect the ‘severity of the impact on people’s lives’, Miller added, and the creators of software that causes harm should be held accountable.
The government is currently drafting the Online Safety Bill, which will seek to introduce stricter controls on social media companies to protect users, and to reduce the volume of harmful content online.
A provision in the bill to criminalise ‘nudifying’ tools could help protect victims, argues Miller.
Campaign groups argue that it is currently up to the victims of online abuse to remove harmful images from the internet, an undue burden on people who are often already traumatised.
Miller has focused on tackling online ‘revenge porn’, where nude images are shared online without the victim’s consent, and says that the latest technology needs to be regulated.
‘At the moment, making, taking or distributing without consent intimate sexual images online or through digital technology falls mostly outside of the law,’ said Miller.
‘It should be a sexual offence to distribute sexual images online without consent, reflecting the severity of the impact on people’s lives.’