This week, Amazon announced that it is working on technology to let its voice assistant, Alexa, sound like actual humans, dead or alive.
The system would let Alexa mimic any voice after hearing less than a minute of audio, said Rohit Prasad, an Amazon senior vice president, at a conference the company held in Las Vegas on Wednesday.
The goal is to ‘make the memories last’ after ‘so many of us have lost someone we love’ during the pandemic, Prasad said.
While Amazon is positioning this technology as a sentimental tool, Twitter users found it ‘creepy’, to say nothing of the security implications of such a feature.
‘Our voices are often used as a password to authenticate certain accounts, so to mimic a particular voice from just 60 seconds could result in serious security implications,’ said Jake Moore, Global Cybersecurity Advisor at ESET.
Amazon is hardly the first to tinker with voice and AI. Microsoft recently restricted which businesses could use its software to parrot voices, citing deepfake concerns.
‘Deep fake audio attacks are already happening against businesses but they are often created by powerful computers utilising lots of data input. When tech giants add gimmicky features for the masses, it opens up the threat level to many more people,’ explained Moore.
Giving AI assistants the voice of actual people could be a case of technology outpacing security and putting people at risk.
‘If anyone’s voices really can be duplicated this easily and quickly, there could be some potentially serious incidents on the horizon. Companies need to ask themselves why we might need this technology rather than creating it just for creating sake,’ said Moore.
One Twitter user joked: ‘*in Amazon Alexa voice* We’re sure this will be fine.’
Last year, HSBC said that telephone banking fraud had been reduced by 50% since the introduction of a biometric security system that authenticates customers through their voices.
Now, what if Alexa could mimic those voices? It’s easy to imagine the security nightmare that could cause.
If this technology does come to pass, experts advise switching from voice authentication on your bank accounts to another verification method, such as online banking via your smartphone.