Apple to ensure its staff never listen to you having sex or dealing drugs

Apple has suspended a controversial practice in which human contractors listened to audio recordings of users interacting with its voice assistant Siri.

The decision comes after a newspaper report alleged people tasked with reviewing the recordings regularly heard confidential information and private conversations.

It was claimed these recordings captured Apple fans having sex, dealing drugs or talking about confidential medical information.

In an effort to perform quality checks and improve the voice assistant’s responses, contractors graded Siri’s answers to user queries, The Guardian reported.

They also looked at whether the response was triggered accidentally, without a deliberate query from the user, the newspaper said.

‘While we conduct a thorough review, we are suspending Siri grading globally,’ an Apple spokeswoman said in a statement, adding that in a future software update, users will be able to opt out of the program.

Siri, Apple’s iconic voice assistant, lets users operate their iPhone hands-free, sending messages, making calls and opening applications with voice commands alone.

Consumers have become accustomed to calling out the names of popular voice assistants, such as Amazon’s Alexa and Google Assistant.

An anonymous whistleblower told the Guardian that Siri is often accidentally triggered by things that sound like the ‘Hey Siri’ wake command, such as a zip being undone.

‘There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,’ the source told the paper.

‘These recordings are accompanied by user data showing location, contact details, and app data.’

The disclosure is likely to cause embarrassment for Apple, which prides itself on protecting the privacy of its users.

The technology giant did admit that part of its quality control involved maintaining and improving the virtual assistant through a ‘grading’ scheme, such as learning when the assistant had been set off in error.

However, Apple said that only a random subset – less than 1% of daily Siri activations – were part of this grading scheme, and each of these was just a few seconds long.

Apple said: ‘A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.’

The source told the Guardian that the Apple Watch and HomePod smart speaker were weak links in the chain, saying that ‘you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.’

Fellow tech giants Amazon and Google have both recently confirmed that they also use small samples of user recordings from their own voice assistants to train and develop their speech recognition software.

The two firms each offer a virtual assistant in smart speakers and some smartphones, and both confirmed they use human reviewers to analyse a small sample of recordings from users.
