Apple Apologizes For Listening To Siri Recordings

Apple has become the latest tech firm to be embroiled in a privacy scare over workers listening to recordings from digital personal assistants.

The iPad maker has issued a formal apology for using human contractors to listen in on conversations between its Siri digital assistant and users, and pledged to change its policies going forward.

Apple is not alone in being at the centre of privacy concerns. Earlier this month Microsoft was revealed to be using human contractors to occasionally listen to real Skype conversations that had been processed by the translation software in its Skype VoIP application.

Siri recordings

In a statement, Apple owned up to its practices surrounding Siri and detailed how it will change things going forward.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process – which we call grading,” said Apple. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

Apple said that when Siri data is stored on its servers, it does not use the data to build a marketing profile and will never sell it to anyone.

“As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologise,” said Apple. “As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users – but only after making the following changes.”

Firstly, Apple said that from now on it will, by default, no longer retain audio recordings of Siri interactions.

Secondly, it will allow users to opt in to help Siri improve by learning from the audio samples of their requests.

And thirdly, Apple said that when customers opt in, only Apple employees will be allowed to listen to audio samples of their Siri interactions.

“Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri,” the firm pledged.

Privacy scares

There have been a series of privacy scares over tech firms listening in on people’s conversations with their digital personal assistants.

It was reported earlier this year that a global team of people at Amazon reviewed audio clips of people speaking to their Alexa-powered smart speakers, in order to help improve Alexa’s functionality.

Jitters about Amazon were raised again in May when the e-commerce giant filed a patent that would allow Alexa to record everything a person says before a command word is actually issued.

Also in May, Amazon was hit with two lawsuits alleging that its Alexa-powered smart speakers were recording children.

Amazon did not help matters last month when it admitted in a letter to a US senator that it keeps Alexa user voice recordings indefinitely.

Google was also dragged into the controversy when it admitted in July that it uses ‘language experts’ around the world to study small ‘snippets’ of user recordings gathered from Google Home smart speakers.

Google, along with Apple, has now suspended the review of voice recordings from users.


Tom Jowitt

Tom Jowitt is a leading British tech freelancer and long standing contributor to Silicon UK. He is also a bit of a Lord of the Rings nut...
