“We Haven’t Been Fully Living Up to Our High Ideals,” Apple Says on the Siri-Contractor Issue
Several major tech companies, including Apple, came under fire after investigations revealed that recordings from voice assistants were being sent to human contractors for quality review. At times, these included highly private and intimate conversations that could be tied to a user despite the anonymization process. In response, Apple has now apologized and outlined the steps it's taking to protect user data.
“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process – which we call grading,” Apple wrote today. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies.”
Apple says that the Siri data stored on its servers isn’t used “to build a marketing profile and we never sell it to anyone.” The company added the following:
Siri uses a random identifier – a long string of letters and numbers associated with a single device – to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number – a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.
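The scheme Apple describes can be sketched in a few lines. This is a hypothetical illustration of the idea, not Apple's actual implementation: requests are keyed by a long random identifier with no link to an Apple ID or phone number, and after roughly six months the identifier is dropped so the stored data can no longer be traced back to the device.

```python
import secrets
from datetime import datetime, timedelta

# Assumption: ~six months of retention before disassociation,
# as described in Apple's statement.
RETENTION = timedelta(days=183)

def new_device_identifier() -> str:
    """A long random string of letters and numbers, tied to no account."""
    return secrets.token_hex(32)  # 64 hex characters

def disassociate_expired(records: list[dict], now: datetime) -> list[dict]:
    """After the retention window, drop the identifier so the data
    can no longer be linked back to the originating device."""
    for record in records:
        if now - record["created"] > RETENTION:
            record["device_id"] = None
    return records
```

The key design point is that the identifier is random rather than derived from account data, so even while it is attached, it links requests only to each other, not to a person.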
The iPhone maker had announced earlier that it was suspending the grading process as a result of the media reports. Apple said that the grading process “involved reviewing a small sample of audio from Siri requests – less than 0.2 percent – and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability.”
“We’ve decided to make some changes to Siri” – Apple
Apple says it has now reviewed its processes and has realized that it hasn’t “been fully living up to” its “high ideals.” The company will resume the grading process later this fall.
Users will receive a software update later this year, bringing some changes (mentioned below). After this update has been delivered to users, Apple will restart its Siri grading process. The changes that it plans to introduce include:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
One notable change is that Apple will use employees, not contractors, for the review process. This is critical because a company has more control over security with its own employees. Once the data leaves the building, Apple can only hope its contractors implement a similar level of protection.
This review will also only happen if users opt in to the process. Otherwise, by default, user audio won’t be reviewed at all. This is MASSIVE because, by default, users won’t have to worry about anything. In comparison, other companies, including Amazon, made human review opt-out rather than opt-in.
Apple users take pride in buying products from a company that focuses on data security more than possibly any other. But stories like this raise the question of whether other aspects of the Apple ecosystem are putting user data at risk.
While Apple did take quick steps to remedy the situation, it should never have taken a news report to push the company to do the right thing. It has to be said that Siri is probably the most privacy-focused voice assistant. But with Apple users expecting more from the company, one hopes the Siri maker will be more transparent about processes where humans have access to user data.