
Apple apologises for ‘grading’ programme that let contractors listen in on Siri users

Cupertino says grading will resume in autumn, with the work brought in-house

29 August 2019

Nearly a month ago, a report in The Guardian revealed that third-party contractors had been listening in on a small percentage of Siri requests as part of a ‘Siri grading’ programme. Apple promised to halt the programme while it conducted a “thorough review,” which left many wondering how the company would move forward, since human grading is an essential part of training and improving any machine-learning system.

Apple now appears to have finished its review and has issued a statement apologising for the way the programme has been carried out so far. The company plans to reinstate it this autumn after making sweeping changes.

The apology begins with a familiar statement: “At Apple, we believe privacy is a fundamental human right.” It then describes how Apple designed Siri to protect your privacy – collecting as little data as possible, using random identifiers instead of personally identifiable information, and never using data to build marketing profiles or selling it to others.

The statement then goes on to make sure you understand that using your data helps make Siri better, that “training” on real data is necessary, and that only 0.2% of Siri requests were graded by humans.

After all of this, Apple does get around to the actual apology that should have been in the first paragraph. To quote the statement in full:

As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologise. Apple will resume the grading programme after making the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve. 
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time. 
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is also making the programme opt-in rather than opt-out, an important distinction as the vast majority of users never stray from the default settings. And the audio samples will now stay in-house rather than going into the hands of third-party contractors.

Hopefully, this spotlight on Siri’s training, evaluation, and grading will have a positive effect not only on user privacy, but also on how quickly Siri improves.

IDG News Service
