Apple halts Siri ‘grading’ program, promises opt-out in upcoming software update


In the wake of backlash over a Guardian report revealing that contractors were tasked with analyzing Siri recordings for accuracy and quality, Apple has announced it is temporarily suspending the program while it decides how to proceed.

In a statement to TechCrunch, an Apple spokesperson said the company is “committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally.”

Apple added that users will have the ability to choose whether they want to participate in the program as part of an upcoming software update.

Apple’s Siri grading process was exposed last month when one of the contractors contacted the Guardian claiming that they “regularly hear confidential medical information, drug deals, and recordings of couples having sex” as part of their job. Apple explained to the Guardian that the data collected “is used to help Siri and dictation … understand you better and recognise what you say.”

Apple also said the recordings are anonymized and represent less than 1 percent of daily Siri activations. It also said recordings were “not associated with the user’s Apple ID,” though the contractor said they “are accompanied by user data showing location, contact details, and app data.”

According to the “whistleblower,” recordings routinely contain snippets of conversations captured by accidental triggers of the “Hey Siri” wake word. It’s unclear whether these recordings are supposed to be deleted before they reach a grader’s ears. It’s also unknown how long Apple has been running the grading program.

But while the practice might be necessary, the apparent secrecy around it is alarming. Nowhere in Apple’s privacy policy or Siri setup is it mentioned that recordings may be used for quality control, nor is there a toggle that lets you opt out of data collection. Based on its statement, Apple will presumably rectify both of these issues before it reinstates the program.

That’s the right response. Customers should be aware that their Siri recordings may be listened to, and part of Apple’s privacy push should be the ability to keep your data to yourself. We’d also like to see an easier way to see and delete your Siri history, as well as a better way to filter out accidental recordings, but for now, a toggle is a good start.
