Nearly a month ago, a report in The Guardian revealed that third-party contractors had been listening to a small percentage of Siri requests as part of a “Siri grading” program. Apple promised to halt the program while it conducted a “thorough review,” which left us wondering how the company would move forward, since human grading is an essential part of training and improving any machine-learning system.
Apple now appears to have finished its review and has issued a statement apologizing for how the program was carried out. The company plans to reinstate it this fall after making some important changes.
The apology begins with a familiar statement: “At Apple, we believe privacy is a fundamental human right.” It then describes how Apple designed Siri to protect your privacy: collecting as little data as possible, using random identifiers instead of personally identifiable information, and never using data to build marketing profiles or selling it to others.
The statement then goes on to emphasize that your data helps make Siri better, that “training” on real data is necessary, and that only 0.2 percent of Siri requests were graded by humans.
Only after all of this does Apple get around to the actual apology, which arguably should have come in the first paragraph.
Still, this is the right move, and it once again puts Apple ahead of other tech giants in protecting your privacy and security. Apple is making the program opt-in rather than opt-out, an important distinction given that the vast majority of users never stray from the default settings. It will also keep these audio samples in-house rather than handing them to third-party contractors.
Hopefully, this spotlight on Siri’s training, evaluation, and grading will have a positive effect not only on user privacy, but also on how quickly Siri improves.