Apple announced Friday it would stop using contractors to listen in on users through Siri to grade the voice assistant's accuracy.
An Apple whistleblower had told the Guardian that the contractors responsible for "grading" the accuracy of the digital assistant regularly overheard conversations about doctors' appointments, drug deals and even couples having sex. Their job was to determine what triggered Siri into action: whether the user had actually said, "Hey, Siri," or whether it was something else, such as the sound of a zipper.
Apple said it would suspend the global analysis of those voice recordings while it reviewed the grading system. Users will be able to opt out of reviews during a future software update.
"We are committed to delivering a great Siri experience while protecting user privacy," said Cat Franklin, an Apple spokeswoman, in an email to The Washington Post.
Many smart-speaker owners don't realize that Siri, Amazon's Alexa and, until recently, Google's Assistant keep recordings of everything they hear after their so-called "wake word" to help train their artificial intelligence. (Amazon founder Jeff Bezos owns The Washington Post.) Google quietly changed its defaults last year, and Assistant no longer automatically records what it hears after the prompt "Hey, Google."