Apple: Oops, Sorry For Listening To Your Conversations

Apple issued an apology Wednesday after news broke that hired contractors had been listening in on conversations to better train Siri, Apple’s built-in voice assistant.

“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies.”

The apology came after The Guardian reported allegations that contractors had listened to recordings of people having sex and engaging in criminal activity.

According to Apple, the contractors were hired to grade Siri’s responses to voice-based requests from iPhone users based on a variety of factors. Apple said that only a fraction of conversations were analyzed and that, among those, no personal information was recorded or disclosed.

The problem is that Apple never disclosed to the public that humans were listening to a portion of the conversations captured by the Siri voice assistant.

“A whistleblower working for the firm, who asked to remain anonymous due to fears over their job, expressed concerns about this lack of disclosure, particularly given the frequency with which accidental activations pick up extremely sensitive personal information,” wrote The Guardian.

Apple’s changes

From Apple’s official response:

As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
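Taken together, the new policy amounts to a simple retention gate: transcripts are kept by default, audio is kept only for users who explicitly opt in, and inadvertent triggers are discarded. Here is a minimal sketch of that logic in Swift, assuming entirely hypothetical types and names (SiriInteraction, GradingPolicy, and all fields are illustrations, not Apple’s actual Siri internals):

    import Foundation

    // Hypothetical types for illustration only -- a sketch of the
    // policy as Apple describes it, not Apple's real implementation.
    struct SiriInteraction {
        let audio: Data
        let transcript: String          // computer-generated transcript
        let wasInadvertentTrigger: Bool // e.g. a false "Hey Siri" activation
    }

    struct GradingPolicy {
        /// Off by default; audio is retained only after an explicit opt-in.
        var userOptedIn = false

        /// Returns what may be kept for grading under the stated policy.
        func retainable(from interaction: SiriInteraction) -> (audio: Data?, transcript: String?) {
            // Inadvertent triggers are deleted outright (assumption here:
            // the transcript is discarded along with the audio).
            guard !interaction.wasInadvertentTrigger else { return (nil, nil) }
            // Transcripts are kept by default to help Siri improve;
            // audio is kept only for users who opted in.
            return (userOptedIn ? interaction.audio : nil, interaction.transcript)
        }
    }

Under this reading, an opted-out user contributes only computer-generated transcripts, and a participant can flip the opt-in off at any time to stop audio retention going forward.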
