Apple has announced that it will no longer retain audio recordings of your conversations with Siri by default. The tech giant implemented the changes after a report in The Guardian claimed that contractors were regularly listening to conversations between people and Siri. According to the report, these conversations included details such as a person’s name and medical records, as well as recordings of people having sex.
Apple has now ended the contractor programme, a decision that has reportedly cost at least 300 contractors their jobs.
‘We focus on doing as much on device as possible, minimising the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile, and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private,’ Apple said in a statement.
Siri relies on data from your interactions with it, including the audio of your request and a computer-generated transcript of that audio. Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that ‘trains’ Siri to improve.
Apple hired contractors to listen to these recordings in a process called ‘grading’, but it ended the programme after consumers raised privacy concerns.
‘As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that, we apologise. As we previously announced, we halted the Siri grading programme,’ added Apple.
Here are the changes made by Apple. First, by default, Apple will no longer retain audio recordings of Siri interactions. However, the company will continue to use computer-generated transcripts to help Siri improve. Second, users will be able to opt in to help Siri improve by letting it learn from audio samples of their requests. Those who choose to participate will be able to opt out at any time.
Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of their Siri interactions. The team will work to delete any recording determined to be the result of an inadvertent trigger of Siri.