Apple suspends its Siri request grading programme

Luca Fontana
2.8.2019
Translation: machine translated

Apple is suspending the analysis of conversations recorded by Siri. But why? According to an insider, Apple's subcontractors have listened to recordings containing sensitive material, relating to sex or drug deals, for example. Let's take a closer look!

For the time being, Apple will no longer allow Siri recordings to be analysed by third parties, according to statements made by an Apple spokesperson to The Verge, a US portal covering technology news and media. In addition, Siri users will in future be asked for explicit consent before their recordings are evaluated by contractors.

We strive to improve the functionality of Siri while protecting the privacy of our users. We have decided to globally suspend our Siri request grading program. Siri users will be able to consent to participate in the analysis as part of a future software update.
Apple, in its statement to The Verge

Apple is responding to an article recently published in the Guardian, a British daily newspaper. According to an insider, external Apple contractors obtained personal, sometimes sensitive, information about Siri users from Siri recordings. Similar reports in recent months about other providers of voice assistants yesterday attracted the attention of the Hamburg Commissioner for Data Protection and Freedom of Information.

Alexa, Google Assistant and Siri: what's happened so far

Google Home, a device for the Google Assistant voice app
Source: Shutterstock.com

Voice assistants are designed to make everyday life easier, particularly in the field of home automation. Depending on the settings, they listen continuously or only after activation by a trigger word, in Apple's case the voice command "Hey Siri".

For several months now, reports raising concerns about the privacy of voice assistant users have been piling up. And with good reason: some voice recordings made by Siri and co. are later transcribed by employees, including, as providers point out, conversations captured by accidental activation. The knowledge gained is supposed to improve the assistants' understanding of languages, dialects and accents, so that requests are interpreted correctly.

24 April 2019: Amazon Alexa links voice recordings to personal data

In April 2019, Bloomberg revealed that Amazon employees had access to Alexa recordings that were directly linked to users' addresses and phone numbers. Following subsequent allegations of abuse, Amazon reportedly restricted employee access rights severely, according to Bloomberg.

23 May 2019: Amazon Alexa stores transcripts

A month later, Amazon confirmed to US Senator Chris Coons that voice recordings made by Alexa and the corresponding transcripts are only deleted at the request of their users. On the reason for his inquiry, Chris Coons stated:

Americans deserve to understand how their personal information is being used by technology companies. I will continue to work with consumers and businesses to find the best way to protect Americans' personal information.
Chris Coons, US Senator

Amazon's response letter indicates that the data has no expiry date, even when it is several years old. In theory, user preferences could be filtered from this content for advertising purposes. Amazon also reserves the right, by its own account, to store rather than delete information about actions Alexa has carried out, such as shopping or playing music.

10 July 2019: Google Assistant leaks

According to information leaked in Belgium in early July, Google employees analyse roughly every 500th voice command. An insider passed more than 1,000 audio fragments to the Belgian broadcaster VRT, including many recordings that were triggered unintentionally and phone calls with intimate content.

Many of these fragments, captured via Google Home devices or the Google Assistant on Android smartphones, were recorded accidentally: alongside requests for weather forecasts or shop opening times, they included pornographic content, for example. In one case, the voices of the grandchildren of a Flemish couple were recorded in the background. As their address was mentioned in the recording, the broadcaster was able to contact the grandson and verify the authenticity of the recordings.

The insider who passed the recordings to the broadcaster is believed to be a language expert working as an external contractor for Google.

26 July 2019: The Guardian on an insider at Apple

Although Apple does not explicitly disclose it in its privacy documentation, a small proportion of Siri recordings is passed on to contractors working for Apple around the world. Their task is to transcribe and analyse the recorded Siri voice commands. So an insider explained to the Guardian, writing that it was sometimes very unpleasant for external employees to listen to and analyse such audio recordings.

There were several instances of private discussions, confidential matters, criminal machinations and people having sex.
From the insider's testimony to the Guardian

Also according to the insider, the recordings, although all iOS devices support Siri, mainly came from the Apple Watch or the HomePod smart speaker. Most recordings with sensitive content were triggered accidentally, for example by the sound of a zip, which can be mistaken for "Hey Siri".

1 August 2019: the Hamburg Commissioner for Data Protection and Freedom of Information speaks out

Following the revelations of recent months, the Hamburg Commissioner for Data Protection and Freedom of Information has spoken out. His press release deals mainly with Google's Assistant.

The use of automated voice assistants from providers such as Google, Apple and Amazon is proving highly risky for the privacy of those affected. This applies not only to people who use a voice assistant themselves, but to everyone who comes into contact with one, for example anyone living in a household where voice assistants are used.

[...]

Against this background, the Hamburg Commissioner for Data Protection and Freedom of Information has initiated administrative proceedings to prohibit Google from having such analyses carried out by employees or third parties for a period of three months.

Google has in fact confirmed to The Verge that it is in active contact with the data protection authority. The aim is to help public authorities and users understand how voice recordings are analysed and how this analysis improves the assistant.

Google also confirms that recordings and their analysis by third parties will be suspended during the three-month investigation phase, at least in the EU.

2 August 2019: Apple gives in

In the Hamburg Authority's statement, other voice assistant providers - including Apple and Amazon - are also "invited to promptly review their policies".

Apple is the first provider of a digital voice assistant to announce that it will explicitly ask users for permission before its employees listen to their recordings. Until the software update requesting this consent is available, Apple will stop having Siri recordings evaluated by third parties worldwide.


I'm an outdoorsy guy and enjoy sports that push me to the limit – now that’s what I call comfort zone! But I'm also about curling up in an armchair with books about ugly intrigue and sinister kingkillers. Being an avid cinema-goer, I’ve been known to rave about film scores for hours on end. I’ve always wanted to say: «I am Groot.» 

