Apple contractors listen to your Siri recordings

According to a recent report from The Guardian, Apple pays contractors to listen to recorded Siri conversations. A former contractor revealed that workers have heard accidental recordings of customers' private lives, including doctor's appointments, addresses, and even possible drug deals.

According to that contractor, Siri interactions are sent to workers, who listen to the recordings and are asked to grade them on a number of factors, such as whether the request was deliberate or a false positive that accidentally triggered Siri, and whether Siri's response was helpful.

But Apple doesn't explicitly disclose that it has humans listening to these recordings, and whatever admission it makes to that effect is likely buried deep within a privacy policy that few (if any) Siri users have ever read. Apple's privacy page notes that "To help them recognize your pronunciation and provide better responses, certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols," but there is no mention of human employees listening to and analyzing that data.

The company acknowledged in a statement to The Guardian that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." Apple also noted that this program analyzes less than 1 percent of daily Siri activations.

The fact that humans listen to voice assistant recordings isn't a revelation in itself: both Amazon (for Alexa) and Google (for Assistant) have been found to run similar programs in which real human workers listen to recorded conversations to improve those systems. It makes sense: smart assistants obviously can't tell the difference between false positives and genuine queries on their own (if they could, there wouldn't be false positives), and anyone who uses a smart assistant can tell you that, at this point in their evolution, false positives are still very, very common.

But for all three of these companies, the extent of that listening wasn't apparent to customers until recently.

Apple's system may also be a bigger concern for a few reasons, chief among them the pervasiveness of Apple products. Where Alexa is mostly limited to smart speakers and Google Assistant to speakers and phones, Siri also lives on Apple's hugely popular Apple Watch, which sits on millions of people's wrists every waking moment. Moreover, Siri on an Apple Watch activates whenever a user raises their wrist, not just when it thinks it has heard the wake phrase "Hey, Siri."

According to The Guardian's report, that proliferation has led to some very personal conversations making their way to complete strangers working for Apple: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

Additionally, as The Guardian notes, while Amazon and Google let customers opt out of some uses of their recordings, Apple offers no comparable privacy option short of disabling Siri entirely. That's a particularly bad look, given that Apple has built so much of its reputation on positioning itself as the privacy company that protects your data in ways Google and Amazon don't. Implicitly telling customers that the only way to be sure a random stranger won't listen in on their accidentally triggered Siri recordings is to stop using Siri altogether is a decidedly mixed message from a company that is supposed to put privacy at a premium.

Short of giving up smart assistants altogether, there probably isn't much Siri users can do to avoid the issue, other than being careful about what they say around their iPhones and HomePods (unless public pressure pushes Apple to add an opt-out option). Still, it's a good reminder that you often give up far more privacy than you realize when you agree to use these products.
