July 30, 2019
Apple's Siri Listens to You Having Sex for "Quality Control"
If you, like half a billion other people in the world, use an Apple iPhone, chances are that at some point your phone, via the "Siri" assistant, is eavesdropping on you, possibly even recording you having sex, or discussing medical conditions with your doctor, or some other activity—even committing crimes, if you're that sort of person—you would rather not broadcast, according to an exposé by The Guardian newspaper last week.

Not only have you probably been recorded without your knowledge, but someone on the other end listened to that recording, in at least "a small portion" of cases, according to a statement by Apple to the Guardian. Apple transmits the recordings to contractors around the world, who listen in for the purpose of "quality control." A whistleblower among those contractors revealed to the Guardian personal knowledge of "private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on," according to a Forbes.com report on the exposé.

But that's not all. The surreptitious recordings come accompanied by the eavesdropping victim's "location, contact details, and app data," according to the whistleblower—though Apple maintains that user data is strictly kept separate from the Siri audio recordings.

Apple claims that only 1 percent of all Siri recordings are used by the "quality control" contractors, whose purpose is to improve Siri's responsiveness to user questions—as well as to determine how and why Siri was activated, because the assistant frequently "wakes" by accident. But as Forbes points out, with 500 million Siri users likely being recorded multiple times each day, 1 percent would be an extremely large number of secret recordings, potentially millions every day.

"Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," Apple said in its statement to the Guardian.
The contractors are instructed to report accidental Siri activations—Siri on the Apple Watch can be activated simply by a user moving the arm wearing the watch—but the company provides no specific procedures to correct the problem, according to the Guardian.

Apple's privacy disclosures for users do not reveal that human contractors may listen to Siri recordings, but the tech magazine Macworld responded to the Guardian exposé with a simple solution: Apple, Macworld suggests, must include an "opt out" option, which would be activated by a simple toggle switch in an iPhone's settings—just as Apple now offers for location tracking, microphone access and many other functions.

In the meantime, deactivating the "Hey Siri" option remains the only way to prevent surreptitious recordings of intimate moments and personal information. But that option renders Siri unusable altogether, making it a less-than-ideal alternative.

Photo By Oliur Rahman / Wikimedia Commons