
Who ISN’T listening? Apple contractors often hear private conversations about sex, health and money picked up by Siri, after similar accusations against Amazon and Google

  • Apple joins the list of big tech companies listening in on their customers
  • Voice snippets dictated to Siri are analyzed by contractors, says a new report
  • Apple says the recordings are assessed to improve Siri’s accuracy
  • Private audio scraped by the program includes medical conversations and sex
  • Google and Amazon already operate similar programs

It’s official: if you’re using a voice-assistant — pretty much any voice-assistant — someone could be listening in.

The Guardian reports that Apple has joined an ever-growing list of tech companies that listen in on commands uttered to their virtual voice-assistants.

Snippets of audio, reports The Guardian, are sent to contractors responsible for listening to and grading them for accuracy, including whether the command was accidental and whether its assistant, Siri, was able to complete the task.

As is the case with similar programs from Google and Amazon, however, a whistleblower says the program has inadvertently swept up audio data that most people would consider confidential.

Those include, according to an unnamed source cited in the report, conversations between patients and doctors, recordings of sex, criminal activity, and private business discussions.

Apple has been listening in on users’ Siri voice commands, according to a recent report by The Guardian. Apple’s HomePod is pictured

In a statement to The Guardian, Apple admitted to its previously unreported practice.

‘A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID,’ said the company.

‘Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.’ 

A source tells The Guardian that the devices that most frequently scrape up audio not intended for Siri are the Apple Watch and the HomePod, Apple’s smart speaker and home assistant.

According to the source, ‘The regularity of accidental triggers on the Watch is incredibly high.’ 

Depending on user settings, the Apple Watch activates Siri when the watch is raised, meaning subtle motions can inadvertently wake the device’s microphone.

Apple has never before disclosed its practice of using human contractors to pore over Siri audio.

The biggest offender of inadvertently scraping audio, according to a source, is the Apple Watch.

While the contractors were reportedly encouraged to report accidental triggers, the process was a purely technical one, and they were given no procedures for handling sensitive information harvested by the company.

Apple says that all of the information is anonymized, but sources quoted by The Guardian say the intrinsically personal nature of some of the recordings puts that anonymity at risk.

‘There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,’ said the source.

‘It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.’

Whistleblowers say that Apple should be more upfront about its practices and should remove jocular pre-loaded responses from Siri about whether the assistant is listening in on conversations.

Currently, Siri’s response to ‘are you spying on me’ or ‘are you always listening to me’ is ‘I only listen when you’re talking to me.’ 

Apple has joined the list of major tech companies that use voice commands from their users to improve virtual assistants.

Recently, a whistleblower reported that Google had inadvertently collected private audio through its own program, including porn searches and private conversations.

Last year, Amazon was discovered to have collected audio from users of its Echo smart speaker with similar results. 

In the case of Apple, however, the news may come as more of a shock to customers, given that the brand has often touted its commitment to user privacy, offering proprietary encryption on many of its products that the company says cannot be read even by its own employees.

WHY ARE PEOPLE CONCERNED OVER PRIVACY WITH AMAZON’S ALEXA DEVICES?

Amazon devices have previously been activated when they’re not wanted – meaning the devices could be listening.

Millions are reluctant to invite the devices and their powerful microphones into their homes out of concern that their conversations are being heard.

Amazon devices rely on microphones listening out for a key word, which can be triggered by accident and without their owner’s realisation. 

The camera on the £119.99 ($129) Echo Spot, which doubles up as a ‘smart alarm’, will also probably be facing directly at the user’s bed. 

The device has such sophisticated microphones it can hear people talking from across the room – even if music is playing. 

Last month a hack by British security researcher Mark Barnes saw 2015 and 2016 versions of the Echo turned into a live microphone.

Fraudsters could then use this live audio feed to collect sensitive information from the device.   

 


