Security

Apple Siri whistleblower pushes EU for more reforms on voice recording tech



A former Apple contractor is pushing EU regulators to investigate the tech giant.


Angela Lang/CNET

Last year, a former Apple contractor made waves by raising concerns about how the tech giant handled Siri voice assistant recordings, ultimately leading the company to stop listening to them without users' permission. Now the whistleblower is back, having sent a letter to European regulators asking them to investigate and potentially punish the company.

“I am extremely concerned that big tech companies are basically wiretapping entire populations despite European citizens being told the E.U. has one of the strongest data protection laws in the world,” former Apple contractor Thomas Le Bonniec wrote in a Wednesday letter to EU regulators. “Passing a law is not good enough: it needs to be enforced upon privacy offenders.”

The letter is the latest example of the fine line tech companies must walk: balancing the use of such recordings to improve the effectiveness and smarts of voice assistants against the need to protect their customers' privacy. Similar criticisms and concerns have been lobbed at Google and Amazon over how they handle their voice assistants.

Apple didn’t immediately respond to a request for comment about the letter.

Le Bonniec worked as a contractor for Apple until he quit in 2019, raising ethical concerns with the UK's Guardian newspaper about the company's practices. Le Bonniec said Apple collected and transcribed some voice recordings captured by Siri in an effort to improve the service's quality. But, he said, the recordings invaded people's privacy without their knowledge, including recordings of medical diagnoses, sexual encounters and intimate moments.

“I listened to hundreds of recordings every day, from various Apple devices (e.g. iPhones, Apple Watches, or iPads),” Le Bonniec said in his letter. “The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and whoever could be recorded by the device. The system recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever.”

Apple last year said the recordings were analyzed in secure facilities and didn’t have any additional information attached to them, such as whose account they came from. “All reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple said at the time. The company also promised to change the way it handled Siri, explicitly asking people for consent to share their recordings with the company’s teams, and giving them an option to opt out.





