Terry Cox
By Terry Cox on July 27, 2019, in Information Technology / Privacy Awareness
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed.
Recordings from Apple’s Siri voice assistant are fed to human contractors around the world, who grade the AI on the quality of its response and on whether its activation was deliberate, according to an anonymous contractor who spoke to the Guardian. The contractor claimed that accidental activations are far more frequent than Apple admits, especially among Apple Watch users, and wants the company to own up to the problem.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower revealed.