Time and again, we’ve seen that even though voice assistants are fun to use and can get some work done, their reach into our private lives is questionable when it comes to privacy. Recently, Microsoft’s Cortana blunder came to light, and Google’s assistant turned out to be nosy as well. Now, if a report is to be believed, Apple’s Siri also peeks into your private conversations.
Apple has always portrayed itself as a pro-privacy firm, but according to the Irish Examiner, Siri has been careless with your private recordings. The Guardian had also reported that Apple contractors listen to very personal conversations, including ‘confidential medical information, drug deals, and recordings of couples having sex.’ This is truly alarming.
To make voice assistants more user-friendly, such firms often use your existing queries to make the AI smarter. Siri sends snippets of your voice requests back to Apple, where they are studied so the firm can improve the assistant and sway more customers to use it.
A contractor, however, had something different to say. He said his job involved noting whether Siri had actually helped or had been triggered accidentally, and added that the recordings they occasionally listened to contained personal data or snippets of conversations.
Apple responded to the controversy by saying that it is working closely with its partners as it reviews its processes around grading Siri conversations. This practice is not earth-shattering, though: Microsoft contractors also transcribe Cortana recordings to make the assistant better and more accurate. A Microsoft contractor told Vice that they are expected to transcribe and classify three tasks every minute, whereas Apple’s contractors, who reportedly transcribe close to 1,000 voice commands every day, handle about two per minute, assuming an eight-hour working day.
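The back-of-envelope comparison above can be checked with a few lines of arithmetic. The daily and per-minute figures are the ones quoted in the reports; the eight-hour shift is an assumption, not something either company has confirmed.

```python
# Rough comparison of reported voice-clip transcription workloads.
# Quoted figures: ~1,000 clips/day for Apple contractors,
# 3 tasks/minute for Microsoft contractors (per Vice).
# The 8-hour shift is an assumption used to derive a per-minute rate.

APPLE_CLIPS_PER_DAY = 1000
MICROSOFT_TASKS_PER_MINUTE = 3
SHIFT_HOURS = 8  # assumed working day

apple_clips_per_minute = APPLE_CLIPS_PER_DAY / (SHIFT_HOURS * 60)

print(f"Apple contractors:     ~{apple_clips_per_minute:.1f} clips/minute")
print(f"Microsoft contractors:  {MICROSOFT_TASKS_PER_MINUTE} tasks/minute")
```

Under that assumption, 1,000 clips spread over 480 working minutes works out to roughly 2.1 per minute, which is where the “about two per minute” figure comes from.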