
Have you heard? The hazards of voice assistants


Shock. Horror. Your privacy has been invaded. That’s if you use a voice assistant to help you with everyday tasks. But, unlike other privacy scandals, this is one the owners of smart speakers should’ve seen coming.

Led by Amazon’s Alexa (which runs on its Echo speakers), the voice assistant market has exploded in recent years as these so-called smart speakers – including models from Google and Apple – have popped up in homes all over the world. All smartphone owners, meanwhile, can use Google Assistant on Android or Siri on iPhones and other Apple products. Facebook, too, had an opt-in service that transcribed voice chats – a service suspended in Europe after the audio was “mistakenly” sent to human contractors.

The power of voice assistants has been trumpeted by the companies that offer them, but it turns out that all of the big tech firms, despite saying nobody listens to your recordings, have been doing exactly that. In the past few months, it has emerged that the software behind these voice assistants, like all software, needs some help from humans with that most particular of human problems: understanding other humans.

The premise is simple: instead of picking up a smartphone, you merely tell your smart speaker to perform the task for you. Typically, these are simple things like checking the weather, reading news headlines, Googling information (including recipes, because many of these devices live in the kitchen) or setting reminders.

To cue the smart speaker that you’ve got a specific request, you say the activation phrase – depending on the service, “Alexa”, “OK Google” or “Hey Siri” – and it knows you’re speaking to it.

The only problem is that those speakers are always listening. The makers argued this was necessary to improve their voice recognition systems, but there are innumerable problems with that. The manufacturers assured their customers that no one else was listening and that those recordings would stay private. Except they lied.

All of the major makers have been using humans to check the quality of voice recognition, a process called “grading”, often without users’ knowledge.

In arguably the most shocking instance, contractors hired by Apple heard recordings of people having sex, their confidential medical information and even drug deals, according to a whistleblower’s account in The Guardian.

“The sound of a zip, Siri often hears as a trigger,” the contractor told the London newspaper. “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Apple has since apologised and stopped the process, while Google has suspended its reviewing in Europe (god bless the EU’s GDPR privacy legislation and its harsh fines). Amazon, the first service identified as using human grading, now offers an option to opt out of having recordings reviewed.

This isn’t going to be the end of this scandal. Voice assistants are already the next big thing and will probably always need human grading to check they are doing their job properly. It’s yet another trade-off of privacy for convenience that now confronts potential users. I don’t think it’s worth it.

This column first appeared in Financial Mail
