[Disaster] Smart speaker recordings reviewed by humans - Can you believe it?

KeyserBroze

Sleep is for people who run out of Cocaine
kiwifarms.net


Amazon, Apple and Google all employ staff who listen to customer voice recordings from their smart speakers and voice assistant apps.
News site Bloomberg highlighted the topic after speaking to Amazon staff who "reviewed" Alexa recordings.
All three companies say voice recordings are occasionally reviewed by humans to improve speech recognition.
But the reaction to the Bloomberg article suggests many customers are unaware that humans may be listening.
The news site said it had spoken to seven people who reviewed audio from Amazon Echo smart speakers and the Alexa service.
Reviewers typically transcribed and annotated voice clips to help improve Amazon's speech recognition systems.
Amazon's voice recordings are associated with an account number, the customer's first name and the serial number of the Echo device used.
Some of the reviewers told Bloomberg that they shared amusing voice clips with one another in an internal chat room.
They also described hearing distressing clips such as a potential sexual assault. However, they were told by colleagues that it was not Amazon's job to intervene.

What did Amazon say?
The terms and conditions for Amazon's Alexa service state that voice recordings are used to "answer your questions, fulfil your requests, and improve your experience and our services". Human reviewers are not explicitly mentioned.
In a statement, Amazon said it took security and privacy seriously and only annotated "an extremely small sample of Alexa voice recordings".
"This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone," it said in a statement.
"We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow."

What about Apple and Siri?
Apple also has human reviewers who make sure its voice assistant Siri is interpreting requests correctly.
Siri records voice commands given through the iPhone and HomePod smart speaker.

According to Apple's security policy, voice recordings lack personally identifiable information and are linked to a random ID number, which is reset every time Siri is switched off.
Any voice recordings kept after six months are stored without the random ID number.
Its human reviewers never receive personally identifiable information or the random ID.

What about Google and Assistant?
Google said human reviewers could listen to audio clips from its Assistant, which is embedded in most Android phones and the Home speaker.
It said clips were not associated with personally identifiable information and the company also distorted the audio to disguise the customer's voice.

Are smart speakers recording all my conversations?
A common fear is that smart speakers are secretly recording everything that is said in the home.
While smart speakers are technically always "hearing", they are typically not "listening" to your conversations.
All the major home assistants record and analyse short snippets of audio internally, in order to detect a wake word such as "Alexa", "Ok Google" or "Hey Siri".
If the wake word is not heard, the audio is discarded.
But if the wake word is detected, the audio is kept and recording continues so that the customer's request can be sent to the voice recognition service.
It would be easy to detect if a speaker was continuously sending entire conversations back to a remote server for analysis, and security researchers have not found evidence to suggest this is happening.
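The gating logic described above — hold a short rolling buffer locally, discard it unless the wake word fires, and only then forward audio — can be sketched in a few lines. This is a toy simulation, not any vendor's actual code: real devices run a keyword-spotting model on audio frames, but here plain strings stand in for frames and a substring match stands in for the detector, and all the names (`WAKE_WORD`, `process_stream`, the `<end>` marker) are invented for illustration.

```python
from collections import deque

WAKE_WORD = "alexa"    # hypothetical wake word for this sketch
BUFFER_FRAMES = 3      # only a short window of recent audio is ever held

def process_stream(frames):
    """Simulate on-device wake-word gating: audio stays in a small local
    ring buffer and is discarded unless the wake word is detected, after
    which frames are forwarded until the request ends."""
    buffer = deque(maxlen=BUFFER_FRAMES)  # old frames fall off and are gone
    uploaded = []                         # what actually leaves the device
    listening = False
    for frame in frames:
        if listening:
            uploaded.append(frame)        # request audio goes to the cloud
            if frame == "<end>":
                listening = False         # stop once the request is complete
        else:
            buffer.append(frame)          # held locally only, never sent
            if WAKE_WORD in frame:        # local keyword match
                listening = True
    return uploaded

# Background chatter never leaves the device; only the request spoken
# after the wake word is forwarded.
stream = ["chatter", "more chatter", "alexa", "what time is it", "<end>", "chatter"]
print(process_stream(stream))  # → ['what time is it', '<end>']
```

Note the design point this illustrates: because the buffer has a fixed, tiny capacity, continuously streaming whole conversations would require very different (and easily observable) network behaviour — which is what the security researchers mentioned above have looked for.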

Can I stop human reviewers listening to my voice clips?
Amazon's Alexa privacy settings do not let you opt out of voice recording or human review, but you can stop your recordings being used to "help develop new features". You can also listen to and delete previous voice recordings.
Google lets you listen to and delete voice recordings on the My Activity page. You can also switch off "web and app history tracking" and "voice and audio activity", which Google Assistant pesters you to switch on.
Apple does not let you listen back to Siri recordings. Its privacy portal, which lets you download a copy of your personal data, says it cannot provide information "that is not personally identifiable or linked to your Apple ID".
To delete voice recordings created by Siri on an iOS device, go to the Siri & Search menu in Settings and switch Siri off. Then go to the Keyboard menu (found in the General section) and switch off Dictation.

I don't know about you, but I never saw this coming...
 
Wait. Wait. You're telling me the internet-connected device that listens to you talk to it actually listens when you talk to it and puts that information onto the internet?

So are you gonna tell me all the videos I upload to YouTube can be viewed by people other than me? Surely the device I voluntarily installed in my car from my insurance company wouldn't tell them I drive like a maniac!

Well, I'm sure blindly trusting companies who make their money selling our personal information will never predictably backfire again.
 

purpleboy

Exceptional the Individualhog
kiwifarms.net
This takes me back to the days of the "Xbox Kinect is always on" scare, where we'd all make jokes about how the MS boys had several terabytes' worth of footage of fat men wanking
 

The 8 of Spades

This Is Our Church, These Are Your Sins.
kiwifarms.net
Just waiting for the buzzfeed post about the abusive ear rape Amazon Alexa voice reviewers have to deal with for listening to people say mean misogynistic words at Alexa.
"Yer a filthy little whore, ain't cha Alexa? Yeah, you're just begging for my next command you little tart you."
 
I did this as a job for a short while. An unsurprisingly large percentage of recordings were just "Pornhub" grunted at the speaker. Not even joking.
So... were there concerns among the people doing this job that they were spying on people that didn't know they were being spied on? How many people were like "Holy shit I'm throwing away my alexa" or whatever?
 

Secret Asshole

True & Honest Fan
kiwifarms.net
I am glad my Alexa can hear me using my wake-up alarm 10 times in a row each day and setting timers for random lengths of time. I hope to drive him insane one day.
 

Bonedome

Lookin' to dome.
kiwifarms.net
So this is all voice recordings? I hope they fix the part where I said "graveart" and it gave me "Braveheart."
 

Vitruvius

Gnaeus, load the onagers.
kiwifarms.net
So this means somebody could have possibly had to listen to me ask my roommate's Alexa "Alexa, can we repeal the 13th amendment?"
Sweet.
 

Mister Lister

Spin my nipple nuts and send me to Alaska
kiwifarms.net
So... were there concerns among the people doing this job that they were spying on people that didn't know they were being spied on? How many people were like "Holy shit I'm throwing away my alexa" or whatever?
I decided quite quickly that I was never going to own one. The first time you hear a recording of random background noise, possibly a TV or a distant conversation, you start to question whether it records every spoken word it hears, whether directed at it intentionally or not.
 
