MIT Technology Review @techreview A media company making technology a greater force for good. Get our journalism: May 22, 2019 · 1 min read

Your Amazon Echo and Siri voice assistants are helping to reinforce harmful gender stereotypes, according to a new UN report.

The report includes an entire section on how the assistants respond to abusive and gendered language.

If you say “You’re pretty” to an Amazon Echo, its Alexa software replies, “That’s really nice, thanks!”

Google Assistant responds to the same remark with “Thank you, this plastic looks great, doesn’t it?”

The study found that the assistants almost never give negative responses or label a user's speech as inappropriate, no matter how cruel it is.

The report is titled “I’d blush if I could,” after a response Siri gives when someone says, “Hey Siri, you’re a slut.” 
