Always ask what training data the AI learned from. Here,
- 16 men and 8 women (probably young & non-disabled)
- pretending to walk with emotion
- auto-transformed to stick figures
- and then 'true' emotions labeled by remote MTurk workers
- ignoring the actors' intended emotions
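The mismatch the thread points at, between what an actor intended to convey and what crowd workers labeled, is easy to quantify. A minimal sketch, using entirely invented toy data (none of the names or labels come from the actual dataset), that measures agreement between intended and crowd-majority labels:

```python
from collections import Counter

# Hypothetical toy data: one intended emotion per clip, plus the labels
# several crowd workers assigned to that clip. Invented for illustration.
intended = ["happy", "sad", "angry", "happy", "sad"]
crowd_votes = [
    ["happy", "happy", "sad"],   # workers mostly agree with the actor
    ["sad", "neutral", "sad"],
    ["sad", "sad", "neutral"],   # workers overrule the actor's intent
    ["happy", "happy", "happy"],
    ["neutral", "sad", "sad"],
]

def majority_label(votes):
    """Return the most common crowd label for one clip."""
    return Counter(votes).most_common(1)[0][0]

crowd = [majority_label(v) for v in crowd_votes]
agreement = sum(a == b for a, b in zip(intended, crowd)) / len(intended)
print(f"crowd/actor agreement: {agreement:.0%}")  # 80% on this toy data
```

If a pipeline treats the crowd majority as ground truth, every clip where this agreement fails is a case where the model is trained on someone else's guess rather than the actor's stated emotion.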
And what about training data for AIs used to identify potential aggression? Or to analyze videos of job candidates?
Every time I see stuff like this I think of CIMON annoying an astronaut with its emotion detection. https://qz.com/1482839/the-iss-has-a-robot-on-board-and-hes-being-kind-of-a-dick/
I'm so excited to see this application happen though:
You can follow @JanelleCShane.