Always ask what training data the AI learned from. Here,
- 16 men and 8 women (probably young & non-disabled)
- pretending to walk with emotion
- auto-transformed to stick figures
- and then 'true' emotions labeled by remote MTurk workers
- ignoring the actors' intended emotions
And what about training data for AIs used to identify potential aggression? Or to analyze videos of job candidates?
Every time I see stuff like this I think of CIMON annoying an astronaut with its emotion detection. https://qz.com/1482839/the-iss-has-a-robot-on-board-and-hes-being-kind-of-a-dick/
I'm so excited to see this application happen though: