YouTube announced they will stop recommending some conspiracy theories such as flat earth.
I worked on the AI that promoted them by the *billions*.
Here is why it’s a historic victory. Thread. 1/
Brian is my best friend’s in-law. After his dad died in a motorcycle accident, he became depressed. He fell down the rabbit hole of YouTube conspiracy theories, with flat earth, aliens & co. Now he does not trust anyone. He stopped working, seeing friends, and wanting kids. 2/
Brian spends most of his time watching YouTube, supported by his wife.
For his parents, family and friends, his story is heartbreaking.
But from the point of view of YouTube’s AI, he’s a jackpot.
We designed YT’s AI to increase the time people spend online, because it leads to more ads. The AI considers Brian as a model that *should be reproduced*. It takes note of every single video he watches & uses that signal to recommend it to more people 4/
How many people like Brian are lured down such rabbit holes every day?
By design, the AI will try to get as many as possible.
Brian's hyper-engagement slowly biases YouTube:
1/ People who spend their lives on YT affect recommendations more
2/ So the content they watch gets more views
3/ Then youtubers notice and create more of it
4/ And people spend even more time on that content. And back at 1
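The four steps above can be sketched as a toy simulation. All numbers here are illustrative assumptions (not YouTube's real parameters): a fixed pool of recommended views, and a hypothetical group of hyper-engaged users who watch "rabbit hole" content far longer per view.

```python
# Toy model of the vicious circle: watch time drives recommendations,
# recommendations drive views, and views feed back into watch time.
def simulate(rounds=5):
    # Two kinds of content competing for the same recommendation slots.
    views = {"rabbit_hole": 100, "mainstream": 100}
    # Assumption: hyper-engaged users generate 5x the watch time per view.
    watch_time_per_view = {"rabbit_hole": 5.0, "mainstream": 1.0}

    history = []
    for _ in range(rounds):
        # Steps 1-2: watch time determines each side's recommendation weight.
        weight = {k: views[k] * watch_time_per_view[k] for k in views}
        total = sum(weight.values())
        # Steps 3-4: a fixed budget of recommended views is split by weight,
        # and the winners attract even more watch time next round.
        budget = 200
        views = {k: round(budget * weight[k] / total) for k in views}
        history.append(dict(views))
    return history

result = simulate()
print(result)
```

Even starting from a 50/50 split, the loop converges: within a few rounds the hyper-engagement side captures essentially all recommended views, which is the bias the thread describes.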
This vicious circle was also observed with http://tay.ai, and it explains why the bot became racist in less than 24 hours.
=> Platforms that use AIs often get biased by tiny groups of hyper-active users.
Example of the YT vicious circle: two years ago I found that the AI promoted many conspiracies far more than the truth. For instance, flat earth videos were promoted ~10x more than round earth ones 🌎🤯
I was not the only one to notice AI harms. @tristanharris talked about addiction. @zeynep talked about radicalization. @noUpside, political abuse and conspiracies. @jamesbridle, disgusting kids videos. @google's @fchollet, the danger of AI propaganda:
Since then, many newspapers have covered AI harms, for instance: @wsj @guardian @nytimes @BuzzFeed @washingtonpost @bloomberg @huffpost @dailybeast @vox @NBCNews @VICE @cjr @techreview
There are two ways to fix a vicious circle like the "flat earth" one:
1) make people spend more time on round earth videos
2) change the AI
YouTube’s economic incentive is for solution 1).
After 13 years, YouTube made the historic choice to go towards 2)
Will this fix work? 11/
The AI change will have a huge impact because the affected channels have billions of views, overwhelmingly coming from recommendations. For instance, the channel secureteam10 got *half a billion* views from deceptive claims promoted by the AI, such as:
Note that #secureteam10 was the most liked channel of Buckey Wolfe, who came to believe his brother was a “lizard” and killed him with a sword.
To understand how he fell down the rabbit hole, see his 1312 public likes here:
This AI change will save thousands from falling into such rabbit holes
(If it removes between 1B and 10B views of such content, and if we assume one person falling for it for every 100,000 views, it will prevent 10,000 to 100,000 "falls") 14/
A concern remains that other rabbit holes are arising. I created http://algotransparency.org to identify and monitor harmful content recommended by the AI.
Conclusion: YouTube's announcement is a great victory which will save thousands. It's only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable.
If you see something, say something.
You can follow @gchaslot.