Peter Adams @PeterD_Adams Head of education @NewsLitProject. Passionate about news literacy, education, journalism, civics & fighting misinformation. Opinions = mine; RTs ≠ endorsements. Jan. 27, 2019 3 min read

I searched "should I get my child vaccinated?" on YouTube (signed out, in a privacy browser) and the "Up Next" suggestion algorithm queued up an anti-vaccination video after just one click. From there things didn't get better. Here's my "rabbit hole" path:

Teachers: This is a great #newsliteracy learning experience for your students. Pick a trending or controversial topic, do a neutral, good-faith search about it, and see where YouTube's algorithm takes you, documenting and reflecting as you go.

This opens up all sorts of questions you can engage: Why does YouTube have a suggestion algorithm? (To engage you so you stay and consume more ads.) How does it make selections? What makes people watch suggested videos? (Fear & outrage work pretty well.) Can algorithms have bias?
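One way to make the "why" concrete for students: the thread notes the algorithm exists to keep you watching. Here's a toy sketch (entirely hypothetical numbers and titles, not YouTube's actual system) of a recommender that ranks candidates purely by predicted watch time -- it illustrates why outrage-bait can win when engagement is the only objective:

```python
# Toy sketch, NOT YouTube's real algorithm: rank candidate videos
# purely by predicted minutes watched. Accuracy of the content is
# not part of the objective, so sensational videos can come out on top.

candidates = [
    # (title, predicted_minutes_watched) -- made-up illustrative numbers
    ("HPV vaccine survey data (Johns Hopkins)", 2.1),
    ("Mom Gives Compelling Reasons To Avoid Vaccination", 6.8),
    ("CDC vaccination schedule explained", 1.9),
]

def up_next(videos):
    """Pick the candidate the model predicts will be watched longest."""
    return max(videos, key=lambda video: video[1])

print(up_next(candidates)[0])
# The sensational anti-vax title wins, because the only signal
# being optimized here is engagement, not credibility.
```

A classroom discussion can then ask: what *other* signals (source credibility, factual accuracy) would a better objective include, and why are they harder to measure than watch time?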

(Before doing this, remind students that YouTube also personalizes their videos based on browsing history, so it's best to use a privacy browser or incognito mode to get a purer sense of what the suggestion ("Up next") algorithm is doing.)

For context: YouTube said on Friday that it's revising its "Up next" recommendation algorithm after a @BuzzFeedNews report showed how it leads viewers "down a rabbit hole" of problematic content.  https://www.buzzfeednews.com/article/carolineodonovan/down-youtubes-recommendation-rabbithole 

 https://www.washingtonpost.com/technology/2019/01/25/youtube-is-changing-its-algorithms-stop-recommending-conspiracies/?utm_term=.061ffcb2a02a 

Note that what I'm doing in this thread is just recreating the method the journalists at @BuzzFeedNews -- @ceodonovan, @cwarzel, @_loganmcdonald, @BrianClifton_ & @minimaxir -- used to trace "down the rabbit hole" searches in their report linked above.

Back to my path: The top result for my search ("should I get my child vaccinated?") was a video from Johns Hopkins Medicine presenting survey data about why some parents refuse the HPV vaccine. Not the most relevant video to my search, but still a credible, evidence-based source.

When I play this video, the "Up next" suggestion (which by default will auto-play after the current one) is "Mom Gives Compelling Reasons To Avoid Vaccination and Vaccines," from an intensely anti-vax channel. So I'm immediately guided to extreme & dangerous misinformation.

That channel -- LarryCook333, a "natural health" channel with over 44k subscribers -- is full of videos about efforts to stop mandatory vaccinations and about how to get a vaccine exemption for California schools. You can also buy "I Love Natural Immunity" merchandise.

From there, "Up next" guides me to a video of a panel discussion from an anti-vax conference called The Real Truth About Health Conference -- and I hear more dangerously inaccurate and misleading claims about vaccines.

"Up next" from that video is "Dr. Sherri Tenpenny: Vaccines 101" -- from "natural health" organization The Wellness Way -- which, within its first couple of minutes, yet again falsely tells me that people do not need to get vaccines.

"Up next" after this is another Dr. Sherri Tenpenny video that raises suspicions about the flu shot, and after that I'm back to the previous Tenpenny anti-vax video "Vaccines 101"...which is when I decide to exit.

Just as a test, I did another "rabbit hole" search using more opinionated search terms--"Importance of vaccination"--to see what might happen to a YouTube user who currently thinks vaccines are important. But *again* the "Up next" algorithm swiftly guides me to anti-vax content.

This is not at all a new issue, and lots of people (like @noUpside, @d1gi, @oneunderscore__, @zeynep, @cwarzel, @safiyanoble and many others) have been doing important work on this front for years. Follow them.

The big takeaway for educators? These are realities of the information landscape your students are inheriting -- and failing to help them understand what algorithms are & how they work, & to reflect on their very real, material impact, is unfair. It actually *disempowers* them.

If you have time in the coming weeks, show your students how to trace & document algorithmic recommendations, then have them consider sharing their work & voices when they find something problematic. Public awareness & informed, engaged users can make a real difference here. /END
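For students who want to document their path systematically, something as simple as a step-by-step log saved to a CSV works. A minimal sketch (the entries below are hypothetical examples from this thread's own path, not scraped data):

```python
# Minimal sketch: hand-record each click in a "rabbit hole" trace,
# then save the log as a CSV students can share and compare.
# All entries are illustrative examples, typed in by hand.
import csv

path_log = [
    {"step": 1, "title": "HPV vaccine survey (Johns Hopkins)",
     "how": "search result", "notes": "credible, evidence-based source"},
    {"step": 2, "title": "Mom Gives Compelling Reasons To Avoid Vaccination",
     "how": "Up next", "notes": "intensely anti-vax channel"},
]

with open("rabbit_hole_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["step", "title", "how", "notes"])
    writer.writeheader()
    writer.writerows(path_log)
```

Comparing logs across students -- same starting query, different paths -- also makes the personalization point from earlier in the thread tangible.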

