Got these two awful YT recommendations while watching the 50501 protest livestreams.
:sigh: Just turn off watch history and search history, clear both, and either search for what you want or subscribe and use the subscriptions page.
My YT homepage is blank and glorious.
Nothing good will ever come from suggestion algorithms.
There was a TED talk by Zeynep Tufekci in 2017 (“We’re building a dystopia just to make people click on ads”) – (YouTube*: www.youtube.com/watch?v=iFTWM7HV2UI) that briefly talks about this:
(*I’m aware of the irony in linking there)
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.
Well, you might be thinking, this is politics, but it’s not. This isn’t about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It’s like you’re never hardcore enough for YouTube.
So what’s going on? Now, YouTube’s algorithm is proprietary, but here’s what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.
These days it might also be about politics, but the motivation to capture attention to serve ads is still the priority.

I have my search and watch history off and I watch recipes and music videos. The right panel with recommendations is always filled with right-wing videos. Even if opened in an incognito tab.
It’s not just the algorithm… it’s how YouTube has been set up on purpose.
It is an algorithm, but one designed in a way that promotes that kind of shit, because on average it keeps people watching videos (and therefore ads). They use data from signed-in users to determine what to shove in the face of a new user.
I’m guessing it’s because I just don’t watch political content on YouTube at all (for this reason), but I actually don’t get any political recommendations, right or left. It’s very much just project, gaming, and photography channels, and the only time I see suggestions outside that is when I’ve specifically strayed from my usual viewing.
I stick to Lemmy for political stuff simply because it doesn’t affect my other algorithms on any other platforms.

Click the 3 dots on the right, select “do not recommend channel”.
It works better than I expected.