TikTok pushed far-right AfD party on young voters in Germany
They may be the only major party that talks about what youth might want. Most other parties almost ignore young people.
But then the AfD is basically reviving the “Hitlerjugend” (Hitler Youth), so there’s that.
what youth might want
A pure Aryan skull shape?
Oohhhhh noooooooooo The Youth aren’t getting what they Waaaaaant! Politics is bad!
Ooooh the Huge Manatee!!
Yes, they don’t want just one thing. But that is what the AfD appeals to.
Talking to teens’ emotions, completely contradicting themselves, just to score some popularity points. I think this is called populism, but I’m not sure.
Also, the AfD is extremely radical, so that fits.
I’m not saying they’re successful. They just seem like the only party that actually tries to talk to teens on their level. Still complete bullshit, though.
I hope other parties start taking teens more seriously.
Chinese company ByteDance tries to destabilize Western democracies, episode 1937392
It’s not China actively manipulating things. It’s the algorithm getting gamed by the right wing. It’s not like the Cambridge Analytica scandal, where Facebook was working directly with companies that were trying to get Trump elected. It’s more like the pipeline on YouTube, where algorithms funneling people into what’s popular end up being gamed by conservative people. In the US, TikTok was known for helping the “woke left”, so obviously it’s not the same big conspiracy controlling both.
The thing is, it can also be used for good, in that it can show people things like what’s happening in Palestine without being censored by the US, unlike other US-controlled social media.
I am a fairly radical leftist and a pacifist, and you wouldn’t believe the amount of hoo-rah military, toxic masculinity, explosions and people dying, gun-lover bullshit the YouTube algorithm has attempted to force down my throat in the past year. I block every single channel that they recommend, yet I am still inundated.
Now, with shorts, it’s like they reset their whole algorithm entirely and put it into sensationalist overdrive to compete with TikTok.
I really want to know why their algorithm varies so wildly from person to person; this isn’t the first time I’ve seen people say this about YT.
But in comparison, their algorithm seems to be fairly good at recommending what I’m actually interested in, with none of that other crap people always mention. And when it does recommend something I’m not interested in, it’s usually something benign, like a video on knitting.
None of this out-of-nowhere far-right BS gets pushed to me, and for a lot of it I can tell why it’s being recommended.
For example, my feed is starting to show some lawn care/landscaping videos, and I know it’s likely related to the fact that I was looking up videos on how to restring my weed trimmer.
I think it depends on the things you watch. For example, if you watch a lot of counter-apologetics targeting Christianity, YouTube will eventually try sending you pro-Christian apologetics videos. Similarly, if you watch a lot of anti-Conservative commentary, YouTube will try sending you Conservative crap, because they’re adjacent and share that “Conservative” thread.
Additionally, if you click on those videos and leave a negative comment, the algorithm just registers that you engaged, and it will then flood your feed with more.
It doesn’t care what your core interests are; it just aims to increase your engagement by any means necessary.
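That “engagement is engagement” dynamic is easy to see in a toy model. Here’s a minimal sketch of a purely hypothetical engagement-only ranker; none of these names, weights, or signals come from YouTube’s actual system, they’re made up for illustration:

```python
# Toy sketch of an engagement-only ranker (hypothetical; not YouTube's real system).
# The key point: an angry comment counts exactly the same as an enthusiastic one.

from dataclasses import dataclass

@dataclass
class Interaction:
    video_id: str
    watch_seconds: float
    commented: bool  # any comment at all, positive or negative
    shared: bool

def engagement_score(events: list[Interaction]) -> dict[str, float]:
    """Aggregate a per-video score from raw interactions, ignoring sentiment."""
    scores: dict[str, float] = {}
    for e in events:
        s = scores.get(e.video_id, 0.0)
        s += e.watch_seconds               # longer watch time -> higher score
        s += 30.0 if e.commented else 0.0  # a hate-comment boosts the video too
        s += 60.0 if e.shared else 0.0
        scores[e.video_id] = s
    return scores

def recommend(events: list[Interaction], k: int = 3) -> list[str]:
    """Top-k videos by raw engagement; *why* you engaged never enters the ranking."""
    scores = engagement_score(events)
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Rage-watching a video to dunk on it in the comments looks identical to
# genuine interest, so the ranker keeps serving more of it:
events = [
    Interaction("calm_gardening_video", watch_seconds=120, commented=False, shared=False),
    Interaction("outrage_bait_video", watch_seconds=240, commented=True, shared=False),
]
print(recommend(events))  # -> ['outrage_bait_video', 'calm_gardening_video']
```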
Maybe it depends on what you watch. I use YouTube for music (only things that I search for) and sometimes live streams of an owl nest or something like that.
If I stick to that, the recommendations are sort of OK. Usually stuff I watched before. Little to no clickbait or random topics.
I clicked on one reaction video to a song I listened to, just to see what would happen. The recommendations turned into like 90% reaction videos, plus a bunch of topics I’ve never shown any interest in: U.S. politics, the death penalty in Japan, gaming, Brexit, some Christian hymns, and brand-new videos on random topics.
Just clicking ‘Not Interested’ does nothing for what’s recommended.
‘Not Interested’ -> ‘Tell us why’ -> ‘I don’t like the video’ is what works.
Same here. I’ve never watched anything like that, yet my recommendations are filled with far-right grifters, interspersed with tech videos that do interest me.
YouTube seems to think I love Farage, US gun nuts, US free-speech (to be racist) people, and anti-LGBT (especially T) channels.
I keep saying I’m not interested, yet they keep trying to convert me. Like, fuck off YouTube, no I don’t want to see “Jordan Peterson completely OWNS female liberal using FACTS and LOGIC”.
Dunno what you watch, but YT may think the videos are about similar topics (even if you think otherwise).
For myself, I usually get recommended what I already watch: tech, VTuber/anime, mechanical engineering, oddities (like Weird Explorer, Technology Connections).
I rarely get stuff outside of the bubble, like gun videos (some creator recently modified a Glock in the design of Milwaukee tools), meme channels, etc.
I highly suspect the videos you watch and interact with heavily may feed back to YT in a different way than you think.
And remember: negative feelings provoke interactions and increase session time, which is a plus for them.
It’s both.
The companies that do this shit absolutely need to be regulated way more aggressively.
TikTok (ByteDance), being a Chinese company based in the PRC, is compelled by law to operate in partnership with the CCP, which gives the CCP an insane degree of visibility into, and control over, their systems. I would be absolutely unsurprised to find that the CCP is compelling them to tweak their algorithms and push specific content to specific audiences, in addition to the data gathering they’re surely engaged in. Source: I work for an oncology biotech, and we halted our Chinese efforts because there was apparently no legal way to square the circle with regard to data privacy/HIPAA considerations.
Did anyone other than me actually read the whole article? These comments sorta read like the answer is no.
The researchers say that their findings prove no active collaboration between TikTok and far-right parties like the AfD but that the platform’s structure gives bad actors an opportunity to flourish. “TikTok’s built-in features such as the ‘Others Searched For’ suggestions provides a poorly moderated space where the far-right, especially the AfD, is able to take advantage,” Miazia Schüler, a researcher with AI Forensics, tells WIRED.
A better headline might have been “TikTok algorithm gamed by far-right AfD party in Germany”, but I doubt that would drive as many clicks.
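To make the “gamed, not pushed” distinction concrete: a feature like ‘Others Searched For’ plausibly surfaces whatever queries co-occur most often, so a coordinated group can lift its own phrases into everyone’s suggestions. Here’s a minimal sketch of that failure mode; the class, method names, and numbers are all hypothetical, not TikTok’s actual implementation:

```python
# Hypothetical sketch of a frequency-based "related searches" feature and how
# coordinated searching games it. Not TikTok's real system; all names made up.

from collections import Counter, defaultdict

class RelatedSearches:
    def __init__(self):
        # For each query, count which other queries appeared in the same session.
        self.co_counts: dict[str, Counter] = defaultdict(Counter)

    def log_session(self, queries: list[str]) -> None:
        """Record co-occurrence for every pair of queries in one user session."""
        for q in queries:
            for other in queries:
                if other != q:
                    self.co_counts[q][other] += 1

    def suggest(self, query: str, k: int = 3) -> list[str]:
        """'Others Searched For': the k most frequent co-occurring queries.
        No moderation step -- raw popularity alone decides what surfaces."""
        return [q for q, _ in self.co_counts[query].most_common(k)]

engine = RelatedSearches()

# Ordinary users: a handful of organic sessions.
for _ in range(50):
    engine.log_session(["german election", "polling numbers"])

# A coordinated campaign: a few hundred scripted sessions pairing a neutral
# query with a partisan slogan.
for _ in range(300):
    engine.log_session(["german election", "partisan slogan here"])

print(engine.suggest("german election"))
# -> ['partisan slogan here', 'polling numbers']: the slogan now tops the
#    suggestions shown to every ordinary user.
```

On this reading, the researchers’ “poorly moderated space” criticism is about the missing filter between raw popularity and what users see, not about the existence of the feature itself.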
For more info, check out this article: Germany’s AfD on TikTok: The political battle for the youth