THE INCREDIBLE REASON HOUSE STATION LIVE CHANGED SERVERS
Or a survival guide for e-businesses facing bans or legal disputes with a platform

For nearly two years, we watched our entire platform become invisible. Not because of bad content, policy violations, or lack of effort... but because of a silent algorithmic shadowban. We had no warnings, no appeals, and no answers. Worse: under YouTube’s terms of service, any legal dispute must be handled in a U.S. court (even if you're based in France and pay taxes there). This is how global platforms sidestep national laws... and why creators are left legally unprotected in their own countries.
¯

_
PART ONE – WHEN A SHADOWBAN SHUTS DOWN YOUR BUSINESS
¯

_
Two years ago, we left Dedibox, a French hosting company we judged incapable of meeting even our most basic expectations for customer service. In a field as critical as data hosting, the professionalism of the technical support team cannot be optional... it must be the company's showcase, the reassuring human face you turn to when something goes wrong. That pursuit of reliability led us to GoDaddy, based in Arizona, whose configuration tools, WordPress diagnostics, interface design, and above all technically skilled support team had earned our trust... far beyond the empty promises of the usual sales pitch. Then everything collapsed at once, swept away by a digital catastrophe we didn’t see coming. A brutal, invisible blow: the shadowban. House Station Live was ghosted (to borrow the terminology of our virtual assistant, GPT): gone from search results, ignored by YouTube recommendations, erased from the Android Play Store. For eighteen months, despite heavy investment and extensive testing of formats, lengths, languages, thumbnails, titles, even hosts, nothing changed. Every video was locked between 20 and 30 views. We were trapped in that narrow range, with no human contact, no way to file a complaint, and no hope of improvement.
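For anyone facing the same wall, the plateau itself can be measured from the outside. The sketch below is a generic illustration, not our actual tooling: it snapshots public view counts once a day through the YouTube Data API v3 so that a flatline like ours becomes a dated, plottable record. The API key, channel ID, output file, and thresholds are placeholders to replace with your own.

```python
# Minimal sketch: log public view counts for a channel's recent uploads each day,
# so a suspected plateau (e.g. every video stuck at 20-30 views) can be documented
# over time. Requires a YouTube Data API v3 key; all identifiers below are placeholders.
import csv
import datetime
import requests

API_KEY = "YOUR_API_KEY"           # hypothetical: your own Data API v3 key
CHANNEL_ID = "UCxxxxxxxxxxxxxxxx"  # hypothetical: the channel to monitor

def recent_video_ids(max_results=50):
    """Return IDs of the channel's most recent videos via the search endpoint."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "key": API_KEY,
            "channelId": CHANNEL_ID,
            "part": "id",
            "order": "date",
            "type": "video",
            "maxResults": max_results,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return [item["id"]["videoId"] for item in resp.json().get("items", [])]

def snapshot_view_counts(video_ids):
    """Fetch public statistics for the given videos and return (id, views) pairs."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"key": API_KEY, "id": ",".join(video_ids), "part": "statistics"},
        timeout=30,
    )
    resp.raise_for_status()
    return [
        (item["id"], int(item["statistics"].get("viewCount", 0)))
        for item in resp.json().get("items", [])
    ]

if __name__ == "__main__":
    today = datetime.date.today().isoformat()
    rows = snapshot_view_counts(recent_video_ids())
    # Append one dated row per video; run daily (e.g. via cron) to build a time series.
    with open("view_counts.csv", "a", newline="") as f:
        writer = csv.writer(f)
        for video_id, views in rows:
            writer.writerow([today, video_id, views])
    print(f"{today}: logged {len(rows)} videos")
```

Run from a daily cron job, this appends one row per video per day; eighteen months of that is far more persuasive in a dispute than a handful of screenshots.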

Facing this algorithmic wall, we made the only logical decision: open an investigation and build a legal case. Not to prove a “perfect crime”, but to demonstrate that even the most opaque algorithms leave traces. During this inquiry, we came across a particularly disturbing fact: under YouTube’s terms of use, any dispute must be brought before a U.S. judge. It doesn’t matter that you are based in France, that you target a French audience, or that French law requires foreign companies to have a legal presence in the country... Google sidesteps all of this by distinguishing between headquarters, local offices, and legal jurisdiction. The result is clear: you are automatically excluded from the protection of your own legal system. The system is so airtight that very few individuals or businesses attempt legal action against Google. The GAFAM companies are protected by a lethal triad: algorithmic opacity, extraterritorial legal shielding, and the complicity of a U.S. government that treats its tech giants as a point of national pride (even as strategic weapons in the global information war). While France leaves its citizens exposed and helpless against digital abuse, the United States has conquered the Internet on a global scale by imposing its law as if the network were its own sovereign territory.

To illustrate just how absurd and dangerous this has become, take the example of music licensing. Every month, House Station Live pays royalties to SACEM, the French music royalty collecting society. In return, we are legally authorized to broadcast commercial works, provided we submit monthly playlists so that royalties can be distributed fairly to the artists. In theory, everything is legal and in order. But the United States has its own system: the DMCA. And if you stream House Station Live through any platform based in the U.S. (GoDaddy, YouTube, and so on), you are automatically subject to U.S. law, even if your legal entity is based in France. France, in turn, declares itself without jurisdiction in such cases, because the “crime scene” is legally located on American soil, where the servers are hosted. So the SACEM fee we pay offers no protection at all, neither domestically nor abroad... where we are treated like pirates. Imagine buying a product from a foreign website: you pay the foreign VAT, a currency conversion fee, and then French customs duties. Three layers of taxation. A 30 € item ends up costing you 150 €. That is digital over-taxation. And the same applies to our royalties.
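Returning to the playlist obligation mentioned above: in practice it boils down to turning the station's play log into a monthly declaration. The sketch below is only an illustration of that aggregation step; the column names and CSV layout are assumptions made for the example, not SACEM's official reporting format, which is defined in each broadcaster's contract.

```python
# Minimal sketch: aggregate a station play log into a monthly declaration file.
# The input/output layouts are illustrative assumptions; SACEM's actual reporting
# format is set by the broadcaster's contract and is not reproduced here.
import csv
from collections import Counter

def monthly_report(play_log_path, month, output_path):
    """Count plays per (title, artist) for the given month (format 'YYYY-MM')."""
    plays = Counter()
    with open(play_log_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # assumed columns: timestamp, title, artist
            if row["timestamp"].startswith(month):
                plays[(row["title"], row["artist"])] += 1

    with open(output_path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["month", "title", "artist", "play_count"])
        for (title, artist), count in sorted(plays.items()):
            writer.writerow([month, title, artist, count])

if __name__ == "__main__":
    # Hypothetical file names, for illustration only.
    monthly_report("play_log.csv", "2024-05", "sacem_declaration_2024-05.csv")
```

The point is simply that the data needed for fair distribution exists and is handed over every month... and yet, on the other side of the Atlantic, the same broadcasts are treated as infringement.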

Worse still, the U.S. considers you to be operating on its soil the moment your server is physically located there... regardless of where you are based, where your company is registered, or what contracts you have signed with your local rights agencies. Even if your SACEM contract is nominally international, it offers you no protection in this skewed legal context. The U.S. has simply annexed the Internet, claimed it as its own jurisdiction, and imposed its extraterritorial laws on the rest of the world (without any international mandate or global consent).
¯

_
||#HSLdiary #HSLpartners

#Shadowban #Censorship #YouTube #DMCA #DigitalRights #FrenchTech #AlgorithmBias #GoogleAbuse #PlatformAbuse #Justice
