If There's a Nazi at Your Bar, It's a Nazi Bar

Summary: I found out Substack isn't just tolerating extremism—it's literally hosting Nazi newsletters. So I'm out. Here's why you should care.

"If there are nine people sitting at a table and a Nazi sits down, and nobody gets up, there are ten Nazis at the table." – Unknown

When people first started saying Substack had a Nazi problem, I did what most of us do: I rationalized it. I assumed they were using "Nazi" the way the internet uses it—loosely, carelessly, as a blunt rhetorical weapon. Maybe it was white-supremacist-adjacent content hiding behind edgy language. Maybe it was dubious newsletters full of dog whistles, linking to actual extremist groups while maintaining plausible deniability.

What I did not expect was to find actual, mask-off, unambiguous Nazi content. Full-blown antisemitism. Newsletters celebrating and promoting National Socialism. Not buried in some dark corner—hosted openly, on a platform where I was also publishing my work.

If you don't believe me, go look at what was at natsoctoday1.substack.com. Or read the reporting; it's all there.

The push-notification story is almost darkly funny: Substack accidentally promoted a Nazi newsletter through push notifications. An algorithm so indifferent to the content it amplifies that it just... pushed Nazism to people's phones. Like a coupon for hate.

This reminds me of something I learned early in my career. When you build software, the defaults matter. The things a platform chooses to allow, the content it monetizes, the guardrails it refuses to install—those aren't neutral decisions. They're architectural choices. A platform that takes a cut from Nazi content isn't a free speech champion. It's a business partner.
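To make the point about defaults concrete, here's a toy sketch (all names hypothetical, not any real platform's code): when a classifier has no signal either way, the default the engineers chose, not the content, decides what ships.

```python
# Illustrative sketch: a single default argument is an architectural choice.
# Everything the classifier is unsure about falls through to it.

def moderate(post: str, flagged_terms: set[str], default: str = "allow") -> str:
    """Return an action for a post: 'remove' on a match, else the default."""
    words = set(post.lower().split())
    if words & flagged_terms:
        return "remove"
    # No signal either way: the default, not the content, decides.
    return default

# A permissive default ships every unreviewed post straight through.
print(moderate("some unclassified post", {"slur"}))                  # allow
# Flipping one argument makes the same system fail safe instead.
print(moderate("some unclassified post", {"slur"}, default="hold"))  # hold
```

Nothing about the second configuration is harder to build. It's the same function with a different default, which is the whole point: "we couldn't have known" is rarely true; "we chose not to check" usually is.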

I've thought about this the way I think about any system design problem. There's always a tradeoff. Substack made theirs: growth and "openness" over basic human decency. And look, I'm no stranger to messy tradeoffs in tech. But there's a line. And if your platform is literally making revenue from people who celebrate genocide, you've blown past that line at full speed.

So I'm out. I've backed up everything to my blog, where it always should have lived anyway. You can find me on Hachyderm.

The bar metaphor is overused, but it's overused because it's true. If a Nazi sits down at your bar and you serve them, you've made a choice. If you profit from serving them, you've made an even louder one. And I don't want to drink at that bar.

Peace.

Also readable at https://maho.dev/2026/02/if-theres-a-nazi-at-your-bar-its-a-nazi-bar/ by @mapache

#Ethics #TechCulture #Substack #FreeSpeech #PlatformResponsibility #HateSpeech #InternetCulture

"Substack Has a Nazi Problem" (The Atlantic): The newsletter platform's lax content moderation creates an opening for white nationalists eager to get their message out.
