Show HN: I put an AI agent on a $7/month VPS with IRC as its transport layer

https://georgelarson.me/writing/2026-03-23-nullclaw-doorman/

Building a digital doorman

A lightweight AI agent that answers questions about my work by reading actual code. Architecture, security, and model selection decisions.

The model used is a Claude model, not self-hosted, so I'm not sure why the infrastructure is relevant here at all, except as clickbait.

We need more infrastructure in the cloud instead of focusing on local RTX cards.

We need OpenRunPods to run large open-weights models.

Build in the cloud rather than bet on a renaissance of computing "at the edge."