marginalia_nu

I built an internet search engine from scratch and somehow this is my full time job now?!

https://www.marginalia.nu/ <-- main website

https://marginalia-search.com/ <-- search engine

[email protected] <-- email
This account is a replica from Hacker News. Its author can't see your replies.

Well if you have regulated fuel prices, and free trade and travel with neighboring countries (the whole point of the EU), you're gonna see arbitrage if those countries aren't regulating fuel prices.

The options are either to deregulate the prices or to ration fuel sales.

Anyone actually scraping git repos would probably just do a 'git clone'. Crawling git hosts is extremely expensive, as git servers have always been inadvertent crawler traps.

They generate a URL for every version of every file on every commit, branch, and tag, and if that weren't enough, n(n+1)/2 git diff URLs for every file across every commit it has existed in. Even a relatively small git repo with a few hundred files and commits explodes into millions of URLs in the crawl frontier. Server side, many of these pages are also expensive to generate, so crawler and git host make for a really unfortunate pairing.
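To see how fast this blows up, here's a back-of-the-envelope sketch. The numbers are illustrative, not from any real repository, and the formula takes the worst case where every file exists at every commit:

```python
# Back-of-the-envelope estimate of the crawl frontier a git web UI
# can expose to a crawler. Worst-case assumption: every file exists
# at every commit.

def frontier_size(files: int, commits: int) -> int:
    """Upper bound on distinct URLs a crawler can discover:
    one blob URL per (file, commit) pair, plus n(n+1)/2
    pairwise diff URLs per file."""
    blob_urls = files * commits
    diff_urls = files * (commits * (commits + 1) // 2)
    return blob_urls + diff_urls

# A modest repo: 300 files, 500 commits.
print(frontier_size(300, 500))  # 37,725,000 URLs
```

Even discounting heavily for files that only exist in a fraction of the history, the frontier lands in the tens of millions for a repo a human would consider small.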

If you run a web crawler, you need to add git host detection to actively avoid walking into them.
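A minimal sketch of what such a URL filter might look like. The patterns below are illustrative examples of common git web UI paths (GitHub/GitLab-style blob and compare paths, GitWeb-style query parameters); a real crawler would maintain a much broader pattern list and also sniff page markup to catch self-hosted instances:

```python
import re

# Illustrative URL patterns for common git web UIs. A production
# crawler would carry a larger list (Gitea, cgit, Gogs, ...) and
# fall back to content-based detection.
GIT_TRAP_PATTERNS = [
    # blob/blame/commit/diff paths followed by a hex revision
    re.compile(r"/(blob|blame|commits?|compare|diff|tree|raw)/[0-9a-f]{7,40}"),
    # GitWeb-style action query parameters
    re.compile(r"[?&;]a=(blobdiff|commitdiff|history)"),
    # GitLab-style namespaced paths
    re.compile(r"/-/(blob|commits|compare)/"),
]

def looks_like_git_trap(url: str) -> bool:
    """True if the URL matches a known git web UI pattern and
    should be excluded from the crawl frontier."""
    return any(p.search(url) for p in GIT_TRAP_PATTERNS)

print(looks_like_git_trap(
    "https://example.com/repo/blob/0a1b2c3d4e5f/src/main.c"))  # True
print(looks_like_git_trap("https://example.com/about.html"))   # False
```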

In Claude's world, every user is a generational genius up there with Gauss and Euler, and every new suggestion, no matter how banal, is a mind-boggling Copernican turn that upends epistemology as we know it.

Yeah, it's very much the opposite of how Claude Code approaches a problem it hasn't seen before, where its default is to construct an elaborate Rube Goldberg machine, inserting more and more logic until it manages to produce the desired outcome. You can coax it into simplifying its output, but it's very time consuming to get something that is of a professional standard and doesn't introduce technical debt.

Especially in brownfield settings, if you do use CC, you really should be spending something like a day refactoring for every 15 minutes it spends implementing new functionality. Otherwise the accumulation of technical debt will make the code base unworkable by both human and Claude hands in a fairly short time.

I think overall it can be a force for good, and a source of high quality code, but it requires a significant amount of human intervention.

Show me even a single trader who would prefer a slow-ass chat interface to 36 monitors full of stock tickers and charts.