Claude Code's source code appears to have leaked: here's what we know

https://lemmy.world/post/44999075

I mean, it’s not that big a deal. However, it would be another thing if the model itself leaked. Now that would be something.

edit: Like I thought, it turned out to be a TS wrapper with more internal prompts. The Fireship video is really funny; they use a regex to detect if the user is angry 😭

Tragic mistake... Anthropic leaks Claude’s source code

Tool usage is very important. Qwen3.5 (135b) can already do wonderful things on OpenCode.

I dabble in local AI and this always blows my mind. How do people just casually throw 135b-parameter models around? Are people renting hardware or GPU time, or building personal AI servers with 6x 5090s in them, or quantizing them down to 0.025 bits, or what? What’s the secret? How does this work? Am I missing something? The Q4 of Qwen3.5 122B is between 60-80GB just for the model alone. That’s 3x 5090s minimum, unless I’m doing the math wrong, and then you need to fit the huge context windows these things have in there too. I don’t get it.
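For what it's worth, the math above checks out. A rough sketch, assuming a Q4-family quant averages around 4.5 bits per weight once quantization overhead is included (the exact figure depends on the quant format, so this is only a ballpark):

```python
# Back-of-envelope size estimate for a quantized model's weights.
# bits_per_weight ~4.5 is an assumption for a typical Q4-style quant,
# not a spec for any particular format.
def model_size_gb(params_billion: float, bits_per_weight: float = 4.5) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal gigabytes

print(round(model_size_gb(122), 1))  # ~68.6 GB, squarely in the 60-80GB range
```

That's weights only; KV cache for a long context window comes on top of it.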

Meanwhile I’m over here nearly burning my house down trying to get my poor consumer cards to run glm-4.7-flash.

Qwen3.5 is a mixture-of-experts model, so only the experts active for each token need to be in VRAM rather than the whole thing. The rest can just sit in system RAM.