I installed a locally hosted LLM using @simon's excellent `llm` (https://github.com/simonw/llm) tool. It's kind of wild that I just...have this power on my laptop?

@ken I'm constantly amazed at how much information is compressed into that ~13GB Llama 2 7B model

(It hallucinates wildly too, but still very impressive)

@simon @ken I asked it about myself and it said I was the founder of @ProPublica and a Pulitzer winner. But it’s fascinating that it linked me to the company!

@ken @ProPublica nice, congrats on the Pulitzer!

I love asking the smaller models about myself, all sorts of weird stuff shows up https://simonwillison.net/2023/May/1/lets-be-bear-or-bunny/

@simon @ken note however that 13GB would also let you store many, many web pages!