The LLM grammar fixer I use acts silly in that it doesn't seem to know about threat models. It may correct "threat model" to "threaten the model" or "threaten to model", neither of which I would ever do, but it's fun trying to imagine.
@lmk Heh, maybe you already dealt with it, but does it not even have a way to add your own phrases (like a spell checker dictionary)? Maybe just change the system prompt of the LLM. ("Also legitimate phrases include 'threat model'", etc.)
@headmold Thanks, but I don't think so: it's Gmail compose (moving to an LLM would be too much copy/paste). I assume it's a local model (small, and in JS?) since it's so fast, so it doesn't have a big vocabulary, but I'm just guessing, and I'm pretty sure there's no way to know (other than checking whether it's local, which would take real work, I bet).

@lmk Yeah I should've remembered you've dorked with LLMs enough to do those things if you could.

I've mostly just stopped paying attention to the grammar checkers because they don't like my idioms or purposeful stretches of grammar. Today I shut one up because it didn't like fixing something "in post".

@headmold Come to think of it, with the internet increasingly dominated by huge corporations wielding centralized mega-services (search, social, ...), they probably prefer one-size-fits-all (a lot less work), so it could take a long time before power is yielded to the individual.
@lmk Well, you could run your own. While I wouldn't recommend Grammarly, they've been doing this in browser extensions for ages, so doing something similar with a local LLM ought to be possible.
@headmold (Trying to follow, not complaining here, but replying since you were kind enough to reply.) From a browser extension, all I can think of is reaching a local LLM via a localhost port, and I'd like this to work outside the browser too (LibreOffice, emacs, etc.), but I know that's a delusional expectation of software integration. As I said, not holding my breath. Thanks!
@lmk (No worries at all, though it might be better for chat or email eventually.) Yeah, localhost connection is what I was thinking. Given everything you described is or can be open source and there are already RPC APIs for LLM servers, this at least seems plausible. Google's LLM (ha) even suggested some things for the browser and emacs (too many to list here, and I haven't tried any of those tools).
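@headmold For what it's worth, the localhost idea is only a few lines. A minimal sketch, assuming an Ollama-style server on port 11434 (the model name, port, and prompt wording here are my own placeholders, adjust for llama.cpp's server or any OpenAI-compatible endpoint):

```python
import json
import urllib.request

def build_request(text, model="llama3"):
    # Assemble the HTTP request for a local Ollama-style /api/generate endpoint.
    # The prompt nudges the model to preserve jargon like "threat model".
    payload = {
        "model": model,
        "prompt": f"Fix grammar only; keep phrases like 'threat model' intact:\n{text}",
        "stream": False,
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def proofread(text):
    # Send the request to the local server and return the model's reply.
    req = build_request(text)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Anything (browser extension, an emacs wrapper, a LibreOffice macro) that can make an HTTP POST could hit the same port, which is why the non-browser cases seem plausible to me too.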