New side project: I'm making WebBee 🙈

It is a conversational web-dev agent – one that doesn't burn the planet or steal other people's code. They turn plain English into structured commands: add components, install packages, scaffold pages, run framework CLIs. All that without requiring a ton of RAM or scraping open source :)

WebBee is a little silly, but fully deterministic, and you can teach them routine tasks.

It doesn't use an LLM, but a small neural network for intent classification, trained with supervised learning.
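Roughly the idea, as a toy sketch (not WebBee's real code, all names and weights made up): labeled sentences train a classifier that maps a sentence to one intent.

```python
# Toy sketch of supervised intent classification (not WebBee's actual code):
# a bag-of-words linear model mapping sentences to intent labels.

vocab = ["add", "install", "package", "page", "component", "scaffold"]
intents = ["add_component", "install_package", "scaffold_page"]

# Hand-set weights stand in for what supervised training would learn
# from labeled example sentences.
W = [
    [1.0, 0.0, 0.0, 0.0, 2.0, 0.0],  # add_component
    [0.0, 2.0, 2.0, 0.0, 0.0, 0.0],  # install_package
    [0.0, 0.0, 0.0, 2.0, 0.0, 2.0],  # scaffold_page
]

def classify(sentence):
    words = sentence.lower().split()
    feats = [1.0 if v in words else 0.0 for v in vocab]
    scores = [sum(w * f for w, f in zip(row, feats)) for row in W]
    return intents[scores.index(max(scores))]

print(classify("please add a button component"))  # add_component
print(classify("install the router package"))     # install_package
```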

This wastes way fewer resources.

@lea What library are you using for the network?
@drwho right now I'm dabbling with torch. Evaluating if I can ditch it :)

@drwho the network architecture itself uses a transformer (the same thing LLMs use), but with only a single layer (enough to look for keywords/pronouns and their relations to other keywords).
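In case it helps, here's what one self-attention layer boils down to (toy numpy version, random weights, nothing WebBee-specific): every token scores every other token, so the attention matrix covers all word pairs at once.

```python
# Minimal sketch of a single self-attention layer (toy numbers),
# showing how every word attends to every other word in parallel.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # attn[i, j] = how much word i attends to word j
    attn = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return attn @ V, attn

rng = np.random.default_rng(0)
n_tokens, d = 5, 8                          # e.g. "add a button style it"
X = rng.normal(size=(n_tokens, d))          # toy word embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(attn.shape)  # (5, 5): every token vs. every token
```

With trained weights, the row for "it" would put high weight on its referent – that's the pronoun resolution part.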

I was able to ditch torch in the runtime, so it is pure C++ and straightforward to run in the browser as a WASM module. PyTorch is still used for training and generating the model.

@drwho A former approach used a BiGRU (bidirectional gated recurrent unit). It is more lightweight and also worked well, but it wasn't 100% accurate and failed on more complex sentences. A BiGRU iterates over the sentence from left to right and from right to left to look for keywords. The transformer looks at every word in parallel, finds relations between words, and can figure out what "he/she/it" refers to.
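The scan pattern looks like this (gates and real GRU math omitted, just the iteration order): each position ends up with a forward state summarizing what came before it and a backward state summarizing what comes after.

```python
# Sketch of the BiGRU scan pattern (gates omitted; only the iteration order).
# Each token gets a forward state (left context) and a backward state
# (right context), paired up per position.

def scan(tokens, step, init=0.0):
    states, h = [], init
    for t in tokens:
        h = step(h, t)
        states.append(h)
    return states

tokens = ["add", "the", "button", "and", "style", "it"]
step = lambda h, t: h + len(t)      # stand-in for the real GRU update

fwd = scan(tokens, step)                  # left -> right
bwd = scan(tokens[::-1], step)[::-1]      # right -> left
bidir = list(zip(fwd, bwd))               # per-token (fwd, bwd) states
print(bidir[0])  # first token: little left context, lots of right context
```

The transformer skips the whole sequential scan and computes all pairwise relations in one shot instead.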
@drwho bigger networks (the ones burning the planet and scraping the whole internet) work similarly, but use many layers and analyze multiple aspects of a sentence this way.
@lea Hmmm... Do you think it would be useful to try as a command parser?