I'm currently experimenting a bit with goose and LM Studio. If I can get the sub-agents working well, that opens up completely new possibilities for me 🤓 I hope it turns out as well as I imagine 😃
#goose #gooseai #lmstudio #llm #localllm

Created a tool for models.dev that will generate provider configurations for
#gooseai,
#mistralai vibe, and
@charmcli crush. This was pretty tedious otherwise!
https://github.com/james2doyle/models-dev-formatter
GitHub - james2doyle/models-dev-formatter: Convert models.dev API data into various tool configuration formats (crush, goose, vibe, etc.)
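To get a feel for what such a formatter does, here is a minimal sketch of mapping model metadata into a provider-config snippet. The field names (`models`, `id`, `context`) and the output shape are assumptions for illustration only, not the actual models.dev schema or the config format that goose, vibe, or crush expect.

```python
# Hypothetical sketch: turn models.dev-style API data into a flat
# provider-config dict. Field names are assumed, not the real schema.

def to_provider_config(provider: str, api_data: dict) -> dict:
    """Map a provider's model list into a config dict."""
    return {
        "provider": provider,
        "models": [
            # Fall back to an assumed default context length when absent.
            {"name": model["id"], "context_length": model.get("context", 8192)}
            for model in api_data.get("models", [])
        ],
    }

# Example with made-up data in the assumed shape:
sample = {"models": [{"id": "gpt-neox-20b", "context": 2048}]}
config = to_provider_config("gooseai", sample)
print(config["models"][0]["name"])  # gpt-neox-20b
```

Doing this by hand for every provider and tool is exactly the tedium the formatter automates.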
Yesterday I designed a #Python REPL with Large Language Model (LLM) integration. It supports any LLM from #OpenAI (e.g., #GPT3) and #GooseAI (e.g., #GPTNeoX).
It was a fun experiment! Check it out if you're interested.
https://iamleo.space/2023-02-20-llm-python-repl/
Wrapping the Python REPL with Large Language Models | Leonardo Hernández Cano's website
The code accompanying this post is at https://gitlab.com/da_doomer/natural-python.
In some cases it is now possible to write computer code mostly by prompting a Large Language Model (LLM). It seems there are two common modalities for this: (1) integrate the LLM call directly in an IDE (e.g., Copilot in VSCode) or (2) use something like ChatGPT or a playground to get source code to copy/paste in your editor.
Often, the top guess from the LLM (which is what is shown to the programmer) is almost correct, but still requires the programmer to fix small errors.
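The REPL-wrapping idea can be sketched with the standard library's `code.InteractiveConsole`. This is not the natural-python implementation; the `ask_llm` stub below is a stand-in for a real OpenAI/GooseAI completion call, and the `#:` prefix is an invented convention for illustration.

```python
import code

def ask_llm(prompt: str) -> str:
    """Stand-in for an LLM call that turns a natural-language prompt
    into Python source. A real version would call a completion API."""
    canned = {"add one and two": "result = 1 + 2"}
    return canned.get(prompt.strip().lower(), "pass")

class LLMConsole(code.InteractiveConsole):
    """A REPL where lines starting with '#:' are sent to the LLM and
    the returned code is executed in place of the original input."""

    def runsource(self, source, filename="<input>", symbol="single"):
        if source.startswith("#:"):
            source = ask_llm(source[2:])
            print(f">>> {source}")  # echo the generated code for review
        return super().runsource(source, filename, symbol)

console = LLMConsole()
console.runsource("#: add one and two")   # LLM-generated: result = 1 + 2
console.runsource("print(result)")        # prints 3
```

Echoing the generated code before it runs is the important design point: since the model's top guess is often only almost correct, the programmer needs to see (and be able to fix) what is actually executed.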