Selfhosted LLM (ChatGPT)

https://lemmy.world/post/1244736


I’ve recently played with the idea of self-hosting an LLM. I’m aware that it won’t reach GPT-4 levels, but being free to prompt with confidential data, without restraints, would be a very nice tool for me to have. Has anyone got experience with this? Any recommendations? I have downloaded the full Reddit dataset, so I could retrain the model on it, as selected communities provide immense value and knowledge (hehe, this is exactly what Reddit, Twitter, etc. are trying to avoid…)

I’m about to start this journey myself. I found this, which looks promising: https://github.com/ggerganov/llama.cpp

Would be nice if someone here with some experience could share.
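For anyone else starting out, getting llama.cpp running locally is roughly the following (a sketch based on the project’s own build instructions at the time; the model filename is an example — you need to supply your own model in GGUF format, e.g. downloaded from Hugging Face, and the exact binary names and flags may change between releases):

```shell
# Clone and build llama.cpp (CPU-only build; see the repo README for GPU options)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run an interactive session with a quantized model.
# "llama-2-7b.Q4_K_M.gguf" is a placeholder — use whatever GGUF model you have.
./main -m models/llama-2-7b.Q4_K_M.gguf \
  -p "Summarize the following notes:" \
  -n 256           # max tokens to generate
```

Quantized 7B models run acceptably on a recent CPU with ~8 GB of RAM; larger models or full-precision weights need proportionally more memory or a GPU build.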


I think I set that up successfully on a VM under Windows.

It’s obviously a level below ChatGPT, but otherwise it worked surprisingly well. Poorer answers, but still not bad.