@RolandRides Nice! This could be chunked up and added to a model for training knowledge. I'm going to try that this week!
@RolandRides Also highly recommend Hostinger VPS if you don't want to do it local. I did both. Started local, but wanted a production environment for another project. I paid around $90 for a year with a referral.
https://hostinger.com?REFERRALCODE=8OGPIXEL8LPH
@RolandRides RTX 4090 24GB, i9-14900KF, liquid cooler, MSI Pro board. I have models based on gpt-4-0613, deepseek-coder, llama3... a bunch. I use Ollama evaluations to determine the best setup; performance currently varies while I test different configs. I've chunked up the code and added it to the knowledge base for the RAG system, but now I need to fine-tune - I have a lot to learn there.
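The chunking step mentioned above can be sketched roughly like this - a minimal, hypothetical example of splitting source text into overlapping windows before embedding it into a RAG knowledge base (the `chunk_size`/`overlap` values and function name are illustrative assumptions, not from the post):

```python
def chunk_text(text: str, chunk_size: int = 400, overlap: int = 50) -> list[str]:
    """Split `text` into overlapping character windows for RAG ingestion.

    Overlap keeps context that straddles a chunk boundary from being lost.
    Values here are placeholders; real setups tune them per model/tokenizer.
    """
    step = chunk_size - overlap
    chunks = []
    for start in range(0, max(len(text), 1), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

# Example: chunk a small repeated code snippet
code = "def hello():\n    return 'world'\n" * 40
chunks = chunk_text(code)
```

Each chunk would then be embedded (e.g. via an Ollama embedding model) and stored in the vector database for retrieval.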
Fully hosted Ollama ✅
Low-cost unlimited LLM usage ✅
RAG codebase and database ✅
Learning how to fine-tune 🫠
#ollama #qlora #llm #litellm #ai