Well, I use my laptop as a daily driver, so training an AI in the background even when I'm not using it seems a bit complicated.
The Markov chain seems like an interesting alternative for what I'm looking for. Do any tools for building one exist, or should I write one from scratch?
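To answer the question above: there are ready-made libraries (the markovify Python package is a well-known one for sentence-level generation), but a basic word-level chain is small enough to write from scratch. A minimal sketch in plain Python, no dependencies; the one-line corpus is just a toy stand-in for a real text file:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each tuple of `order` consecutive words to the words that follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=20, seed=None):
    """Walk the chain from a given (or random) starting state."""
    state = seed or random.choice(list(chain))
    out = list(state)
    for _ in range(length):
        followers = chain.get(tuple(out[-len(state):]))
        if not followers:
            break  # dead end: no observed continuation for this state
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "le chat dort sur le tapis et le chat mange sur le tapis"
chain = build_chain(corpus, order=2)
print(generate(chain, length=10))
```

With a real corpus (e.g. the French books mentioned later in the thread) and order 2 or 3, this already produces locally grammatical French, since every transition was literally seen in the training text.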
That's pretty disappointing. It seemed to me like it could have been at least somewhat possible.
I've trained a 0.8M-parameter model and it was spitting out something that looked like French, though not actual French. I still need to test this, but I feel like with a few million parameters it could work. It still wouldn't be coherent, but at least it could form real sentences.
Again, I don't know much about this, so I may well be wrong.
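For context on the "0.8M parameters" figure above: a rough parameter count for a GPT-style model can be worked out from its shape. A sketch, using the common approximation of ~12·n_embd² weights per transformer block (the concrete sizes in the example call are hypothetical, not from the thread):

```python
def gpt_params(n_layer, n_embd, vocab_size, block_size):
    # Rough GPT-style weight count, ignoring biases and LayerNorm:
    # each block has ~12 * n_embd^2 params
    # (attention QKV + output projection = 4*d^2, MLP = 8*d^2).
    blocks = 12 * n_layer * n_embd ** 2
    # Token + position embeddings; the output head is typically
    # weight-tied to the token embeddings, so it adds nothing extra.
    embeddings = vocab_size * n_embd + block_size * n_embd
    return blocks + embeddings

# A tiny char-level model that lands in the ~0.8M range (hypothetical sizes):
print(gpt_params(n_layer=4, n_embd=128, vocab_size=100, block_size=256))
```

Scaling to "a few million" parameters mostly means raising n_embd and n_layer, and the 12·L·d² term shows why cost grows quickly: doubling n_embd alone roughly quadruples the block weights.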
I also think the dataset may be the issue: I didn't use a general-purpose dataset, only French books in a txt file.
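A plain txt file of books can actually work fine at the character level, which is what nanoGPT's shakespeare_char example does: build a vocab of the distinct characters, encode the whole file as integer IDs, and split off a validation slice. A sketch in that spirit (this is not nanoGPT's actual prepare.py, which also writes .bin files):

```python
def prepare_char_dataset(text, val_frac=0.1):
    """Char-level vocab + encoded train/val split, in the spirit of
    nanoGPT's shakespeare_char prepare script."""
    chars = sorted(set(text))                      # vocabulary = distinct chars
    stoi = {ch: i for i, ch in enumerate(chars)}   # char -> id
    itos = {i: ch for ch, i in stoi.items()}       # id -> char
    data = [stoi[ch] for ch in text]
    n = int(len(data) * (1 - val_frac))
    return data[:n], data[n:], stoi, itos

# Usage with a short French snippet; in practice `text` would be the whole file.
train, val, stoi, itos = prepare_char_dataset("le chat dort. le chien court.")
print(len(stoi), "distinct characters")
```

One practical check worth doing on a books corpus: French accents and typographic quotes inflate the char vocab, so inspecting `stoi` early catches encoding junk before it eats training capacity.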
Training a model without a GPU - sh.itjust.works
Hi, I'm currently starting to learn how LLMs work in depth, so I started using nanoGPT to understand how to train a model, and I'd like to play around with the code a little more. I set myself a goal: train a model that can write basic French. It doesn't have to be coherent or deep in its writing, just French with correct grammar. I only have a laptop without a proper GPU, so I can't really train a model with billions of parameters. Do you think it's possible without too much data or overly intensive training? Would it be a better idea to use something other than nanoGPT?

TLDR: I'd like to train my own LLM on my laptop, which doesn't have a GPU. It's only for learning purposes, so my goal is just that it can write basic French. Is it doable? If so, do you have any tips to make it easier?
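For what it's worth, nanoGPT's README explicitly covers CPU-only training with a scaled-down character model, configured via a small Python file of overrides. A hedged sketch of such a config, in the style of config/train_shakespeare_char.py (the filename and dataset name are hypothetical; check your nanoGPT version for the exact option names it supports):

```python
# train_french_char.py (hypothetical) -- tiny model sized for CPU training
device = 'cpu'           # no CUDA available
compile = False          # skip torch.compile; little benefit on CPU
dataset = 'french_char'  # hypothetical folder under data/ with train/val bins
batch_size = 12
block_size = 64          # short context keeps each step cheap
n_layer = 4
n_head = 4
n_embd = 128             # lands around ~0.8M parameters at this size
dropout = 0.0            # small model + small data: regularize lightly
max_iters = 2000
lr_decay_iters = 2000
eval_iters = 20
log_interval = 1
```

At these sizes a run finishes in minutes rather than days on a laptop CPU, which is enough to see loss drop and the samples go from noise to French-looking text.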