Kale 

@DarkestKale
538 Followers
132 Following
39.5K Posts
It'll all be fine.
Pronouns: he/him, but honestly it doesn't matter to me
Good morning folks
... so, we're slowly pushing into 'perhaps the current hardware I have might not cope with this project, long term' - and then I go look at the price of fucking GPUs and clench my asshole for a while thinking 'wait, my current GPU is WHAT?'
Was tempted to do another training run on the model, but... the training time is now getting up there. Power-wise, no, it's not really consuming much (Fortnite is more intense, tbh), but my actual time is what matters here (since I can't play Fortnite, etc, while it's cooking)

'Friendly reminder, linux is faster for inference than windows'

... and the poster has completely different environments for both use cases.

Yeah, that's a brilliant comparison, my dude.

> Has anyone else experienced a performance gap this large before? Am I missing something?

aka, saying authoritative things without actually knowing shit

My favourite comment of the day
Skimming articles about British expats in Dubai talking up the danger, and boy, howdy, are these journos doing a HUGE amount of heavy lifting to not basically say 'they're influencers paid off by Dubai, and... well, sex workers'
Fuckin tiiiiiiiiiiired today, folks.
Good morning folks

Spending a bit of time tonight slowly reworking/expanding the finetune dataset instead of dumping more into pretrain.

Pretrain SEEMS to be coming along, but glancing at the finetune set, I think some of these simpler examples can be enhanced quite a bit.

You know.

When we're not maliciously defunding them.