πŸš€πŸ€– "Brilliant" minds at Thinking Machines Lab have unearthed the groundbreaking revelation that bigger is, indeed, better when it comes to #AI models. πŸ™„ Turns out, post-training on smaller datasets is just a total waste, because who needs efficiency when you have infinite computing power and time, right? πŸ˜‚
https://thinkingmachines.ai/blog/lora/ #Research #BiggerIsBetter #ThinkingMachinesLab #DataEfficiency #InfiniteComputing #HackerNews #ngated
LoRA Without Regret

How LoRA matches full training performance more broadly than expected.

Thinking Machines Lab
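For context, the LoRA technique the linked post is about can be sketched in a few lines. This is a minimal, hypothetical illustration of the standard low-rank adaptation idea (a frozen weight `W` plus a trainable update `(alpha / r) * B @ A`), not code from the Thinking Machines post; all names here are illustrative.

```python
import numpy as np

class LoRALinear:
    """Minimal sketch of a LoRA-adapted linear layer (assumed standard form)."""

    def __init__(self, d_in, d_out, r=4, alpha=8, seed=0):
        rng = np.random.default_rng(seed)
        # Frozen pretrained weight (stands in for the base model's weight).
        self.W = rng.standard_normal((d_out, d_in))
        # Trainable low-rank factors: A is small-random, B is zero-initialized,
        # so at initialization the adapter contributes nothing.
        self.A = rng.standard_normal((r, d_in)) * 0.01
        self.B = np.zeros((d_out, r))
        self.scale = alpha / r

    def forward(self, x):
        # Effective weight is W + scale * B @ A; only A and B would be
        # updated during fine-tuning, leaving W untouched.
        return x @ (self.W + self.scale * (self.B @ self.A)).T
```

Because `B` starts at zero, the adapted layer initially matches the frozen base layer exactly; fine-tuning then only has to learn the small `A` and `B` matrices, which is where the data- and compute-efficiency claim comes from.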