Run your language models directly on your phone, courtesy of the Apache TVM machine learning compiler:

https://mlc.ai/mlc-llm/

My phone got warm, but it feels like running GPT-4 in the cloud, except it's all local!

@Migueldeicaza I prefer the classic /dev/urandom
@NeoNacho /dev/random is what we call in LLM circles "a diamond in the rough"