Run your language models directly on your phone, courtesy of the Apache TVM machine learning compiler:

https://mlc.ai/mlc-llm/

My phone got warm, but it feels like using GPT-4 in the cloud, except it all runs locally!


I knew upgrading my iPhone on launch day would pay off.

Sorry “Apple doesn’t innovate so I only upgrade my phone every six years” peeps.

@Migueldeicaza It’s going to be interesting to see how much they can boost the Neural Processor’s performance on a yearly basis. My iPad Pro 11” has 2 TB of storage, but it’s the M1 model. It will stay really fast for multiple years to come … except for the NP, which will be “barely usable” with CoreML five years from now.