Local LLM App by Ente

https://ente.com/blog/ensu/

Ensu - Ente's Local LLM app

Introducing Ensu, our first step toward a private, personal LLM app that runs on your device and grows with you over time.

Maybe I’m missing it, but the page is really light on technical information. Is this a quantized / distilled model of a larger LLM? Which one? How many parameters? What quantization? What tokens/s can I expect? What are the VRAM requirements? Etc., etc.
I have the same questions. After installing the app, it downloads 2.5 GB of data. I presume this is the model.
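For what it's worth, the download size alone puts a rough bound on the model. A sketch of the back-of-envelope arithmetic, assuming the 2.5 GB is essentially all weights (the quantization levels below are guesses, not anything Ente has confirmed):

```python
# Back-of-envelope: parameter count implied by a given model file size
# at a few common quantization levels. The 2.5 GB figure is the observed
# download; the bits-per-parameter values are assumptions (typical GGUF-style
# quants carry a small overhead for scales, hence 4.5/8.5 rather than 4/8).

GIB = 1024**3

def params_for(size_bytes: float, bits_per_param: float) -> float:
    """Approximate parameter count, ignoring tokenizer/metadata overhead."""
    return size_bytes * 8 / bits_per_param

size = 2.5 * GIB  # observed download size

for label, bits in [("4-bit quant", 4.5), ("8-bit quant", 8.5), ("FP16", 16.0)]:
    print(f"{label}: ~{params_for(size, bits) / 1e9:.1f}B parameters")
```

So a 2.5 GB download is consistent with roughly a 4-5B parameter model at 4-bit quantization, a ~2.5B model at 8-bit, or a ~1.3B model at FP16; which of these it actually is would need confirmation from the app's internals.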