yzma 1.11 is out, with more of what you need:
- Support for latest llama.cpp (>97% of functions covered)
- ROCm backend + benchmarks
- @arduino Uno Q install info
Go get it right now!
https://github.com/hybridgroup/yzma
#golang #llamacpp #yzma #arduino #unoq

GitHub - hybridgroup/yzma: Go with your own intelligence - Go applications that directly integrate llama.cpp for local inference using hardware acceleration.