Thank you to everyone who attended our talk on writing #AI models in pure #Java 🥳 !!!
The slides Lize Raes and I used at #JFall are available below, and you can always connect with us to ask more questions about what we presented 😇!!!
https://www.slideshare.net/slideshow/writing-gpu-ready-ai-models-in-pure-java-with-babylon-08de/284064587
Writing GPU-Ready AI Models in Pure Java with Babylon
Project Babylon introduces the experimental Code Reflection technology that lets you define machine learning logic in plain Java code, without needing Python or external model files. It then uses the Foreign Function & Memory (FFM) API to connect your code to native runtimes such as ONNX Runtime for fast inference, including GPU acceleration. Furthermore, the Heterogeneous Accelerator Toolkit (HAT) provides a developer-facing programming model for writing and composing compute kernels, which can be applied more broadly, allowing Java libraries to seamlessly harness GPU power for high-performance computing tasks. Presented at JFall 2025 by Ana-Maria Mihalceanu and Lize Raes.
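For a taste of the FFM side on its own (no Babylon or ONNX Runtime required), here is a minimal sketch of calling a native C function from plain Java on JDK 22+. The C library's `strlen` stands in for a real native runtime entry point; the same `Linker`/`MethodHandle` machinery is what binds Java code to libraries like ONNX Runtime:

```java
import java.lang.foreign.Arena;
import java.lang.foreign.FunctionDescriptor;
import java.lang.foreign.Linker;
import java.lang.foreign.MemorySegment;
import java.lang.foreign.ValueLayout;
import java.lang.invoke.MethodHandle;

public class FfmStrlen {
    public static void main(String[] args) throws Throwable {
        Linker linker = Linker.nativeLinker();

        // Look up strlen in the default (C standard) library and
        // describe its signature: size_t strlen(const char *s)
        MethodHandle strlen = linker.downcallHandle(
                linker.defaultLookup().find("strlen").orElseThrow(),
                FunctionDescriptor.of(ValueLayout.JAVA_LONG, ValueLayout.ADDRESS));

        // A confined arena frees the off-heap string when the block exits
        try (Arena arena = Arena.ofConfined()) {
            MemorySegment cString = arena.allocateFrom("hello");
            long len = (long) strlen.invokeExact(cString);
            System.out.println(len); // prints 5
        }
    }
}
```

The same pattern, scaled up, is how a pure-Java model can hand tensors to a native inference engine without writing any JNI glue code.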