For the last decade, "smart" meant "connected." We built better networking libraries and faster serialization, but we were still fighting physics.

The rise of capable silicon has rewritten the rules. The default is no longer "call the API." The default is "run it locally."

#EdgeAI #CoreML #AppleSilicon #iOSDev #MachineLearning

Why can't you train an LLM on your iPhone?
It's not just the speed. It's the Memory Physics.
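For a rough sense of that wall, a back-of-envelope sketch in Python (the byte counts are common rules of thumb, assuming fp16 inference weights and fp32 Adam training state; activations would add even more on top):

```python
# Back-of-envelope memory for a 7B-parameter model (illustrative numbers).
params = 7e9

# Inference: just the weights, say fp16 (2 bytes/param).
inference_gb = params * 2 / 1e9

# Training with Adam in fp32: weights (4) + gradients (4) + optimizer
# moments m and v (4 + 4) = 16 bytes/param -- before any activations.
training_gb = params * (4 + 4 + 4 + 4) / 1e9

print(f"inference ~{inference_gb:.0f} GB, training ~{training_gb:.0f} GB")
# -> inference ~14 GB, training ~112 GB
```

Fourteen gigabytes is already a stretch on a phone; a hundred-plus is not a software problem, it's a memory-physics problem.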

Ep 6 of Sandboxed is out: "Training vs. Inference."
We break down the wall between "Learning" and "Doing."

https://logicbridge.dev/sandboxed/6

#OnDeviceAI #CoreML #iOSDev #MachineLearning #AppleSilicon

🎉 Wow, a #CLI for #Apple Core ML models—because GUIs are for quitters, right? 🤔 Now you can enjoy the thrilling experience of typing commands while praying for no "syntax error" 🙈. Finally, a way to make #AI model work feel like you're #hacking the mainframe in a 90s movie! 💻🔍
https://github.com/schappim/coreml-cli #CoreML #SyntaxError #HackerNews #ngated
GitHub - schappim/coreml-cli: A native command-line interface for working with Apple Core ML models on macOS

A Swift app using CoreML (2–6× faster, lower battery use) that integrates YOLO12n, emotion recognition, a screen guard that triggers when two people are detected, and an attention-tracking Pomodoro timer built on Vision. Everything runs on-device; no data ever leaves it. 🚀📱

#AI #Swift #CoreML #MacVision #Tech #Security #Pomodoro #VisionFramework #EmotionRecognition

https://www.reddit.com/r/SideProject/comments/1qcjnzo/mac_vision_tools_a_menu_bar_app_for_fun_tasks/

🚀 Mac Vision Tools: a macOS menu bar app using CoreML models that run on the Neural Engine. Features: object detection (YOLO12n), screen lock when two people are detected (Privacy Guard), facial emotion recognition, and an attention-tracking Pomodoro timer. Fully local processing, low battery use. #MacVisionTools #AI #Swift #CoreML #NeuralEngine #Privacy #Pomodoro #Tech

https://www.reddit.com/r/SideProject/comments/1qcjnzo/mac_vision_tools_a_menu_bar_app_for_fun_tasks/

🎧 Most Core ML “failures” are task-mismatch failures.

Classification = identity (what)
Detection = location (what + where)
Segmentation = pixel masks (which pixels)

The simplest task that satisfies your UI is usually the best architecture.
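A sketch of what those three output contracts look like as plain arrays (shapes and values are illustrative, not tied to any specific model):

```python
import numpy as np

# Classification: one score per class ("what").
class_probs = np.array([0.05, 0.85, 0.10])        # shape: (num_classes,)

# Detection: per object, a class plus a bounding box ("what + where").
detections = [("cat", (0.1, 0.2, 0.4, 0.5)),      # (label, normalized box)
              ("dog", (0.6, 0.1, 0.3, 0.3))]

# Segmentation: a class id for every pixel ("which pixels").
seg_mask = np.zeros((224, 224), dtype=np.uint8)   # shape: (height, width)
seg_mask[80:140, 60:120] = 1                      # pixels assigned to class 1

print(class_probs.argmax(), len(detections), seg_mask.shape)
```

If your UI only needs the top label, classification's single vector is all you should pay for; every contract to the right of it costs more compute per frame.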

Listen: https://logicbridge.dev/sandboxed/4

#iOSDev #CoreML #OnDeviceAI #Vision

Your Core ML model isn’t a black box.
It’s adjustable logic.

And if your accuracy is *suspiciously* high… it might be cheating.

In Episode 3 of Sandboxed, we translate ML jargon into an iOS-developer mental model:
weights + biases as knobs you tune, loss as a measurable error signal, and training as a feedback loop that feels a lot like build-and-test.
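That loop can be sketched in a few lines of Python: a toy one-knob model fit by gradient descent, not a Core ML API (names and numbers are purely illustrative):

```python
# Toy training loop: fit y = w * x to data generated with w_true = 3.
# Each iteration is forward pass -> loss -> nudge the weight.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]

w = 0.0                       # the "knob" we tune
lr = 0.02                     # how hard each nudge is

for _ in range(200):
    # forward pass: predictions with the current weight
    preds = [w * x for x in xs]
    # loss: mean squared error, the measurable error signal
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / len(xs)
    # gradient of the loss w.r.t. w, then nudge w downhill
    grad = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / len(xs)
    w -= lr * grad

print(round(w, 3))  # -> 3.0
```

Swap "one knob" for millions of weights and this is the same build-and-test loop, just automated.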

https://logicbridge.dev/sandboxed/3

#iOSDevelopment #CoreML #OnDeviceAI #MachineLearning #Sandboxed

🎧 99% accuracy? Your model might be cheating.

Episode 03 explains “learning” as: forward pass → loss → nudge weights.
Then why models fail in real apps: overfitting + data leakage.

What’s the sneakiest shortcut you’ve seen in data?

Listen now: https://logicbridge.dev/sandboxed/3

#iOSDev #CoreML

📉 So, it turns out #ONNX and #CoreML have a sneaky habit of downgrading your models to #FP16 without so much as a polite cough. 🤦‍♂️ But don't worry, there's a hero's journey through a forest of matrices and formats to fix this *not-a-bug*. Design choices, amirite? 😂
https://ym2132.github.io/ONNX_MLProgram_NN_exploration #ModelDowngrade #DataScience #HackerNews #ngated
ONNX Runtime & CoreML May Silently Convert Your Model to FP16 (And How to Stop It)

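For a quick sense of why a silent fp16 downgrade matters: half precision keeps only about three significant decimal digits and overflows above 65504. A small numpy check (illustrative only; the linked post covers the actual conversion settings):

```python
import numpy as np

x = np.float32(0.1234567)        # fp32 keeps ~7 significant digits
y = np.float16(x)                # fp16 keeps ~3: the tail is gone

print(float(x))                  # ~0.1234567
print(float(y))                  # noticeably rounded

# fp16 also overflows early: the largest finite value is 65504.
print(np.finfo(np.float16).max)  # 65504.0
print(float(np.float16(70000)))  # inf
```

Harmless for many vision models, but quietly fatal for anything that relies on large activations or fine-grained numeric outputs.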