Google patches a Chrome WebGPU zero-day (CVE-2026-5281) that is already being exploited

Google has released a Chrome update to fix the zero-day CVE-2026-5281, a use-after-free flaw in Dawn/WebGPU that is being actively exploited. The recommendation is to update…

Una Al Día
Yo, I heard you like #WebGPU, so I put some #WebGPU in your #WebGPU.

New update for the slides of my talk "Run LLMs Locally": WebGPU

Now models can run completely inside the browser using Transformers.js, Vulkan and WebGPU (slower than llama.cpp, but already usable).

https://codeberg.org/thbley/talks/raw/branch/main/Run_LLMs_Locally_2026_ThomasBley.pdf

#ai #llm #llamacpp #stablediffusion #gptoss #qwen3 #glm #localai #webgpu
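As a rough illustration of the in-browser setup the slides describe, Transformers.js can target WebGPU through its `device` option. This is a hedged sketch: the model name, the `dtype` value, and the WASM fallback logic are my assumptions, not details from the talk.

```javascript
// Pick the execution backend: prefer WebGPU when the browser exposes
// navigator.gpu, fall back to the WASM backend otherwise.
function pickDevice(nav) {
  return nav && 'gpu' in nav ? 'webgpu' : 'wasm';
}

// Load a text-generation pipeline with Transformers.js.
// Model name and dtype are illustrative choices.
async function loadGenerator() {
  const { pipeline } = await import('@huggingface/transformers');
  return pipeline('text-generation', 'onnx-community/Qwen2.5-0.5B-Instruct', {
    device: pickDevice(globalThis.navigator),
    dtype: 'q4', // quantized weights keep the download small
  });
}
```

In a page served over HTTPS, `loadGenerator()` downloads the model once, caches it, and then runs inference entirely client-side.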

don't expect LLM-generated code to be correct ↓

Finally, chair-o-plane safety testing can be done from the comfort of your browser. Thanks, #WebGPU!

I forgot to add a link to the tool I mentioned yesterday (in the post above):

https://ghadeeras.github.io/pages/sketch

Currently, it only works in Chrome because it uses #WebGPU. The controls are rather unconventional and unintuitive because I am too lazy to implement proper ones. Click on the joystick icon to learn about them. I think that a tablet, preferably with a stylus, would be the friendliest way to use it. It's still very much a work in progress though. More to come.

#splines


In what year will Firefox and Google Chrome support WebGPU on Linux?

#Linux #AMD #Mesa #WebGPU #Firefox #GoogleChrome

AA (@measure_plan)

A tweet sharing a demo and code for a Rasengan blast effect built with three.js, tsl-webgpu, MediaPipe hand tracking, and Kimi K2.5. It's a creative interactive showcase combining AI/computer vision with WebGPU, and an implementation worth a look for developers.

https://x.com/measure_plan/status/2038660367014891687

#threejs #webgpu #mediapipe #computervision #aidemo

AA (@measure_plan) on X

made a rasengan blast effect with threejs, tsl-webgpu, mediapipe hand tracking, kimi k2.5 live demo and code below 🌀


I can't believe I spent close to 4 weeks developing a seemingly trivial tool.

I wanted a tool with which I could 1) write/draw on a tablet, using a stylus, then replay/animate the act of drawing/handwriting. I also wanted it to 2) produce smooth strokes that hide the jagged lines caused by an unsteady hand and a slippery tablet surface.

The former goal was the easy one (although it is not quite complete yet). The latter, however, turned out to be quite challenging. I think I rewrote it a gazillion times before I got what I thought was a decent result.

I hope it was not a waste of time and that I subconsciously learned something from this exercise.

I used #webgpu to compute and render the "strokes". Not sure if that is overkill, but I thought it was better to delegate such computations to the GPU to keep the UI responsive.
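The stroke-smoothing idea in this post can be illustrated with one common technique (my own illustration, not the author's actual algorithm): Chaikin corner-cutting, which repeatedly replaces each jagged segment of the sampled `[x, y]` points with two points at 1/4 and 3/4 along it. On the GPU, this per-point math is the kind of work that would move into a WGSL compute shader.

```javascript
// Smooth an open polyline of [x, y] points with Chaikin corner-cutting.
// Each iteration roughly doubles the point count and rounds off corners;
// the first and last points are kept so the stroke's endpoints stay put.
function chaikin(points, iterations = 2) {
  let pts = points;
  for (let k = 0; k < iterations; k++) {
    const out = [pts[0]]; // keep the first point
    for (let i = 0; i < pts.length - 1; i++) {
      const [x0, y0] = pts[i];
      const [x1, y1] = pts[i + 1];
      // replace the segment with points at 1/4 and 3/4 along it
      out.push([0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1]);
      out.push([0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1]);
    }
    out.push(pts[pts.length - 1]); // keep the last point
    pts = out;
  }
  return pts;
}
```

Two or three iterations are usually enough to hide the jitter of an unsteady hand while keeping the overall shape of the handwriting.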

Stacks! Low stacks, high stacks, lots of stacks! Cloth, ropes, bridges, springs! 50,000 block destruction in realtime! The #WebGPU physics engine, based on last year's SIGGRAPH "Real-Time Live!" winner AVBD, is almost here. Follow for updates!

Switching on and off a color gradient in the particle simulation

#webgpu #webdev #wgsl #particles #javascript #computergraphics #graphicsprogramming #cgi #screenshotSaturday