Coffee is Love & Peace! ☕🤎✌🏻
#Jax #TheAmazingDigitalCircus #coffecore

Low key, is the new #tadc episode a commentary on the faults of generative AI?

#theamazingdigitalcircus #caine #pomni #jax #kinger

Many ways to speed up Python

Gea-Suan Lin's BLOG

Here are some honorable mentions for fanart. I'm honestly surprised I had as many as I did! Do you recognize any of these guys?

https://subscribestar.adult/pyperhaylie
https://naughtynewsroom.carrd.co/
#bondage #atlantis #fifthelement #tadc #finalfantasy #aerith #cloud #ponyplay #spanking #barefoot #feet #jax #ragatha #gangle #kida

🌘 Forcing Flash Attention onto a TPU: Learning the Hard Way
➤ Technical challenges and lessons in moving from a GPU implementation to the TPU compiler
https://archerzhang.me/forcing-flash-attention-onto-a-tpu
This post documents the author's experience porting the Flash Attention algorithm from a GPU Triton environment to a TPU. The author initially assumed it would be a straightforward code translation, but found that the TPU's programming model (JAX/XLA) differs fundamentally from the GPU's imperative, instruction-level mindset. On a GPU, developers use Triton to manipulate mutable memory pointers directly; on a TPU, JAX enforces an immutable, functional programming style, and computation must pass through the XLA compiler as a computation graph. The author explains in detail how to use `jax.lax.fori_loop` and `dynamic_update_slice` to handle loops and state updates, ultimately showing that porting across hardware platforms requires a deep understanding of compiler optimization and hardware data-flow mechanisms.
+ This article pinpoints JAX's learning curve exactly. Many people mistakenly …
#AI #TPU #JAX #programming #MLengineering
Forcing Flash Attention onto a TPU and Learning the Hard Way · Archer Zhang

This is the fifth post in a series on LLM internals. Part 1 covered attention, Part 2 covered generation, Part 3 covered the Flash Attention algorithm, Part ...
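The summary above mentions `jax.lax.fori_loop` and `dynamic_update_slice` as JAX's functional replacements for mutable loop-and-write code. As a minimal sketch of that pattern (my own toy example, not code from the linked post): computing cumulative row sums without ever mutating an array in place.

```python
import jax
import jax.numpy as jnp

def prefix_rows(x):
    """Cumulative row sums of a 2-D array, written in JAX's functional style.

    Instead of writing into a buffer (as Triton code would), each loop
    iteration returns a *new* array produced by dynamic_update_slice;
    XLA is then free to optimize the updates into in-place writes.
    """
    n, d = x.shape

    def body(i, out):
        # Read row i-1 of the accumulator and row i of the input
        # via functional dynamic slices (no pointer arithmetic).
        prev = jax.lax.dynamic_slice(out, (i - 1, 0), (1, d))
        new = prev + jax.lax.dynamic_slice(x, (i, 0), (1, d))
        # Return a fresh array with row i replaced.
        return jax.lax.dynamic_update_slice(out, new, (i, 0))

    # Seed the accumulator: row 0 is just x's first row.
    out0 = jax.lax.dynamic_update_slice(jnp.zeros_like(x), x[:1], (0, 0))
    # fori_loop threads the immutable accumulator through all iterations
    # and compiles to a single XLA while-loop.
    return jax.lax.fori_loop(1, n, body, out0)
```

The same loop-carried-state structure (accumulators threaded through `fori_loop`, block updates via `dynamic_update_slice`) is what a Flash Attention inner loop over key/value blocks looks like in JAX, in contrast to Triton's mutable-pointer style.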

As the start of my spree of IWD-inspired #StupidAccordionTricks cover songs this month, I kick it off with #Jax's #VictoriasSecret song. (Why? Epstein, sadly.)

https://www.youtube.com/watch?v=vQG1X2ca9ok

Victoria's Secret (Jax, 2022) ... played on the accordion!

YouTube