🤡 Ah, the mystical GPT-3, cramming billions of ideas into a measly 12,288 dimensions, like a clown car of #concepts. 🚗🌌 Yet again, the internet is wowed by mathematical gibberish, while the rest of us are left wondering if this is just AI's way of #trolling humanity. 👀✨
https://nickyoder.com/johnson-lindenstrauss/ #GPT3 #ClownCar #AI #Mathematics #HackerNews #ngated
Beyond Orthogonality: How Language Models Pack Billions of Concepts into 12,000 Dimensions
In a recent 3Blue1Brown video series on transformer models, Grant Sanderson posed a fascinating question: How can a relatively modest embedding space of 12,288 dimensions (GPT-3) accommodate millions of distinct real-world concepts? The answer lies at the intersection of high-dimensional geometry and a remarkable mathematical result known as the Johnson-Lindenstrauss Lemma.
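The geometric intuition is easy to check numerically: in high dimensions, randomly chosen directions are almost orthogonal to one another, which is what lets far more than 12,288 "concept directions" coexist with little interference. A minimal sketch (the dimension matches GPT-3's embedding size; the sample count and seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 12_288  # GPT-3 embedding dimension
n = 1_000   # arbitrary sample of random "concept" directions

# Draw random unit vectors in d dimensions.
v = rng.standard_normal((n, d))
v /= np.linalg.norm(v, axis=1, keepdims=True)

# Pairwise cosine similarities; look at off-diagonal entries.
cos = v @ v.T
off_diag = cos[~np.eye(n, dtype=bool)]

# Random directions concentrate near orthogonality: the typical
# |cosine| scales like 1/sqrt(d), about 0.009 here.
print(f"mean |cos| = {np.abs(off_diag).mean():.4f}")
print(f"max  |cos| = {np.abs(off_diag).max():.4f}")
```

With nearly half a million pairs, even the *largest* cosine similarity stays tiny, which previews the near-orthogonality argument the article develops.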