To an AI, your words aren’t words. They’re numbers.
In today's installment of our LLM Series, we're looking at token embeddings. When you write a sentence, the model breaks it into tokens (often pieces of words) and converts each one into a vector of numbers. This mapping is called an embedding.
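Here's a minimal sketch of that lookup, using a made-up six-word vocabulary and random vectors (real models learn subword vocabularies with tokenizers like BPE, and the vectors are learned during training, not random):

```python
import random

random.seed(0)

# Hypothetical toy vocabulary; real tokenizers learn tens of thousands
# of subword pieces rather than whole words.
vocab = {"the": 0, "king": 1, "queen": 2, "wore": 3, "a": 4, "crown": 5}
EMBED_DIM = 4  # real models use hundreds or thousands of dimensions

# Embedding table: one vector per token id. Initialized randomly here;
# in a real model these values are adjusted during training.
embedding_table = [[random.uniform(-1, 1) for _ in range(EMBED_DIM)]
                   for _ in vocab]

def embed(sentence: str) -> list[list[float]]:
    """Map each token in the sentence to its embedding vector."""
    token_ids = [vocab[word] for word in sentence.lower().split()]
    return [embedding_table[i] for i in token_ids]

vectors = embed("the king wore a crown")
print(len(vectors), len(vectors[0]))  # 5 tokens, each a 4-dim vector
```

The key idea: the text itself never enters the model. Only this table of numbers does.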
These numbers aren’t random. They capture meaning. In the embedding space, “king” and “queen” sit close together, and the difference between their vectors roughly encodes gender. Words used in similar contexts cluster in similar regions.
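The classic illustration is vector arithmetic: "king" − "man" + "woman" lands near "queen". The toy 2-D vectors below are hand-picked to make the geometry visible, not learned from data:

```python
import math

# Hand-picked toy vectors (not learned), chosen so that
# dimension 0 ~ "royalty" and dimension 1 ~ "gender".
king  = [0.9,  0.7]
queen = [0.9, -0.7]
man   = [0.1,  0.7]
woman = [0.1, -0.7]

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 means the vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Subtract "man", add "woman": the result points where "queen" points.
result = [k - m + w for k, m, w in zip(king, man, woman)]
print(cosine(result, queen))   # very close to 1.0
print(cosine(result, king))    # lower: gender dimension now differs
```

In learned embeddings (e.g. word2vec or the input layer of an LLM) the same effect shows up across hundreds of dimensions, though less cleanly than in this hand-built example.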
This is the bridge between language and math. It’s how LLMs move from raw text to context, patterns, and relationships, enabling them to generate human-like responses.

