Token
In the context of AI language models, a token is the basic unit of text a model processes — typically a word, subword, or individual character, depending on the tokenizer. LLM pricing, context windows, and rate limits are all measured in tokens, so understanding tokenization is essential for controlling costs and staying within a model's context limit when building AI-powered applications.
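A minimal sketch of the idea. The `rough_tokenize` and `estimate_tokens` functions below are hypothetical illustrations, not a real model tokenizer: the first splits on words and punctuation (real tokenizers such as BPE split words into subwords instead), and the second applies the common rule of thumb of roughly one token per four characters of English text.

```python
import re

def rough_tokenize(text):
    # Naive illustration: split on word runs and punctuation.
    # Real LLM tokenizers (e.g. BPE) break words into subwords instead.
    return re.findall(r"\w+|[^\w\s]", text)

def estimate_tokens(text):
    # Common heuristic for English with BPE-style tokenizers:
    # roughly one token per ~4 characters.
    return max(1, len(text) // 4)

sample = "Tokenization drives LLM pricing."
print(rough_tokenize(sample))   # ['Tokenization', 'drives', 'LLM', 'pricing', '.']
print(estimate_tokens(sample))  # 8
```

Actual token counts vary by model and tokenizer, so for billing or context-limit purposes use the model provider's own tokenizer rather than a heuristic like this.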
#ai