Definition

Token

A token is the basic unit of text that AI models process; one token is roughly three-quarters of a word in English. "Understanding" might be split into two tokens, while "AI" and "the" are each a single token. Tokens matter for two practical reasons: they determine the cost of using AI APIs (most are priced per token), and they determine how much content fits in a context window.
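The pricing logic above can be sketched in a few lines. This is a minimal illustration, not a real tokenizer: the 4-characters-per-token ratio is a common rule of thumb for English text, and the price used here is an invented example, not any provider's actual rate.

```python
CHARS_PER_TOKEN = 4  # rough rule of thumb for English; real tokenizers vary

def estimate_tokens(text: str) -> int:
    """Estimate the token count of a piece of English text."""
    return max(1, round(len(text) / CHARS_PER_TOKEN))

def estimate_cost(text: str, price_per_million_tokens: float) -> float:
    """Estimate API cost at a given per-million-token price (illustrative)."""
    return estimate_tokens(text) * price_per_million_tokens / 1_000_000

document = "word " * 10_000          # stand-in for a ~10,000-word document
print(estimate_tokens(document))     # 12500 tokens by this heuristic
```

A real application would use the model provider's own tokenizer to get exact counts, but the heuristic is close enough for back-of-the-envelope budgeting.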

For most business users, tokens are an invisible implementation detail — but understanding the concept helps explain why very long documents sometimes need to be broken into sections.