Token
The basic unit of text processing in LLMs. As a rule of thumb, 1 token ≈ 0.75 English words (about 1.33 tokens per word). Tokens matter for GEO because AI search engines operate within limited context windows measured in tokens, so verbose content risks being truncated or skipped. Understanding tokenization helps you structure content accordingly: shorter, information-dense paragraphs with key claims front-loaded are more likely to be retrieved and cited by AI engines.
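As a sketch, the 0.75-words-per-token rule of thumb can be turned into a quick budget check before publishing. The function names and the 8,192-token window below are illustrative assumptions, not any engine's documented limit; real token counts depend on the model's tokenizer.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: 1 token ~= 0.75 English words,
    i.e. tokens ~= words / 0.75. Illustrative only; actual counts
    vary by tokenizer and language."""
    words = len(text.split())
    return round(words / 0.75)

def fits_context(text: str, context_window: int = 8192) -> bool:
    """Check whether content fits a hypothetical context window
    (the 8192 default is an assumption for this sketch)."""
    return estimate_tokens(text) <= context_window

# A 3,000-word article estimates to ~4,000 tokens.
article = "word " * 3000
print(estimate_tokens(article))   # 4000
print(fits_context(article))      # True
```

For precise counts against a specific model, a real tokenizer library should be used instead of this heuristic.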