>_/GLOSSARY/token
GLOSSARY // DEFINITION
CONCEPT

Token

The basic unit of text processing in large language models (LLMs). In English, one token corresponds to roughly 0.75 words. Tokens matter for GEO because AI search engines have limited context windows measured in tokens. Understanding tokenization helps optimize content structure: short, information-dense paragraphs with key claims front-loaded perform better in AI citations.
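The 0.75 words-per-token rule of thumb can be turned into a quick budget check. A minimal sketch (the function name and heuristic are illustrative, not from any specific tokenizer; real tokenizers such as BPE-based ones will give different counts):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate for English text using the
    ~0.75 words-per-token heuristic (1 word ≈ 1.33 tokens).
    This is an approximation, not an actual tokenizer."""
    words = len(text.split())
    return round(words / 0.75)

def fits_context(text: str, context_window: int) -> bool:
    """Check whether text likely fits a given token budget."""
    return estimate_tokens(text) <= context_window
```

For example, a 300-word paragraph estimates to about 400 tokens, which is why trimming filler words directly frees context budget for substantive claims.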
