robots.txt
A standard text file that instructs web crawlers which pages they may and may not access. In GEO, robots.txt is used to manage AI crawler access: allowing crawlers that drive visibility in AI answers while blocking those that harvest content for model training without authorization. It is now complemented by llms.txt, which provides AI-specific guidance.
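A minimal sketch of this split might look like the following. The user-agent strings shown (GPTBot, CCBot) are examples of crawlers commonly associated with AI training data collection; the exact names and behaviors vary by vendor, so consult each crawler's published documentation before relying on them.

```
# Block crawlers associated with AI training data collection (example user agents)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Allow all other crawlers, including search indexing
User-agent: *
Allow: /
```

Note that robots.txt is advisory: compliant crawlers honor it, but it does not technically prevent access by crawlers that ignore the standard.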