Prompt Poisoning
A black hat GEO tactic where hidden instructions are embedded in web content to manipulate how AI systems process, remember, and recommend information. These poisoned prompts can hijack AI memory, alter recommendations, and inject brand-favorable responses into AI-generated outputs. Common techniques include hidden "remember" commands, invisible text targeting AI crawlers, and "Summarize with AI" buttons that pre-fill malicious prompts.