>_/GLOSSARY/hallucination
GLOSSARY // DEFINITION
CONCEPT

Hallucination

When an AI system generates false information and presents it as fact. In GEO (generative engine optimization), hallucinations are both a vulnerability (the AI invents fake citations) and an attack vector (black-hat operators craft content designed to trigger specific hallucinations that benefit their brand). Ghost citations exploit this tendency.

← ALL TERMS