Hallucination
When an AI system generates false information and presents it as fact. In GEO, hallucinations are both a vulnerability (an AI engine invents fake citations) and an attack vector (black hat operators craft content designed to trigger specific hallucinations that benefit their brand). Ghost citations exploit this tendency.