GENAIWIKI

hallucination

Hallucination is a model's confident generation of content that is not grounded in the provided context or in verifiable facts.

Expanded definition

Hallucination shows up routinely when teams ship LLM features, especially in retrieval-augmented generation, summarization, and question answering. Practically, it shapes how you design prompts (e.g., instructing the model to answer only from supplied context), how you evaluate quality (groundedness and factuality checks alongside fluency metrics), and how you reason about failure modes (plausible-sounding but fabricated names, numbers, and citations). Teams should document how hallucination manifests in their stack, across data handling, evaluation, and runtime guardrails, and revisit those assumptions whenever the underlying model is updated.
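As a minimal sketch of the kind of runtime guardrail mentioned above, the following checks whether each sentence of a generated answer is lexically supported by the retrieved context, flagging sentences whose content words rarely appear there. The function names, threshold, and word-length cutoff are illustrative assumptions, not part of any standard API; real systems typically use entailment models or citation checks rather than word overlap.

```python
import re


def support_score(sentence: str, context: str) -> float:
    """Fraction of a sentence's content words (4+ letters) found in the context.

    A crude lexical proxy for groundedness; illustrative only.
    """
    words = set(re.findall(r"[a-z]{4,}", sentence.lower()))
    if not words:
        return 1.0  # nothing to check, treat as supported
    ctx_words = set(re.findall(r"[a-z]{4,}", context.lower()))
    return len(words & ctx_words) / len(words)


def flag_unsupported(answer: str, context: str, threshold: float = 0.5) -> list[str]:
    """Return answer sentences whose lexical support falls below the threshold."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", answer) if s.strip()]
    return [s for s in sentences if support_score(s, context) < threshold]


context = "The Eiffel Tower is in Paris. It was completed in 1889."
answer = (
    "The Eiffel Tower is located in Paris. "
    "It weighs exactly twelve million tonnes."
)
# The fabricated weight claim shares no content words with the context,
# so it is flagged as potentially hallucinated.
flagged = flag_unsupported(answer, context)
```

A check this simple misses paraphrases and passes copied-but-wrong text, which is why production guardrails usually layer it with model-based entailment scoring; it is cheap enough, though, to run on every response as a first filter.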
