Applies to SUSE AI 1.0
1 What causes AI hallucinations?
The most common causes of hallucinations are:
Ambiguous prompts. Vague queries can lead to random or inaccurate answers.
Lack of clear context. When the language model lacks context, it can fabricate answers.
Long generation length. The longer the generated response, the higher the chance of hallucinations.
No retrieval-augmented process. LLMs without access to external sources, such as databases or search engines, can produce errors when asked to generate specific information.
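The last point can be sketched in code. The snippet below is a minimal, illustrative retrieval-augmented prompting loop, not part of SUSE AI: the `retrieve` and `build_prompt` helpers and the naive keyword-overlap ranking are assumptions chosen to keep the example self-contained, where a real deployment would use a vector store and an embedding model. The idea is the same: fetch relevant passages first, then ground the prompt in them so the model does not have to invent specifics.

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query.

    A production system would use embeddings and a vector database;
    keyword overlap is used here only to keep the sketch runnable.
    """
    query_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved context so the model answers from sources."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query, documents))
    return (
        "Answer using only the context below. "
        "If the context is insufficient, say so.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )


docs = [
    "Retrieval-augmented generation grounds model answers in retrieved documents.",
    "Hallucinations are plausible but incorrect model outputs.",
    "Short, specific prompts reduce ambiguity.",
]
print(build_prompt("What is retrieval-augmented generation?", docs))
```

Note the instruction telling the model to admit when the context is insufficient: without that escape hatch, a grounded prompt can still tempt the model into fabricating an answer.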