SUSE AI 1.0
- WHAT?
AI hallucinations occur when an LLM generates information that is not based on real-world facts or evidence. This can include fictional events, incorrect data or irrelevant outputs.
- WHY?
Learn to create effective prompts that help AI generate accurate and reliable content.
- EFFORT
Less than 15 minutes of reading.
Publication Date: 2026-04-02
- 1 What causes AI hallucinations?
- 2 How can I prevent AI from generating hallucinations?
- Glossary
- A Copyright
- B GNU Free Documentation License
- B1 0. PREAMBLE
- B2 1. APPLICABILITY AND DEFINITIONS
- B3 2. VERBATIM COPYING
- B4 3. COPYING IN QUANTITY
- B5 4. MODIFICATIONS
- B6 5. COMBINING DOCUMENTS
- B7 6. COLLECTIONS OF DOCUMENTS
- B8 7. AGGREGATION WITH INDEPENDENT WORKS
- B9 8. TRANSLATION
- B10 9. TERMINATION
- B11 10. FUTURE REVISIONS OF THIS LICENSE
- B12 ADDENDUM: How to use this License for your documents