WeeBytes

What Is Hallucination in AI?

Hallucination is when an AI confidently states something completely false: fake statistics, made-up sources, wrong dates. It happens because AI predicts likely text, not verified facts. Knowing this keeps you from trusting AI blindly and reminds you to verify outputs for important work.

AI doesn't 'know' facts; it predicts likely text. Sometimes it generates plausible-sounding but entirely fake information: made-up research papers, non-existent legal cases, wrong dates and statistics. This is called hallucination. Never trust AI output for critical facts without verification, and remember that the more obscure the topic, the higher the hallucination risk.
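The "predicts likely text" idea can be sketched with a toy next-word table. This is a minimal illustration, not how a real model is built: the words, the probabilities, and the `generate` helper are all invented here, and a real language model learns billions of such statistics from text with no notion of truth attached to any of them.

```python
# Toy next-word table: each word maps to candidate next words with probabilities.
# Every word and number here is invented purely for illustration.
next_word_probs = {
    "The": {"study": 0.6, "paper": 0.4},
    "study": {"was": 0.7, "found": 0.3},
    "was": {"published": 0.8, "cited": 0.2},
    "published": {"in": 1.0},
    "in": {"2019": 0.5, "2021": 0.5},  # both years look equally "likely"; neither is checked
}

def generate(start: str, steps: int) -> str:
    """Extend the text by repeatedly picking the most probable next word."""
    words = [start]
    for _ in range(steps):
        options = next_word_probs.get(words[-1])
        if not options:
            break
        # The model asserts whatever is statistically plausible, not what is true.
        words.append(max(options, key=options.get))
    return " ".join(words)

print(generate("The", 5))  # prints "The study was published in 2019"
```

The toy model fluently commits to a publication year because "2019" is a statistically plausible continuation, not because any source was consulted. Scaled up to a full language model, that same dynamic is what produces confident fabrications.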

**Key takeaway:** AI can be confidently wrong. Always verify important facts.

