"Hallucinations and misunderstandings" is in and of itself a misunderstanding of how LLMs work. They aren't factbots. They don't "know" things. They predict what text sounds like a plausible sentence. That's ALL they do. If they ever tell you the tr...
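To make the point concrete, here's a toy sketch (a hypothetical bigram model over a made-up corpus, nothing like a real LLM in scale) of what "predicting plausible text" means: it just emits whatever word most often followed the previous one. There's no truth-checking step anywhere, only frequency.

```python
from collections import defaultdict, Counter

# Made-up corpus containing one true and one false statement.
corpus = "the capital of france is paris the capital of france is lyon".split()

# Count which word follows which (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    # Pick the most frequent follower: plausibility, not truth.
    return follows[prev].most_common(1)[0][0]

def generate(start, n):
    out = [start]
    for _ in range(n):
        out.append(next_word(out[-1]))
    return " ".join(out)

print(generate("the", 5))
```

Whether this spits out "paris" or "lyon" after "is" depends only on which appeared more often in the training text, not on which is actually the capital. That's the whole mechanism, just scaled up enormously.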