Hallucinations

Large language models (LLMs), such as ChatGPT, sometimes produce information that sounds plausible but is in fact nonsensical or entirely false. This is called a hallucination.