Hallucinations

From Civitai Wiki

Latest revision as of 07:35, 11 October 2023

Large language models (LLMs) such as ChatGPT sometimes produce information that sounds plausible but is nonsensical or entirely false. This is called a hallucination.