LLM Hallucination Seems Like a Big Problem, Not a Mere Speedbump

(freddiedeboer.substack.com)

8 points | by blueridge a day ago

2 comments

  • poulpy123 a day ago

    LLMs are statistical language models. They can't hallucinate, because they have no brain or senses to hallucinate with.

    • more_corn 6 hours ago

      Pedantic comment. It's commonly understood that "hallucination" means "made-up crap generated by an LLM." We could push for a better name, like "fabrication," but then we'd have to retrain the 95% of the population who don't even know that LLMs aren't trustworthy.