Shameless Guesses, Not Hallucinations

(astralcodexten.com)

2 points | by toomuchtodo 6 hours ago

1 comment

  • nickpsecurity 6 hours ago

    They don't make up things for the reasons we do. They usually don't have a sense of self, introspection, etc., driving their actions. They have no memory analogous to the hippocampus, outside some academic prototypes. They don't have the combination of innate and observational knowledge of real-world things that grounds our language interpretation.

    They're worse than us because the brain God designed is a far better architecture. I also hypothesize it has a hallucination mitigator, maybe several. Authors need to stop repeating the false claim that LLMs are like us.