6 comments

  • stavros a day ago

    How is it still news that LLMs hallucinate?

  • stuaxo a day ago

    More importantly, who put it in the group?

    • helsinkiandrew a day ago

      > A moderator for the group said that the bot was automatically added by Meta and that “we are most certainly removing it from here.” Meta did not immediately respond to a request for comment.

  • aaron695 a day ago

    Whatever. AI encourages eating dangerous mushrooms; there's also the book it wrote that poisoned people: https://news.ycombinator.com/item?id=41269514

    Logically, it's doing this in every conversation.

    Millions of incorrect facts being pushed to people. Except it has billions of disconnected person-hours of data to get wrong.

    It's not stuff like "gypsum is good for clay soil" or "weed tea helps plants grow". Those are old wives' tales and religion-like culture: still structured, and as a gardener you can process them.

    With AI there is no culture; it'll tell you something wrong that no one else will ever hear. It has no meaning. You're not in the plant-when-there's-a-full-moon club. It'll be a "s'more Pop-Tarts help tomatoes grow" club of one.

    It also gives you the balls to just go do stuff. So, tomato, tomahto. It's no doubt convinced a few people to get cancer checks. But less nihilism would be nice.