10 comments

  • deeponey 3 hours ago

    Are they going to try to make a "we're just a platform, don't shoot the messenger" Section 230 argument (not sure what the equivalent in Canada is) for the AI overviews they generate? Seems like a bridge too far. Really hopeful the courts will side with Ashley MacIsaac here and set some sane precedent.

    • cactacea 3 hours ago

      There isn't one.

      • winocm 2 hours ago

        "AI can make mistakes, so double-check responses."

  • 1attice 3 hours ago

    This is especially troubling from a sociological perspective, as it points to how AIs turn malice into false history.

    Ashley MacIsaac made waves in the nineties for being openly gay, and he paid his dues for years. I vividly recall being around a barroom table in the late nineties, listening to this specific slander. We knew it was slander though, because there was no evidence. We had no machine yet to confabulate it.

    This is what we anglos do to our men who prefer men. We did it with Wilde, and with Turing, and we did it with MacIsaac, and we are doing it even harder in 2026 than in 1996, because what we called freedom is now called "woke", and what was called dictatorship is now called "freedom".

    And you're next, dear reader.

  • chrisjj 4 hours ago

    > Google should not have lesser liability because the defamatory statements were published by software that Google created and controls.

    Therein lies the rub. Google does not control what its parrot spouts. No-one does.

    • mft_ 21 minutes ago

      If Anthropic can implement a regular expression to monitor for user frustration, Google have certainly got the chops to have some sort of heuristic to check for strongly negative statements.
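
      A minimal sketch of what such a heuristic could look like, purely as an assumption on my part (the pattern list, the threshold, and the flag_for_review name are all illustrative, not anything Google or Anthropic actually ships):

          import re

          # Hypothetical patterns for claims that would be damaging if false.
          NEGATIVE_PATTERNS = [
              re.compile(r"\b(convicted|charged|arrested)\b", re.IGNORECASE),
              re.compile(r"\b(fraud|assault|abuse|predator)\b", re.IGNORECASE),
          ]

          def flag_for_review(text: str, threshold: int = 1) -> bool:
              # Hold the output for human review once enough patterns match.
              hits = sum(len(p.findall(text)) for p in NEGATIVE_PATTERNS)
              return hits >= threshold

          # Gate a generated overview before it is shown to users.
          overview = "The fiddler was charged with assault."
          if flag_for_review(overview):
              print("held for review")  # don't publish unverified negative claims

      A real system would presumably use a classifier rather than keyword regexes, but even a crude gate like this would catch flatly criminal allegations about a named person before they ship.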

    • thrownthatway 3 hours ago

      That’s one perspective.

      It’s wrong.

      But it’s definitely a perspective.

    • BizarroLand 2 hours ago

      Parents have to pay penalties when their underage children burn down a building.

      Companies that get treated with the rights of people should also have the responsibilities of people. Google designed, built, hosted, and promoted their LLM prominently. Logically, it follows that they should be personally and financially responsible for any harms their LLM causes.

      • chrisjj 25 minutes ago

        Sure they should have the responsibility. Even more so given they don't have control.

    • grouchomarx 3 hours ago

      ah well, no worries then