The Prompt() Function: Use the Power of LLMs with SQL

(motherduck.com)

32 points | by sebg 3 hours ago

8 comments

  • delichon 2 hours ago

      FROM hn.hacker_news
      LIMIT 100
    
    "Oops I forgot the limit clause and now owe MotherDuck and OpenAI $93 billion."
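    A hedged sketch of the kind of query at issue (the prompt text and column names here are assumptions, not the article's exact example): prompt() runs once per row, so the LIMIT is the only thing capping the number of paid LLM calls.

    ```sql
    -- Hypothetical sketch: each row scanned is one billed LLM call.
    SELECT prompt('Summarize in one sentence: ' || text) AS summary
    FROM hn.hacker_news
    LIMIT 100;  -- drop this and the whole table goes to the model
    ```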
  • domoritz 3 hours ago

    I love the simplicity of this. Hurray for small models for small tasks.

  • korkybuchek 2 hours ago

    Interesting -- is there any impact from LLM outputs not being deterministic?

    • drdaeman 2 hours ago

      SQL functions can be non-deterministic just fine. E.g. the SQL:2003 grammar defines a DETERMINISTIC | NOT DETERMINISTIC characteristic for CREATE FUNCTION. Or, e.g., PostgreSQL has IMMUTABLE | STABLE | VOLATILE volatility clauses.
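      A minimal sketch of both notations (dialects vary; RAND() stands in for a vendor random function, and the function bodies are illustrative):

      ```sql
      -- SQL:2003 / SQL-PSM style: the characteristic is declared
      -- as part of CREATE FUNCTION.
      CREATE FUNCTION roll_die() RETURNS INTEGER
        NOT DETERMINISTIC
        RETURN 1 + FLOOR(RAND() * 6);

      -- PostgreSQL style: VOLATILE tells the planner the result can
      -- change on every call, so it is never constant-folded or cached.
      CREATE FUNCTION roll_die() RETURNS integer AS $$
        SELECT 1 + floor(random() * 6)::integer;
      $$ LANGUAGE sql VOLATILE;
      ```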

    • xnx 2 hours ago

      Aren't LLM outputs deterministic given the same inputs?

      • simonw an hour ago

        Not at all. Even the ones that provide a "seed" parameter don't generally 100% guarantee you'll get back the same result.

        My understanding is that this is mainly down to how floating point arithmetic works. Any performant LLM will be executing a whole bunch of floating point arithmetic in parallel (usually on a GPU) - and that means that the order in which those operations finish can very slightly affect the result.
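        The order-sensitivity shows up even in scalar SQL: IEEE 754 addition is not associative, so regrouping the same three doubles changes the last bits of the result. (A sketch using PostgreSQL/DuckDB-style casts; the casts matter because some engines treat bare literals as exact decimals.)

        ```sql
        -- Same three numbers, different grouping, different result.
        SELECT (0.1::DOUBLE PRECISION + 0.2::DOUBLE PRECISION)
                 + 0.3::DOUBLE PRECISION AS left_to_right,
               0.1::DOUBLE PRECISION
                 + (0.2::DOUBLE PRECISION + 0.3::DOUBLE PRECISION)
                 AS right_to_left;
        -- left_to_right = 0.6000000000000001, right_to_left = 0.6
        ```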

      • korkybuchek 2 hours ago

        They are not necessarily deterministic, especially when using commercial providers, who may change models, finetunes, privacy layers, and all kinds of other non-foundational-model things without notice.