Convo-Lang: LLM Programming Language and Runtime

(learn.convo-lang.ai)

66 points | by handfuloflight 16 hours ago

30 comments

  • Disposal8433 15 hours ago

    The new COBOL. The next step is obviously to add syntax when you need to specify the type of the variables: put the type first, then the name and its value, and finish with a semicolon because it's fun, like "int n = 0;"

    • taneq 10 hours ago

      COBOL? Hurrah! If there’s anything that would improve vibe coding, it’s a “come from” statement. :P

      • Y_Y 7 hours ago

          MARKETING DIVISION
        • warkdarrior 5 hours ago

          "Divide by zero error encountered."

  • pryelluw 3 hours ago

    Like terraform for prompts.

    Put that on the landing page.

  • zuzuen_1 12 hours ago

    Perhaps if LLMs introduce a lot more primitives for modifying behavior, such a programming language would become necessary.

    As it stands, anyone working with LLMs knows that most of the work happens before and after the LLM call: making REST calls, saving to a database, etc. Conventional programming languages work well for that purpose.

    Personally, I like JSON when the data is not too huge. It's easy to read (since it is hierarchical, like most declarative formats) and parse.

    • zuzuen_1 12 hours ago

      One pain point such a PL could address is encoding tribal knowledge about optimal prompting strategies for various LLMs, which changes with each new model release.
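      The registry idea in this comment could be sketched as plain data, independent of any particular language. A minimal, hypothetical example in Python (the model names and strategy fields are illustrative, not taken from Convo-Lang or any real model documentation):

```python
# Hypothetical registry of per-model prompting conventions.
# Model names and strategy fields are illustrative only.
STRATEGIES = {
    "model-a": {"system_first": True, "prefers_xml_tags": False},
    "model-b": {"system_first": True, "prefers_xml_tags": True},
    "model-c": {"system_first": False, "prefers_xml_tags": False},
}

DEFAULT = {"system_first": True, "prefers_xml_tags": False}

def build_prompt(model: str, system: str, user: str) -> str:
    """Assemble a prompt according to the model's recorded conventions."""
    strat = STRATEGIES.get(model, DEFAULT)
    if strat["prefers_xml_tags"]:
        user = f"<task>{user}</task>"
    parts = [system, user] if strat["system_first"] else [user, system]
    return "\n\n".join(parts)
```

      The point being: once the tribal knowledge lives in a data structure rather than in people's heads, updating it for a new model release is a one-line change.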

  • brainless 14 hours ago

    I have thought about this issue quite a few times. I use Claude Code, Gemini CLI, etc. for all my new projects. Each project has the typical CLAUDE.md/GEMINI.md files. I do not use MCPs; I ask agents to use the `gh` command, and all my work happens around Git/GitHub.

    But text is just that, while scripts are easier to rely on. I can prompt and document every mechanism to, say, check code formatting, but only once I add something concrete, say a pre-commit hook, does it become reliable.

    I am looking for a human readable (maybe renderable) way to codify patterns.
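    The prompt-vs-script distinction being drawn here: a documented convention ("run the formatter before committing") only works if the agent follows it, while a script enforces it. A minimal sketch of the scripted version, written in Python rather than as an actual shell hook; `json.tool` stands in for a real formatter, and the function is hypothetical:

```python
import subprocess
import sys

def check_format(files: list[str]) -> bool:
    """Return True only if every file passes the format check.
    'python -m json.tool' is a stand-in for a real formatter/linter."""
    for path in files:
        result = subprocess.run(
            [sys.executable, "-m", "json.tool", path],
            capture_output=True,
        )
        if result.returncode != 0:
            print(f"format check failed: {path}")
            return False
    return True
```

    Wired into a pre-commit hook, this runs the same way every time, regardless of how an agent interprets the project docs.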

  • aurumque 5 hours ago

    This is a really great experiment that gets a lot of things right!

  • khalic 14 hours ago

    Cool concept that brings a little structure to prompts. I wouldn't use the semantic part that much, English is fine for this, but there is a real need for machine instructions. There is no need for an LLM to guess whether "main" is a function or a file, for example.

  • brabel 11 hours ago

    I like it. Much nicer than having to use some Python SDK in my opinion. Is this a standalone language, or does it require Python or another language to run it?

  • swoorup 12 hours ago

    Money Incinerator Lang would be a fitting name as well.

  • machiaweliczny 16 hours ago

    Why not a library?

  • benswerd 16 hours ago

    How do you think about remote configurability?

    Stuff like a lot of this needing to be A/B tested, models hot-swapped, and versioned in a way that's accessible to non-technical people?

    How do you think about this in relation to tools like BAML?

  • trehans 14 hours ago

    I'm not sure what this is about, would anyone mind ELI5?

    • xwowsersx 11 hours ago

      Not sure I'm sold on this particular implementation, but here's my best steelman: working with LLMs through plain text prompts can be brittle...tiny wording changes can alter outputs, context handling is improvised, and tool integration often means writing one-off glue code. This is meant to be a DSL that adds structure: break workflows into discrete steps, define vars, manage state, explicitly control when and how the model acts, and so on.

      It basically gives you a formal syntax for orchestrating multi-turn LLM interactions, integrating tool calls + managing context in a predictable, maintainable way...essentially trying to bring some structure to "prompt engineering" and make it a bit more like a proper, composable programming discipline/model.

      Something like that.
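      The "discrete steps, explicit state, controlled model calls" idea, stripped of any particular syntax, is roughly this shape. A hypothetical Python sketch with a stubbed model call (no real LLM API is used, and none of these names come from Convo-Lang):

```python
from dataclasses import dataclass, field

@dataclass
class Conversation:
    """Explicit, inspectable state for a multi-turn LLM workflow."""
    messages: list = field(default_factory=list)
    vars: dict = field(default_factory=dict)

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

def fake_llm(messages: list) -> str:
    """Stand-in for a real model call."""
    return f"reply to: {messages[-1]['content']}"

def run_step(convo: Conversation, user_input: str) -> str:
    """One discrete, repeatable step: record input, call model, record output."""
    convo.add("user", user_input)
    reply = fake_llm(convo.messages)
    convo.add("assistant", reply)
    return reply
```

      The claimed value of a DSL over a sketch like this is that the step/state/tool vocabulary is built into the language rather than reinvented per project.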

  • croes 16 hours ago

    Next step: an LLM that writes Convo-Lang programs to program with an LLM

  • yewenjie 16 hours ago

    What is a motivating use case that this solves?

  • meindnoch 12 hours ago

      @on user
      > onAskAboutConvoLang() -> (
          if(??? (+ boolean /m last:3 task:Inspecting message)
              Did the user ask about Convo-Lang in their last message
          ???) then (
      
              @ragForMsg public/learn-convo
              ??? (+ respond /m task:Generating response about Convo-Lang)
                  Answer the users question using the following information about Convo-Lang
              ???
          )
      )
      
      > user
    
    Who in their right mind would come up with such a "syntax"? An LLM?

    • lnenad 10 hours ago

      I have to agree, it looks wild, even the simpler examples don't feel ergonomic.

    • ljm 9 hours ago

      … I think I’ll just stick with pydantic AI for now

  • mrs6969 15 hours ago

    Nice try. We will eventually get there, but I think this can and needs to get better.

  • bn-l 16 hours ago

    It’s a noisy / busy syntax. Just my own opinion.

  • devops000 15 hours ago

    Why not as a library in Ruby or Python?

  • dmundhra 14 hours ago

    How is it different than DSPy?

    • xwowsersx 11 hours ago

      I haven't used DSPy that much, but as I understand it: this lang is more like an orchestration DSL for writing and running LLM conversations and tools, whereas DSPy is a framework that compiles and optimizes LLM programs into better-performing prompts...DSPy automatically improves pipelines using its compilers/optimizers, and with DSPy you deal with modules and signatures.

  • gnubee 16 hours ago

    This looks a lot like another effective way of interacting with LLMs: english-lang. Some of english-lang's features are that it can be used to convey meaning, and it's largely accepted (network effect!). I'm excited to see what convo brings to the table /s

    • ttoinou 12 hours ago

      You're absolutely right!