Coalton Playground: Type-Safe Lisp in the Browser

(abacusnoir.com)

108 points | by reikonomusha 16 hours ago

36 comments

  • rurban 10 minutes ago

    What a hilariously wrong headline. It should have said strongly typed Lisp.

    Which is still less type-safe than a dynamically typed Lisp: in every Lisp, every value carries its strong type with it. In a statically typed Lisp you rely on the compiler not to exploit weak casts; in a dynamically typed Lisp no such loopholes are possible.

  • paul_h 24 minutes ago

    Would love to see DOM interoperation

  • paddy_m 13 hours ago

    I have heard multiple people claim that macros are incompatible with strong or static typing and I don't see why.

    If there were a Lisp with optional static typing like TypeScript, it would seem to me entirely possible to write macros that write types. In many cases that would do away with the need for generic types (and allow multiple competing syntaxes for dynamic types). Most interestingly, it would let you write new generic forms instead of waiting for whatever the language designer gives you. It would also give you access to types at runtime (which the TypeScript language designers took away).
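
    Roughly what I have in mind, as a Common Lisp sketch (the defun-typed macro is made up; DECLAIM and FTYPE are standard CL):

        ;; Hypothetical sketch: a macro that emits a type declaration
        ;; alongside the definition it expands into.
        (defmacro defun-typed (name arg-types return-type args &body body)
          `(progn
             (declaim (ftype (function ,arg-types ,return-type) ,name))
             (defun ,name ,args ,@body)))

        ;; The compiler now sees the FTYPE before it compiles callers.
        (defun-typed add2 ((integer) integer) (x)
          (+ x 2))

    The macro just expands into an ordinary FTYPE proclamation plus the definition, so the "type syntax" is whatever the macro chooses to accept.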

    Maybe people were telling me that Lisp-style macros are incompatible with Hindley-Milner typing, but I still don't see how. The macros would just emit a Hindley-Milner subset.

    What am I missing?

    • reikonomusha 13 hours ago

      As far as your first question is concerned, macros in and of themselves are not incompatible with a typed language. Coalton uses Lisp macros, for example, and they work seamlessly and as expected.

      But a Coalton macro is typically written in Lisp, not Coalton. This isn't in and of itself a problem; it's easy and straightforward to write Common Lisp-style macros in Coalton. The difficulties arise when the macro must itself be (1) written in a statically typed manner, which gets into having a complete typing of your metalanguage (or your AST), and (2) allowed to access the type environment of the surrounding context in which the expansion is happening. If (2) is to be accomplished, then the type-checking machinery must collaborate with the macro-expansion machinery, and that in practice makes both tasks very difficult to specify semantics for and to implement.

      The language Hackett [1] worked toward solving the problem of having true typed and type-aware macros (they call them "type-aware" and "type-directed" macros [2]). Development unfortunately ceased ~7 years ago.

      [1] https://github.com/lexi-lambda/hackett

      [2] I think this video has a good discussion of Lisp, macros, and static types, from the perspective of implementing a Haskell in Racket. https://www.youtube.com/watch?v=5QQdI3P7MdY

      • spooky_deep 12 hours ago

        Since the macros run at compile time, I am OK with them not being statically checked. Statically checked macros seem like an academic curiosity.

        What am I missing?

        • reikonomusha 12 hours ago

          Then nothing. Macros work.

          Silly example:

              COALTON-USER> (defmacro rpn (x y op)
                              `(,op ,x ,y))
              RPN
          
              COALTON-USER> (coalton-toplevel
                              (define (double x)
                                (rpn 2 x *)))
              ;; DOUBLE :: ∀ A. NUM A ⇒ (A → A)
          
              COALTON-USER> (coalton (double 3.0))
              6.0
        • wk_end 12 hours ago

          One concern I'd have is that any type errors would be reported on the macro-expanded code and would thus be pretty much inscrutable outside of toy examples.

          • reikonomusha 12 hours ago

            I think you're right that debugging errors involving complicated macros can get difficult. To at least make the situation more tolerable, when Coalton expands a macro it remembers where the expansion came from, so an error will be reported in the right place with the right source code. For example, using the RPN macro from the sister comment, here's an intentional type error:

                COALTON-USER> (coalton (rpn "x" "y" +)) 
                  --> <macroexpansion>:1:9
                   |
                 1 |  (COALTON (RPN "x" "y" +))
                   |           ^^^^^^^^^^^^^^^ expression has type ∀. (NUM STRING) => STRING with unresolved constraint (NUM STRING)
                   |           ^^^^^^^^^^^^^^^ Add a type assertion with THE to resolve ambiguity
                   [Condition of type COALTON-IMPL/TYPECHECKER/BASE:TC-ERROR]
        • rscho 10 hours ago

          Statically checked macros are useful to introduce new syntax at no runtime cost.

        • taeric 11 hours ago

          The big miss here is that "compile time" is typically understood to mean "batch compilation" time for most languages. For Common Lisp, macros run at read time, which is often doable during runtime.

          • BoingBoomTschak 8 hours ago

            No, macros run at compile time (cf. https://www.lispworks.com/documentation/HyperSpec/Body/03_bb...); you may be confusing macros and reader macros.

            • taeric an hour ago

              Fair that I was definitely mixing them in my thinking. My general point was supposed to be simply that "compile time" is probably not what people are thinking of when coming from other languages. I was clearly a bit too eager to try and say that, though. :(

    • kscarlet 12 hours ago

      It's easy to just stick them together, but to me (who writes too much Lisp for my own health) this is unsatisfactory.

      The dream: just as a macro can be seen as a (staged) extension mechanism for the Lisp evaluator, there should be an extension mechanism for the static type system, which would allow me to define new types and define new syntax (like Haskell do-notation) that makes use of the typing environment and the expected type of the current context (return-type polymorphism), etc.

      The reality: very few environments figure this out. In Coalton, Lisp macros do work, but only at the level of untyped S-expressions. A Lisp macro can't know the types of the variables in the lexical environment, or the expected type of its own context. But it quite possibly works fine for the "typescript-like" use case you described.

      The problem I see: the H-M type system isn't designed with extensibility in mind, and it's hopeless to make it extensible. A more technical explanation of why it's hard to integrate with Lisp macros is that H-M relies on a unification-based inference stage whose execution flow is very different from macro expansion.

      Possible solution: there's no fundamental reason why static types can't have something as powerful as Lisp macros. However, first of all you would need an extensible type system, which still seems to be an open research problem. I think bidirectional type systems are promising, but they're so different from H-M at a fundamental level that I think it's hopeless to retrofit one into Coalton.

      • JonChesterfield 12 hours ago

        What's the objection to putting type annotations on the macro, and refusing to compile code where the arguments to the macro don't match the type?
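
        A crude approximation of what I mean in plain CL (rpn-checked is made up, and it only works when the arguments are literal values):

            ;; Hypothetical sketch: check the (literal) arguments at
            ;; macroexpansion time and refuse to expand otherwise.
            (defmacro rpn-checked (x y op)
              (unless (and (numberp x) (numberp y))
                (error "RPN-CHECKED expects numeric literals, got ~S and ~S" x y))
              `(,op ,x ,y))

        In general the arguments are arbitrary forms rather than literals, so their types aren't known until the surrounding code is typechecked.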

        • kscarlet 12 hours ago

          There's no objection, but I think this solves a different problem than the one described above, and it's possibly a more complicated matter in itself than it may look. First, are you describing a way to typecheck macros so that they don't generate ill-typed code? A type system that does this is active research (I think it requires modal typing?). Also, this doesn't increase the expressive power of untyped Lisp macros; it just makes them safer, while I was proposing that their expressive power is unsatisfactory and should be enhanced via a sort of reflection. That would only make them even harder to type!

        • reikonomusha 12 hours ago

          Since a macro is a definition of syntax, I think you'd essentially need something like typing judgments to show how the typed elements of the syntax relate to one another, so that the type checker (e.g., a typical Hindley-Milner unifier) can use those rules. These are usually written as the fraction-looking things that show up in PLT papers. This is, as GP says, essentially extending the type system, which is a task fraught with peril (people write entire papers about type system extensions and their soundness, confluence, etc.).
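
          For a flavor of what those look like, here is the standard rule for function application, rendered as plain text; a macro that introduces new syntax would need rules of the same shape for the checker to consume:

              Γ ⊢ e1 : τ1 → τ2     Γ ⊢ e2 : τ1
              ---------------------------------
                     Γ ⊢ (e1 e2) : τ2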

          • JonChesterfield 11 hours ago

            Depends on your point of view really. I'd define a macro as a function with slightly weird evaluation rules.

            If you want to write a macro where any call to it that typechecks creates code which itself typechecks, you have to deal with eval of S-expressions, which is roughly a recursive typecheck on partial information. That sounds tractable to me.

    • deterministic an hour ago

      You are missing nothing. Haskell has two different macro systems: typed and untyped.

  • anentropic 14 hours ago

    Layout is broken on non-maximised window sizes (text overflows off the right edge of the page)

    EDIT: I'm referring to the layout of the blog post in the thread title; the https://coalton.app/ site is ok

    • anentropic 14 hours ago

      FYI coalton.app "Type Classes" example has "Error: unmatched close parenthesis" when you run it

      "JSON Parser" example also has errors

      • agambrahma 10 hours ago

        Thanks, I'll fix them soon

        • ks2048 4 hours ago

          Also, on smaller screens, the mostly empty header on the coding page takes up half the screen.

      • agambrahma 5 hours ago

        Fixed now

  • CraigJPerry 14 hours ago

    Isn't Coalton more about performance, removing the dynamism? Lisp (at least SBCL) is already type-safe, or at least it behaves that way in my limited experience, e.g. I get feedback when I screw up.

    I'm completely clueless about Coalton (and almost completely an idiot when it comes to CL more generally; I've been playing with it for a couple of years at this point, but even so, every day is still a school day...)

    • skulk 14 hours ago

      FWIW, SBCL is pretty good at optimizing away dynamic type checks if you help it out.

      Here are some examples, all compiled under:

          (declaim (optimize (speed 2)))
      
      The first example is a generic multiplication; x and y could be _any_ type at all.

          (defun fn (x y) (* x y))
      
      If we disassemble this function, we get the following:

          ; disassembly for FN
          ; Size: 34 bytes. Origin: #x1001868692                        ; FN
          ; 92:       488975F8         MOV [RBP-8], RSI
          ; 96:       4C8945F0         MOV [RBP-16], R8
          ; 9A:       498BD0           MOV RDX, R8
          ; 9D:       488BFE           MOV RDI, RSI
          ; A0:       FF142540061050   CALL [#x50100640]                ; SB-VM::GENERIC-*
          ; A7:       4C8B45F0         MOV R8, [RBP-16]
          ; AB:       488B75F8         MOV RSI, [RBP-8]
          ; AF:       C9               LEAVE
          ; B0:       F8               CLC
          ; B1:       C3               RET
          ; B2:       CC0F             INT3 15                          ; Invalid argument count trap
      
      Note that it calls `GENERIC-*`, which probably checks a lot of things and has decent overhead.

      Now, if we tell it that x and y are bytes, it's going to give us much simpler code.

          (declaim (ftype (function ((unsigned-byte 8) (unsigned-byte 8)) (unsigned-byte 16)) fn-t))
          (defun fn-t (x y) (* x y))
      
      The resulting code uses the imul instruction.

          ; disassembly for FN-T
          ; Size: 15 bytes. Origin: #x1001868726                        ; FN-T
          ; 26:       498BD0           MOV RDX, R8
          ; 29:       48D1FA           SAR RDX, 1
          ; 2C:       480FAFD7         IMUL RDX, RDI
          ; 30:       C9               LEAVE
          ; 31:       F8               CLC
          ; 32:       C3               RET
          ; 33:       CC0F             INT3 15                          ; Invalid argument count trap
      • matheusmoreira 13 hours ago

        Can SBCL actually check at compile time that the arguments to fn-t are bytes? I wonder how that works with Lisp's extreme dynamism. Also wondering about the calling convention it uses.

        • reikonomusha 12 hours ago

          It can detect simple-ish instances, like calling

              (fn-t "hello" "world")
          
          But a good rule of thumb is that these compile-time type errors are more of a courtesy than a guarantee. As soon as you abstract over fn-t with another function, like so:

              (defun g (x y)
                (fn-t x y))
          
          and proceed to use g in your code, none of the static checking happens anymore, because as far as g is concerned, it can take arguments of any type.

              CL-USER> (defun will-it-type-error? ()
                         (g "x" "y"))
              ;; compilation WILL-IT-TYPE-ERROR? successful
          
          No compile-time warning is issued. Contrast with Coalton:

              COALTON-USER> (coalton-toplevel
                              (declare fn-t (U8 -> U8 -> U8))
                              (define (fn-t x y)
                                (* x y))
                              
                              (define (g x y)
                                (fn-t x y))
                              
                              (define (will-it-type-error?)
                                (g "hello" "world")))
          
              error: Type mismatch
                --> <macroexpansion>:8:7
                 |
               8 |      (G "hello" "world")))
                 |         ^^^^^^^ Expected type 'U8' but got 'STRING'
                 [Condition of type COALTON-IMPL/TYPECHECKER/BASE:TC-ERROR]
    • tmtvl 14 hours ago

      In CL you can't declare, for example, a proper-list-of type, which is to say a type which accepts a second type and represents a proper list containing only members of that second type.

          (deftype Proper-List-Of (subtype)
            `(or Null
                 (Cons ,subtype
                       (Proper-List-Of ,subtype))))
      
      This doesn't work, for example. There are ways to work around it to some extent with SATISFIES and ad-hoc predicate generation, but Coalton is a true value add in that respect.
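
      For contrast, a minimal Coalton sketch of a homogeneous list (List, Cons, and Nil are from the standard Coalton prelude):

          (coalton-toplevel
            ;; The element type is part of the list's type and is
            ;; checked at compile time.
            (declare ints (List Integer))
            (define ints (Cons 1 (Cons 2 (Cons 3 Nil)))))

      Trying to Cons a string onto ints is rejected with a compile-time type error.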
    • reikonomusha 13 hours ago

      I think it's three things:

      1. Bringing abstractions that are only possible with static types, like ad hoc polymorphism via type classes. For example, type classes allow polymorphism on the return type rather than the argument types. Something like

          (declare stringify (Into :a String => :a -> :a -> String))
          (define (stringify a b)
            (str:concat (into a) (into b)))
      
          ; COALTON-USER> (coalton (stringify 1 2))
          ; "12"
      
      The function `into` is not possible in a typical dynamically typed language, at least if we aim for the language to be efficient. It only takes one argument, but what it does depends on what it's expected to return. Here, it's expected to return a string, so it knows to convert the argument to a string (provided the compiler knows how to do that conversion). Common Lisp's closest equivalent would be

          (concatenate 'string (coerce a 'string) (coerce b 'string))
      
      which, incidentally, won't actually do what we want (COERCE can't convert a number to a string, so it signals an error).

      2. Making high performance more accessible. It's possible to get very high performance out of Common Lisp, but doing so usually leads to difficult or inextensible abstractions. A lot of very high-performance Common Lisp code ends up effectively looking like monomorphic imperative code; it's the most practical way to coax the compiler into producing efficient assembly.
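
      A sketch of the kind of code I mean (hand-written declarations, one concrete element type):

          (defun dot (a b)
            (declare (type (simple-array double-float (*)) a b)
                     (optimize (speed 3) (safety 0)))
            (let ((acc 0d0))
              (declare (type double-float acc))
              (dotimes (i (length a) acc)
                (incf acc (* (aref a i) (aref b i))))))

      It compiles to tight assembly, but it only works on double-float vectors; supporting another element type means writing it again.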

      Coalton, though, has an optimizing compiler that does (some amount of) heuristic inlining, representation selection, stack allocation, constant folding, call-site optimization, code motion, etc. Common Lisp often can't do certain optimizations because the language must respect the standard, which allows things to be redefined at run-time, for example. Coalton's delineation of "development" and "release" modes gives the programmer the option to say "I'm done!" and let the compiler rip through the code and optimize it.

      3. Type safety, of course, in the spirit of ML/Haskell/etc.

    • tgbugs 13 hours ago

      I highly recommend watching [0] for an introduction to Coalton in the context of CL. Specifically it provides an excellent example of how the type system makes the language more expressive (by making it more composable) while also improving performance (e.g. because it can prove that certain optimizations are safe and thus can automatically generate the type annotations).

      0. https://www.youtube.com/watch?v=of92m4XNgrM

    • wild_egg 14 hours ago

      CL is strongly typed but not statically typed. The compiler generally doesn't complain ahead of time that your function is going to do math on a string because it was called incorrectly. Typically a runtime condition will be signalled when execution hits that point, and you can sort it out from there.

      Coalton moves that to the compilation step so you get an error back the instant you send the form to the REPL.

    • asplake 14 hours ago

      I’ve looked at it rather than used it, but what it brings is ML-style polymorphism. Type safety is a given in that case, which may or may not be the case with CL (I’ll let others argue that one).

    • bitwize 12 hours ago

      Strong static typing is an absolute must for large-scale software engineering. CL code can be made very performant without it, but there are enormous development-speed and correctness gains to be had by type-checking programs before they are executed rather than at runtime.

      It's 2025, people. Dynamic languages for serious projects were a 90s fad, hopefully never to be repeated.

  • manoDev 12 hours ago

    OP: you might want to check; a bunch of examples after "maybe monad" are broken.

    • agambrahma 5 hours ago

      Fixed now, let me know if you find something broken

    • agambrahma 10 hours ago

      Thanks, I'll take a look and fix them soon