I made a BF compiler that goes really fast. I'm in a compilers class right now, and for whatever reason my professor has gotten severely nerd sniped into building the fastest BF compiler in the world.
Mine is pretty darn fast; it compiles to native ARM assembly and can generate a Mandelbrot set in less than 0.5 seconds on my M1 Pro machine. Full writeup (with links to source) here: https://lambdaland.org/posts/2024-10-22_bf_writeup/
A while ago I wrote a Zig brainfuck implementation[1] that converts Brainfuck code into an executable function at comptime. The performance is similar to transpiling the code to C and compiling it with full optimization settings.
[1] https://github.com/ginkgo/Zig-comptime-brainfuck
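For anyone curious what "converting BF code into an executable function" looks like, here's a rough Python sketch of the same general idea (transpile once, then run with no bracket scanning). This is not the linked Zig code, and it builds the function at run time rather than at comptime; everything here is illustrative.

```python
def compile_bf(src):
    """Translate Brainfuck source into Python source, then exec it into a
    real function. Loop brackets become while-loops up front, so execution
    never scans for matching ']' (roughly what transpiling buys you)."""
    lines = ["def run(inp=''):",
             " tape = [0] * 30000",
             " p = 0",
             " out = []",
             " it = iter(inp)"]
    indent = 1
    for ch in src:
        pad = " " * indent
        if ch == '>':
            lines.append(pad + "p += 1")
        elif ch == '<':
            lines.append(pad + "p -= 1")
        elif ch == '+':
            lines.append(pad + "tape[p] = (tape[p] + 1) % 256")
        elif ch == '-':
            lines.append(pad + "tape[p] = (tape[p] - 1) % 256")
        elif ch == '.':
            lines.append(pad + "out.append(chr(tape[p]))")
        elif ch == ',':
            lines.append(pad + "tape[p] = ord(next(it, chr(0)))")
        elif ch == '[':
            lines.append(pad + "while tape[p]:")
            indent += 1
        elif ch == ']':
            if lines[-1].endswith(":"):   # empty loop body
                lines.append(" " * indent + "pass")
            indent -= 1
    lines.append(" return ''.join(out)")
    ns = {}
    exec("\n".join(lines), ns)
    return ns["run"]


# Classic example: prints "A" (8 * 8 = 64, plus one, is ASCII 65)
run = compile_bf("++++++++[>++++++++<-]>+.")
print(run())  # -> A
```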
And written in racket!
Looks like a good learning material. Congratulations and thanks for sharing
Love the shoutout to Adam B. Dude has helped me so many times. He has a criminally underappreciated YouTube channel where he solves pretty complicated problems in real time at the REPL and talks his way through, and by the end of it you actually feel like you understand wtf just happened. At least until you go try it yourself.
Give Adam some love (or at least likes!):
https://youtube.com/@abrudz
I love APL. For a somewhat more modern, smaller take, try this: https://mlochbaum.github.io/BQN/
I was really impressed with BQN as a modern APL. Give it a look.
Love the idea of ndarrays of first class functions, although I'm still trying to figure out what I'd do with all that power...
Here's some function list usage: for V0 here, branching on each potential character is inefficient; so an option is to do an array index-of among the characters, and pick out the respective function from a function list: https://dzaima.github.io/paste/#0dZLPSgJRFMb38xQXNyajNRPUwqy... (and, for a duration of 1 glyph, there's even a matrix containing functions there!)
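The index-of-then-pick trick translates outside APL too. Here's an illustrative Python sketch: rather than branching on each character, look the character up and index into a parallel list of functions (the ops and names below are made up for the example, not taken from the linked paste).

```python
# Dispatch table for four BF-ish pointer/cell ops: find the character's
# index, then pick the matching function from a parallel list.
CHARS = "><+-"

def right(p, tape):
    return p + 1, tape

def left(p, tape):
    return p - 1, tape

def inc(p, tape):
    tape[p] = (tape[p] + 1) % 256
    return p, tape

def dec(p, tape):
    tape[p] = (tape[p] - 1) % 256
    return p, tape

FUNCS = [right, left, inc, dec]

def step(ch, p, tape):
    # index-of among the characters, then apply the picked function
    return FUNCS[CHARS.index(ch)](p, tape)


p, tape = 0, [0] * 10
p, tape = step('+', p, tape)
p, tape = step('>', p, tape)
print(p, tape[0])  # pointer moved right; cell 0 was incremented
```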
This is kind of funny. Implementing bf in something that is more-or-less just as brain-scrambling itself.
Actually, learning APL with a sort of "beginners mind" is not that big of a deal. It is sort of adjacent to learning math symbols themselves.
It's just that when you start learning other sorts of programming languages, you start to diverge.
Strangely nowadays, I think APL is not only adjacent to math, but also emoji.
> APL is not only adjacent to math, but also emoji
Syntactically, mathematicians do tend to communicate via fair lengths of natural language interspersed with [greek with a smattering of hebrew, hiragana, etc.]; it hadn't occurred to me before now that the general population now also communicates with shorter lengths of natural language interspersed with emoji.
Do the generations which grew up with emoji have any less math phobia than mine did?
> fair lengths of natural language interspersed with [greek with a smattering of hebrew, hiragana, etc.]
Also some made-up stuff. For example, in financial maths, the derivative of an option's price with respect to volatility is called "vega". Why? Because all the other option derivatives are named using Greek letters (delta, gamma, rho, etc.) and "vega" sounds sort of Greek, even though it's just made up.[1]
[1] I haven't heard a convincing etymology for vega in financial maths but it's generally written using the letter "nu" which just looks like a "v" https://en.wikipedia.org/wiki/Greeks_(finance)#Vega
Pedantry: the astronomical Vega comes from Arabic, واقع .
https://en.wikipedia.org/wiki/Vega#Nomenclature
Implementing Brainfuck in APL should require doing it in cuneiform.
Someone should release an APL that uses cuneiform. Might be easier to learn.
As if APL alone wasn't enough.
Came here to note that this seems to accomplish little that's not already there.
Well ... consider my brain fucked
How does Brainfuck compare to Rust?
It's way more portable: You can run Brainfuck on an old HP calculator, where Rust would be too bloated: https://forum.swissmicros.com/viewtopic.php?f=23&t=1903&p=83...
The original BF implementation used a fixed-size memory of 30,000 cells, and out-of-bounds pointers weren't checked, making it highly unsafe. Most modern implementations support dynamically sized memory, or at least bounds checks, so they are as safe as Rust.
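One hedged reading of "dynamically sized memory" in Python (an illustrative sketch, not any particular implementation): grow the tape on demand to the right, and refuse to move left of cell 0 instead of trusting the program to stay inside 30,000 fixed cells.

```python
class Tape:
    """BF tape that grows to the right on demand and bounds-checks the left."""
    def __init__(self):
        self.cells = [0]
        self.p = 0

    def right(self):
        self.p += 1
        if self.p == len(self.cells):
            self.cells.append(0)   # grow instead of overrunning

    def left(self):
        if self.p == 0:
            raise IndexError("pointer moved left of cell 0")
        self.p -= 1


t = Tape()
t.right()
t.right()
print(t.p, len(t.cells))  # tape grew along with the pointer
```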
I use a variant of BF for linear genetic programming experiments that wraps the memory pointer around when it hits the edges. This increases the chances of finding viable candidates by reducing the impact of bad mutations.
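The wrap-around variant is even simpler to sketch: take the pointer modulo the tape length so every `<` and `>` stays in bounds. (The size 30000 below is just the classic default, not a detail from the parent comment.)

```python
SIZE = 30000  # classic BF tape size; any fixed length works

def move(p, delta):
    """Move the tape pointer, wrapping around at both edges."""
    return (p + delta) % SIZE


print(move(0, -1))      # stepping left off cell 0 lands on the last cell
print(move(SIZE - 1, 1))  # stepping right off the end lands on cell 0
```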
The conversation involving "most modern BF implementations" is always a great one.
The syntax of BF is far simpler, so it should be easier to get started with than Rust.
[flagged]
Yeah but the language has the F word in it, teehee. It's so funny. :D
Le bacon narwhals at midnight my friend. Here's an updoot for your service.
EDIT: OH ME GOOSH DIDNT EXPECT SO MUCH LOVE THANK YOUU
EDIT 2: WOAH SILVER HOLY RHIS IS TAKING OFF THANK YOU
EDIT 3: GOLD!?!?!? YALLS TOO NICE
EDIT 4: THANK YOU KIND STRANGERS
EDIT 5: OMG RIP MY INBOX