John's Lambda Calculus and Combinatory Logic Playground (tromp.github.io)
97 points by mmphosis on March 12, 2023 | 10 comments



I happen to be preparing an upcoming talk about this on Pi day [1]. Happy to answer any questions!

[1] https://www.meetup.com/lispnyc/events/290789901/


Before I spend a big pile of CPU time trying to find a shorter BLC self-interpreter, why won't it work?


It might just work, if you find a smarter approach than I did.

Will buy you a dozen beers if you succeed!


Hey, just want to say that BLC is a remarkable artifact -- to me it's an art piece in computational minimalism.

I actually got so obsessed with it last year that I worked out a variant of lambda calculus[0] in which, with some trickery, a port of the BLC interpreter could be squeezed into 194 bits.

Which would be only 2 bits more than the intriguing conjecture from your paper[1] says is optimal:

> We conjecture that any self-interpreter for any binary representation of lambda calculus must be at least 24 bytes in size, which would make E close to optimal.

I wonder what the assumptions behind this conjecture are. Surely my trickery broke some of them.

[0] https://xtao.org/blog/last-intro.html

[1] https://tromp.github.io/cl/LC.pdf


LAST is an interesting variation that is in essence identical to the oddly named "Real Fast Nora's Hair Salon 3: Shear Disaster Download" language [1]. Instead of L A S T, it names the 4 tokens LAMBDA, APP, ONE_MORE_THAN, ZERO. I noticed that using two separate tokens for variable handling allows BLC to interpret LAST in only 193 bits.

Still, I suspect that for most programs the savings from S-optimization do not quite make up for the (n-1) extra bits needed for every occurrence of variable n. What, for instance, would be the length of the shortest LAST program for the prime number character sequence, which takes 167 bits in BLC?
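For readers following along: the whole BLC encoding fits in a few lines. This is a sketch under my own term representation (tagged tuples), not code from the site:

```python
# Sketch: BLC encodes lambda as 00, application as 01, and de Bruijn
# index i (counting from 1) in unary as i ones followed by a zero.
def blc(t):
    kind = t[0]
    if kind == 'var':                    # ('var', i), i >= 1
        return '1' * t[1] + '0'
    if kind == 'lam':                    # ('lam', body)
        return '00' + blc(t[1])
    if kind == 'app':                    # ('app', fun, arg)
        return '01' + blc(t[1]) + blc(t[2])
    raise ValueError(kind)

I = ('lam', ('var', 1))                  # the identity λx.x
print(blc(I))                            # 0010, 4 bits

# λm.m(λx.x)(λx.x), the Mogensen term discussed further down:
M = ('lam', ('app', ('app', ('var', 1), I), I))
print(blc(M), len(blc(M)))               # 0001011000100010, 16 bits
```

Note how every bit string is prefix-free: a leading 1 always signals a variable, so the 00/01 codes for abstraction and application never collide with the unary index codes.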

> I wonder what are the assumptions behind this conjecture.

I chose 24 bytes because it's a nice round number (3 · 2^3 · 2^3 = 192 bits) that sat a seemingly comfortable 14-bit margin below my best effort.

The conjecture assumes a binary input, that must be read bit-by-bit. How long is your LAST self-interpreter with a binary rather than quaternary input?

[1] https://esolangs.org/wiki/Real_Fast_Nora%27s_Hair_Salon_3:_S...



> The other encodings for variables would certainly increase parsing complexity, so self-interpreters for these BLC variants would be much longer than the original.

Indeed this is the reason why I find those alternatives not too interesting. In most practical programs the variable index frequencies are reasonably well approximated by an exponential distribution, for which the very simple unary encoding is optimal.
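To make the optimality claim concrete (my sketch, not from the thread; it uses the plain unary code 1^(i-1) 0, which spends exactly i bits on index i): under the geometric distribution P(i) = 2^-i, the unary length matches the ideal code length -log2 P(i) exactly, so the expected length meets the entropy bound.

```python
import math

# Geometric index distribution P(i) = 2**-i, i >= 1, as an idealized
# model of how often de Bruijn index i occurs in practical programs.
def p(i):
    return 2.0 ** -i

# Plain unary code 1^(i-1) 0 spends exactly i bits on index i,
# which equals the ideal prefix-code length -log2 P(i).
def unary_bits(i):
    return i

entropy  = sum(p(i) * -math.log2(p(i)) for i in range(1, 60))
expected = sum(p(i) * unary_bits(i)    for i in range(1, 60))
print(round(entropy, 6), round(expected, 6))  # both ≈ 2.0
```

(BLC's actual variable code 1^i 0 spends one extra bit per occurrence: the leading 1 is what keeps variables prefix-free against the 00/01 codes for abstraction and application.)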

> In the end, as we relax the restrictions on the encoding of the input, it seems that we can decrease the length of the self-interpreter almost arbitrarily -- down to λm.m(λx.x)(λx.x) [Mogensen] (as pointed out by @sargstuff in this thread) or even λm.m(λx.x) [Brown and Palsberg]

Those are of a very different nature; mine has to tokenize and parse from a binary stream; theirs already has the whole term in a higher-order abstract syntax tree.

> While obsessing about this and letting the mind wander I noticed the nice direct correspondence between the quaternary encoding of LAST and the DNA/RNA nucleobases

Yep; I didn't fail to notice that link either :-)

> Now this is a pretty cool coincidence that can provide some individuals with peculiar interests with even more peculiar entertainment, such as going through all possible permutations of AGCT, looking for the longest valid LC program embedded in random mosquito DNA.

The only way for it not to be valid is to run out of DNA or for the term not to be closed. If you find a long repetition of a single nucleotide and map that to L, then the latter is unlikely to happen soon...
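That closedness argument can be sketched as a tiny validity checker (my sketch, not the thread's code; the base-to-token mapping below is one arbitrary choice among the 24 permutations):

```python
# LAST tokens: L = lambda, A = application, and a variable with
# de Bruijn index n is written S^n T.  The only ways a parse can
# fail are running out of tokens or hitting an unbound variable.
def parse(s, i=0, depth=0):
    """Parse one term at position i under `depth` binders.
    Return the index just past the term, or None on failure."""
    if i >= len(s):
        return None                      # ran out of tokens
    if s[i] == 'L':                      # L <body>
        return parse(s, i + 1, depth + 1)
    if s[i] == 'A':                      # A <fun> <arg>
        j = parse(s, i + 1, depth)
        return None if j is None else parse(s, j, depth)
    n = 0                                # variable: S^n T
    while i < len(s) and s[i] == 'S':
        n, i = n + 1, i + 1
    if i >= len(s) or s[i] != 'T':
        return None
    return i + 1 if n < depth else None  # unbound => not closed

# A hypothetical base-to-token mapping turns DNA into a program:
mapping = str.maketrans('AGCT', 'LAST')
print(parse('GATAT'.translate(mapping)))  # 5: (λ.0)(λ.0) is closed
print(parse('T'.translate(mapping)))      # None: T alone is unbound
```

Since every L deepens the binder stack, a long run of the base mapped to L makes a later unbound variable correspondingly unlikely, matching the remark above.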

> Thank you for coherently writing down your thoughts for anybody to build upon what you have figured out. It's fun, it's inspiring, and it works!

Thanks for all the effort in creating and explaining LAST. I found it intellectually stimulating!


Conjecture on why 194 bits[0] and not 192 bits[1]:

a) CS off-by-one error (started the bit count at 1 instead of 0)

b) Included 'NIL' as a bit.

[0] https://xtao.org/blog/last-intro.html

[1] https://tromp.github.io/cl/LC.pdf


Snarky conjecture: the 7-bit ASCII in-RAM version of "λm.m(λx.x)(λx.x)" is mxx, which, if hardware supports bit addressing, uses only 21 bits. The smallest ASCII Lisp is 14 bits, though.


It has a beat, and you can dance to it.



