Hacker News

I read On Lisp by Graham recently and first thought "this is the best programming book I've read in a while", then had the urge to make copy-editing kinds of changes ("he didn't define nconc"), then thought "if he were using Clojure he wouldn't be fighting with nconc", and by the end thought "most of the magic is in functions; mostly he gets efficiency out of macros, and the one case that really needs macros is the use of continuations" and "I'm disappointed he didn't write any macros that do a real tree transformation".

Then a few weeks later I came to the conclusion that Python is the new Lisp when it comes to metaprogramming (and async in Python does the same thing he coded up with continuations). I think homoiconicity and the parentheses are a red herring; the real problem is that we're still stuck with parser generators that aren't composable. You really ought to be able to add

   unless(X) { ... }
to Java by adding one production to the grammar, a new node class for the AST, and a transformation for the compiler that rewrites it to

   if(!X) { ... }
Probably the actual code would be smaller than the POM file if the compiler were built as if extensibility mattered.
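For what it's worth, Python's ast module lets you sketch exactly this kind of one-rule rewrite today. A toy example (the `unless_` name is made up here, and real macro hygiene is ignored) that rewrites `if unless_(X):` into `if not X:` before compiling:

```python
import ast

class UnlessRewriter(ast.NodeTransformer):
    """Rewrite `if unless_(X): ...` into `if not X: ...`."""
    def visit_If(self, node):
        self.generic_visit(node)
        t = node.test
        if (isinstance(t, ast.Call) and isinstance(t.func, ast.Name)
                and t.func.id == "unless_" and len(t.args) == 1):
            node.test = ast.UnaryOp(op=ast.Not(), operand=t.args[0])
        return node

src = """
if unless_(x > 0):
    y = "non-positive"
else:
    y = "positive"
"""
tree = ast.fix_missing_locations(UnlessRewriter().visit(ast.parse(src)))
ns = {"x": 3}
exec(compile(tree, "<macro>", "exec"), ns)
print(ns["y"])  # the guard ran as `if not (x > 0)`, so this is "positive"
```

The whole "macro" is one NodeTransformer method, which is roughly the "1 production + 1 transform" budget described above.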

Almost all the examples in this book (which claims to be a tutorial for Common Lisp programming)

https://www.amazon.com/Paradigms-Artificial-Intelligence-Pro...

are straightforward to code up in Python. The main retort to this I hear from Common Lisp enthusiasts is that some CL implementations are faster, which is true. Still, most languages today have a big helping of "Lisp, the good parts". Maybe some day the Rustifarians will realize the wide-ranging impacts of garbage collection, not least that you can smack together an unlimited number of frameworks and libraries into one program and never have to think about making the memory allocation and deallocation match up.



Peter Norvig himself has come around to embracing Python as an alternative to Lisp:

https://norvig.com/python-lisp.html

https://news.ycombinator.com/item?id=1803815

and there is indeed a Python implementation for the PAIP programs.

https://github.com/dhconnelly/paip-python

GC conferring additional composability is an interesting take - I hadn't thought of that (though I don't spend much time in this domain).


Just imagine what a godawful mess it would be to write really complex libraries in C that are supposed to be composable. (I'm going to argue this is why there is no 'npm' or 'maven' for C.)

The point of programming in C is that it is very low level and you have complete control over memory allocation, so you lose much of the benefit of C if you impose a one-size-fits-all answer.

The application might, in some cases, pass the library a buffer that it already allocated and tell the library to use it. In other cases the application might give the library malloc and free functions to use. It gets complicated if the application and library are sharing complicated data structures with a network of pointers.

In simple cases you can find an answer that makes sense, but in general the application doesn't know if a library is done with some memory and the library doesn't know if the application is done with it. But the garbage collector knows!

It is the same story in Rust: you can design some scheme that satisfies the borrow checker in some particular domain, but the only thing that works in general is to make everything reference counted. At least Rust gives you that option, although the "no circular references" problem is also one of those design-limiting features, since everything has to be a tree or a DAG, not a general-purpose graph.
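A quick Python illustration of that last point (CPython happens to use reference counting plus a tracing cycle collector): a circular structure that pure refcounting could never free gets reclaimed once both "sides" drop their references:

```python
import gc
import weakref

class Node:
    """A graph node; cycles are fine under a tracing GC."""
    def __init__(self):
        self.edges = []

# Toy stand-in for an application and a library sharing a graph:
a, b = Node(), Node()
a.edges.append(b)
b.edges.append(a)      # circular reference: refcounts alone would leak this
probe = weakref.ref(a)

del a, b               # both "sides" drop their references
gc.collect()           # the cycle collector reclaims the whole cycle
print(probe() is None)  # True: the shared structure is gone
```

Neither "owner" had to know whether the other was done with the graph; the collector figured it out.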


The reason there is no npm/maven for C is that UNIX culture prefers Makefiles and packages in whatever format the actual UNIX implementation uses.

Depots on AIX, pkgsrc on Solaris, tgz/rpm/deb on Linux, ports on the BSDs, ...

In any case, I would argue that we have npm/maven for C, and C++ nowadays, via CMake/Conan/vcpkg.


> I think homoiconicity and the parenthesis are a red herring, the real problem is that we're still stuck with parser generators that aren't composable.

You might be interested in "PEP 638 – Syntactic Macros" (https://peps.python.org/pep-0638/), though it hasn't gotten very much attention.

I have similar (or at least related) thoughts for my own language design (Fawlty), too. My basic idea is that beyond operators, expressions are parsed primarily by pattern-matching, and there's a reserved namespace for keywords/operators. For example, things like Python's f-strings would be implemented as a compile-time f operator, which does an AST transformation. Some user-defined operators might conditionally either create some complex AST representation or just delegate to a runtime function call.
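Incidentally, CPython's f-strings already live at the AST level, which is easy to see with the ast module; a small sketch of the kind of compile-time representation described above:

```python
import ast

# The parser turns an f-string into JoinedStr/FormattedValue AST nodes
# before any code runs -- i.e., it's a compile-time construct, not a
# runtime function call.
tree = ast.parse('f"x = {x:>4}"', mode="eval")
print(ast.dump(tree.body))  # shows JoinedStr(...FormattedValue(...)...)
```

So the "f as a compile-time operator doing an AST transformation" design is essentially how CPython already models it internally, just without a user-extensible namespace of such operators.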

> Rustifarians

Heh, haven't heard that one.


I don't even think you can define homoiconicity properly, in a way that captures what you want it to capture.

Even the Wikipedia introduction at https://en.wikipedia.org/wiki/Homoiconicity agrees that the concept is 'informal' at best and meaningless at worst.



