I'm far from an expert on such things, but Co-dfns appears to be genuinely cutting-edge research that could have utility for many language ecosystems (or do you disagree that this direction of GPU-based compilation holds promise?). If that work has to look like a "mess" to non-APLers in order to get done, then so be it.
I criticized the approach some in [0], after writing the BQN compiler in a similar style. Pioneering the array-based compiler paradigm is a major accomplishment, and if you did any programming with Aaron, you wouldn't question his ability. The Pareas compiler demonstrates that an APL-like language isn't required to do it, though. And while the research may be applicable to other problems, I have serious doubts that it will be useful for speeding up optimizing compilers, which spend their time on very different things than simple compilers do. Also possibly worth mentioning that Co-dfns still doesn't self-host, which I believe means it can't run on the GPU. I'm still unclear on what GPU program was timed in his thesis, a hand-translated version?
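For anyone wondering what "array-based compiler" means mechanically, here's a toy sketch in Python/NumPy (my illustration of the flavor, not code from Co-dfns or the BQN compiler): the AST is a flat parent-pointer vector, and a pass is a sequence of whole-array gathers instead of a recursive tree walk.

    import numpy as np

    # parent[i] is the index of node i's parent; the root points to itself.
    parent = np.array([0, 0, 0, 1, 1, 2, 2])
    n = len(parent)

    # Compute every node's depth at once by pointer doubling: no recursion,
    # no per-node control flow, just gathers, which is what maps to a GPU.
    depth = (parent != np.arange(n)).astype(int)  # one edge unless we're the root
    hop = parent.copy()
    while np.any(hop != hop[hop]):    # until every chain has reached the root
        depth += depth[hop]           # add the edges our ancestor has counted
        hop = hop[hop]                # jump twice as far up the tree

    print(depth)                      # [0 1 1 2 2 2 2]

The open question is whether the passes an optimizing compiler actually spends its time on decompose this way.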
Thank you for the link! That's a very interesting overview. Do you think the commoditization of FPGAs could add another angle of interest/motivation here? Or is that hardware model likely to be incompatible with how these compilation techniques work?
I'm personally rather curious about the intersection of compilers and database query engines, where ~cheap dynamic/incremental compilation across extremely diverse runtime workloads is often a basic requirement.
As in, compiling for FPGAs or running on them? Can't really say anything about compiling-for. I know building Verilog takes forever but I have no idea what it's doing.
For running-on, APL adapts well to most hardware. I mean, we're talking about a paradigm that's existed longer without the concept of GPGPU than with it. I don't know if FPGAs would have an advantage over GPUs, since the bottleneck is often memory bandwidth and from a quick search it seems FPGAs are worse there. But the problem is still implementing compiler passes in APL.
Ah I was thinking about running-on, but compiling-for is probably relevant also :)
> I don't know if FPGAs would have an advantage over GPUs, since the bottleneck is often memory bandwidth and from a quick search it seems FPGAs are worse there
Thanks, that makes sense, though I believe (and hope) the situation can reverse.
> Brags about how easily he understands his giant ball of overly messy code (2-letter variables in the name of "semantic density")
@arcfide left one of my favorite comments ever, in response to another comment similar to yours (for which the author apologized). I refer to this often.
> “Don't complain that Chinese is ugly and unreadable just because you speak English as your native tongue.”
The analogy here slightly breaks down since there are billions of people who speak and read Chinese already, while apparently the coding style in question is more of a niche thing trying to justify itself.
I don't know enough about APL or using 2-letter variables to comment on whether they're actually justified, but my personal hunch is that their usefulness is going to be limited to particular problem domains and people with particular inclinations. (The same goes for verbosity on the other extreme like particular Java projects...)
From memory (not very reliable, but I'm not going to rewatch hours of video to find sources), Aaron said that when he asked around about doing a nanopass compiler as a PhD thesis, nobody would take him seriously because it was believed to be impossible to do in a performant way. And he has done it, with this style of code. A world first. And it came out orders of magnitude faster, with less memory use and less code, than a comparable compiler written in Scheme (which Aaron is pretty good with - he used to be involved in the Scheme R5RS and R6RS standards committees).
He's also built the world's first APL compiler which runs on a GPU and outputs GPU code.
> - Brags about using Notepad instead of the fancy IDEs that us weak-minded plebians use
One of his points is that he's convinced programming as an industry went wrong by sidelining array languages, and that we shouldn't need to write so much code to get things done. It's not so much bragging about using Notepad as making the point that a) Notepad is enough for a complex piece of code, b) IDEs can't help much with tacit array code because there isn't tons of boilerplate to autocomplete, and c) concentrating on the code is easier without a lot of distractions.
> - Brags about how he can understand his own code. We can all understand our own code. Good code is understandable by others.
Another one of his points is that people who have not learned traditional programming find array-based programming much easier to pick up. And that's aside from the usual discussions here, where you're basically saying "Brags about understanding Chinese. We can all understand our own native language. Good writing is written in English." Other people have contributed pull requests to Co-dfns (apparently).
> - Brags about how easily he understands his giant ball of overly messy code
Given that your previous criticism was that it's "not understandable", emphasising that it not only is understandable but is quite easy to understand seems like a reasonable rebuttal for him to make. Apparently he's damned if he does, and damned if he doesn't.
> - (2-letter variables in the name of "semantic density")
That is the array style; his compiler is tacit (no variables) as much as he can make it, which means the few variables that are left are much easier to keep track of in your head. Regardless of that, there's a big gap between languages where people write 'k[i]=m[j]uv' and languages where people write 'pyramidPointVector[pyramidPointVectorOffset]=adjustedShadowMaskBools[rowAlignmentCounter]...' or whatever.
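To make that contrast concrete, here's the same computation in both registers (a Python sketch with made-up names, since not everyone reads APL):

    import numpy as np
    x = np.array([3.0, 1.0, 4.0, 1.0, 5.0])

    # Dense register: the whole computation fits in one visible expression,
    # and there's exactly one name to keep track of.
    z = (x - x.mean()) / x.std()

    # Verbose register: the same computation, now with three long names the
    # reader has to scan past to confirm nothing else is going on.
    centeredMeasurementValues = x - np.mean(x)
    measurementStandardDeviation = np.std(x)
    normalizedMeasurementValues = centeredMeasurementValues / measurementStandardDeviation

Neither extreme is free; the point is just that name length and clarity aren't the same axis.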
> - Uses an obscure (in the modern era) outdated language like some Cinema aficionado who refuses to watch movies in color like all those plebians
But also like someone who uses a highly specific expert tool instead of the standard thing an ordinary person might find at Walmart.
> Another one of his points is that people who have not learned traditional programming find array-based programming much easier to pick up.
I have had the same experience when teaching SQL.
I feel like it is difficult for people who have worked with imperative languages to grasp, while people who come from the Excel school (e.g. business people or researchers) actually enjoy it and may even pick it up faster.
I believe SQL and array languages have this and probably much more in common. I am probably out of my league here, but I think I'd call SQL a set language rather than an array language. But it feels like they are related.
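For what it's worth, the kinship is easy to show side by side. An illustrative Python/NumPy sketch (table and names made up): the same "which orders are large?" question, asked set-at-a-time, array-at-a-time, and row-at-a-time.

    import numpy as np

    # SQL, a whole set at a time:
    #   SELECT id FROM orders WHERE amount > 100;

    ids    = np.array([1, 2, 3, 4])
    amount = np.array([250, 40, 120, 80])

    # Array style, a whole array at a time: a predicate over the column.
    large = ids[amount > 100]          # -> [1 3]

    # Imperative style, a row at a time: the loop the Excel/SQL crowd
    # never has to write.
    large_loop = []
    for i in range(len(ids)):
        if amount[i] > 100:
            large_loop.append(ids[i])

In both SQL and the array languages you state what you want over the whole collection and let the engine decide how to iterate, which is probably why the same people take to both.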