Hmm, I looked through all the parts of that post referring to Nock, and I still don't see anything that would convince an impartial reader of what you're saying.

Are you taking issue with the style/language the post is written in? I mean, it does sound to me like someone a little too fascinated by their own ideas... but it seems too uncharitable to dismiss it because of that.

What I got from my quick skim is that the author was well aware of some similarity to combinatory logics, but intentionally took a different direction with Nock:

> Unlike the lambda calculus, Nock represents code as data, not data as code. This is because it is system software, not metamathematics. Its loyalties in this matter are not even slightly divided.

Or

> As you'll quickly see if you try this exercise, raw Nock is not a usable programming language. But nor is it an esoteric language or automaton, like SK combinators. Rather, Nock is a tool for defining higher-level languages - comparable to the lambda calculus, but meant as foundational system software rather than foundational metamathematics.

This sounds reasonable to me. I don't know if Nock successfully fulfills this claim, but without assuming the author is just straight up lying about this stuff, it seems like, "Any combinatory logic would do the trick" is probably not accurate.




Have you ever implemented a programming language before? You are relying a lot on "quick skims" and the POV of an "impartial reader", i.e. someone who does not know the domain. It's much easier to see through the bullshit if you're familiar with PL.

> Unlike the lambda calculus, Nock represents code as data, not data as code. This is because it is system software, not metamathematics. Its loyalties in this matter are not even slightly divided.

This does not mean anything.

> But nor is it an esoteric language or automaton, like SK combinators. Rather, Nock is a tool for defining higher-level languages - comparable to the lambda calculus, but meant as foundational system software rather than foundational metamathematics.

This is hilarious considering Nock is, in fact, esoteric. SK combinators are also very comparable to lambda calculus, and it's not clear what "foundational system software" even means in the context of a combinatory logic just like SK combinators. And why would Nock be any good at "defining higher level languages"? Keep in mind that their idea of "higher level languages" includes Hoon, another esoteric language.

You skimmed it quickly and didn't see anything that jumped out to you as wrong. That's a good explanation for how a cup and ball game like Urbit got as many users as it did: you need a bit of specialized knowledge to see that it's full of hot air.


I think you've got me wrong on this—the idea that I'm "relying" on "quick skims" and the POV of an "impartial reader" would imply that I'm somehow using these things against you. I literally am an impartial reader trying to understand this, who spent a little time reading a resource you pointed me at.

If I missed what you wanted me to discover in the resource, you could just highlight it here. But instead you've chosen to question my competence and make another series of ungrounded derogatory assertions about the tech.

Now including some mistakes, or at least overly uncharitable interpretations that are useless to an impartial reader:

> This is hilarious considering Nock is, in fact, esoteric.

It's esoteric, sure, but it seems like his usage here refers to Esoteric Languages (https://en.wikipedia.org/wiki/Esoteric_programming_language)—which have a separate meaning (e.g. languages made "as a proof of concept, as software art"). That seems to be the whole point when he compares it to e.g. the lambda calculus: he is maximizing practicality over mathematical elegance. So it makes perfect sense to point out that while it's esoteric, it's not an Esoteric Language.

> This does not mean anything.

Again, if you take an uncharitable extreme, I agree: it's basically a meaningless thing to say. But if I wanted to be charitable I could come up with several ways of interpreting it.

> SK combinators are also very comparable to lambda calculus

Yes, I got the sense he was using them pretty much interchangeably here, using SK combinators as an example of an esoteric language; in the next sentence he prefers "lambda calculus" because it's a formal system in mathematical logic rather than a computational system, and the whole purpose of the sentence is to make a distinction between software foundations and mathematical foundations.

Unless you show an interest in exchanging ideas over rhetoric in your reply—don't expect any further responses from me.


I'm not questioning your competence, most people have not implemented a PL before. It's not a question of competence, it's a question of specialized knowledge.

> It's esoteric sure, but it seems like his usage here refers to Esoteric Languages

Yes, that's the sense that I meant it in. Nock is an esoteric language just like those others on that website.

> he is maximizing practicality over mathematical elegance.

Nock is not practical. He is certainly not maximizing practicality by making a combinatory logic that can only increment integers and then recognizing and "accelerating" a particular implementation of decrement! Let's look at Nock:

    ?[a b]           => 0
    ?a               => 1

    ^[a b]           => ^[a b]
    ^a               => (a + 1)

    =[a a]           => 0
    =[a b]           => 1
    =a               => =a

    /[1 a]           => a
    /[2 a b]         => a
    /[3 a b]         => b
    /[(a + a) b]     => /[2 /[a b]]
    /[(a + a + 1) b] => /[3 /[a b]]
    /a               => /a

    *[a 0 b]         => /[b a]
    *[a 1 b]         => b
    *[a 2 b c d]     => *[a 3 [0 1] 3 [1 c d] 
                            [1 0] 3 [1 2 3] [1 0] 5 5 b]
    *[a 3 b]         => **[a b]
    *[a 4 b]         => ?*[a b]
    *[a 5 b]         => ^*[a b]
    *[a 6 b]         => =*[a b]
    *[a [b c] d]     => [*[a b c] *[a d]]
    *a               => *a

This is what I mean by being too charitable. There is absolutely nothing whatsoever that is practical about Nock.
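
To make concrete how little is there, here is a sketch of an evaluator for the spec above in Python. (My own sketch, nothing official: I model atoms as ints and cells as 2-tuples, so [a b c] is written (a, (b, c)), and I elide the rule-2 macro and the non-terminating reductions.)

    def slot(n, noun):                      # the / operator: tree addressing
        if n == 1: return noun
        if n == 2: return noun[0]
        if n == 3: return noun[1]
        return slot(2 + (n & 1), slot(n >> 1, noun))

    def nock(subject, formula):             # the * operator
        head, tail = formula
        if isinstance(head, tuple):         # *[a [b c] d] => [*[a b c] *[a d]]
            return (nock(subject, head), nock(subject, tail))
        if head == 0: return slot(tail, subject)
        if head == 1: return tail           # constant
        if head == 3: return nock(subject, nock(subject, tail))
        if head == 4:                       # ? : 0 for a cell, 1 for an atom
            return 0 if isinstance(nock(subject, tail), tuple) else 1
        if head == 5:                       # ^ : increment an atom
            x = nock(subject, tail)
            assert not isinstance(x, tuple), "^[a b] never reduces"
            return x + 1
        if head == 6:                       # = : structural equality of a cell
            x = nock(subject, tail)
            assert isinstance(x, tuple), "=a never reduces"
            return 0 if x[0] == x[1] else 1
        raise NotImplementedError("2 is a macro; expansion omitted here")

For example, nock((50, 51), (0, 3)) gives 51 (tree addressing) and nock(42, (5, (1, 10))) gives 11 (increment of the constant 10). The whole machine fits on one screen, which is exactly what you'd expect of a toy combinator calculus.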

> because its a formal system in mathematical logic rather than a computational system

I really don't think this means anything. The lambda calculus is one of the paradigmatic computational systems, and SK combinators are closely related. And it turns out formal systems in mathematical logic are also related through the Curry-Howard-Lambek correspondence. So there's no meaningful distinction between these things.

> I think you've got me wrong on this—the idea that I'm "relying" on "quick skims" and the POV of an "impartial reader," would imply that I'm somehow using these things against you. I literally am an impartial reader trying to understand this, who spent a little time reading a resource you pointed me at.

No, what I'm saying is that, as an impartial reader, you are giving them too much of the benefit of the doubt, and also not spending a lot of time with it. There's nothing wrong with that, but it means you can get taken for a ride by people who are making a smokescreen.


Okay, I'll try to be charitable with you here and assume you've just missed this.

The problem with claims like:

> Nock is not practical. He is certainly not maximizing practicality by making a combinatory logic that can only increment integers and then recognizing and "accelerating" a particular implementation of decrement!

is that you are evaluating a component of a larger system in isolation. The form of your argument is that this component, which is a language, has obvious deficiencies/impracticalities that can be established in terms of the language itself and how it relates to other languages.

But this doesn't take into account how the language (which is the component I spoke of) relates to the system in which it's embedded.

The appearance of impracticality is often present in foundational systems. How easy would it be to use your same form of argument and deride the Peano axioms as obviously impractical because: look how much work it is to do arithmetic with a 'successor function'—ridiculous! (i.e. the 'impracticality' depends completely on what you're trying to do with it)
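
To make that comparison concrete, here's a sketch (in Python, with numbers modeled as nested tuples—my own toy encoding, not any standard formalism) of arithmetic built from nothing but a zero and a successor:

    ZERO = ()                       # zero is an empty tuple
    def succ(n): return (n,)        # successor wraps once more

    def add(a, b):                  # a + b, by structural recursion on a
        return b if a == ZERO else succ(add(a[0], b))

    def to_int(n):                  # decode for display only
        return 0 if n == ZERO else 1 + to_int(n[0])

    three = succ(succ(succ(ZERO)))  # even writing 3 takes three applications
    print(to_int(add(three, three)))   # => 6, the long way around

Judged in isolation this is absurdly impractical, yet as a foundation it's perfectly respectable; the same caveat could apply to Nock.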

This was what I was trying to investigate from the beginning: how have you verified that Nock's peculiarities don't have legitimate reasons in the context of the system of which it is a component?

If you review our conversation, you'll see that's what I was trying to figure out since my very first statement—and to this point, you have still not addressed it.


The initial poster is likely weary. I’ll try my hand exactly once. I know it can be frustrating for both parties, because sadly, to know that an expert is right, you need a bit of expertise.

Here we go.

Symmetries are useful to understand a domain. If you have one, you know that a point on the left will appear on the right. So a question like "but how do you know that there really is a point on the right?" is answered simply by knowing the proof of the symmetry.

The answer to "how have you verified that Nock's peculiarities don't have legitimate reasons in the context of the system of which it is a component?" is found in the same way. They mention the Curry-Howard correspondence: I don't want to spend too much time on it, but yes, it is a symmetry, so the point on the left called Nock has a corresponding point on the right called the lambda calculus with the same properties; and the claims made by Nock's author about there being no symmetry are disproved by old proofs.

In fact, there are a lot of 50-year-old systems that fulfill Nock’s goals. Since it is an old field, the majority of them had time to develop into practical systems that are much, much clearer and less esoteric.

Which brings me to the goal of Nock. Why do something esoteric when a bit of field knowledge shows the complexity is unwarranted?

One possibility is NIH.

A more likely explanation is getting users into the sunk cost fallacy. Why more likely? Because this technique has been used repeatedly by the author in other fields, like his political essays. I am not the first to have that thought; a quick Google search brings this: https://www.lesswrong.com/posts/6qPextf9KyWLFJ53j/why-is-men...

It does work. So many pixels spilt over an awkward language named after a famous anti-Semite admired by the author, Albert Jay Nock.


Hi espadrine. I appreciate your well-intentioned comment, but I believe the issue here is not about grasping the symmetry argument; rather, it's that the symmetry argument is not an answer to the actual question I've been posing since the beginning.

Please correct me if I'm wrong—but the symmetry argument is basically: Nock could be classified as a concatenative combinatory logic, which is a variation of "classical" combinatory logic; classical combinatory logic, or a particular incarnation like the SKI combinator calculus, can in turn be viewed as a variation on the untyped lambda calculus.

That establishes an association between Nock, the alternative posted by xkapastel (which is a concatenative combinatory logic), and the lambda calculus.

So from here we can say, "why not just use the non-obscure bit of combinatory logic instead?" (we choose this over the lambda calculus since Nock's approach appears concatenative too, avoiding lambdas)

And my answer is that if all you need is a concatenative universal model of computation—then you're done!

But—where is it actually established that that is the only requirement for Nock in the context of Urbit? That is the question I've been asking the whole time, and which is not answered by the symmetry argument.

Nock's essential structure may be that of a concatenative combinatory logic, but that doesn't mean there aren't other aspects of its design which are important for how it relates to other particulars of the Urbit system (for instance: maybe this variation has nice performance properties in connection with other parts of Urbit—I don't know).


You: Can you tell me why this is interesting?

Them: It's not interesting.

You: Idk this vague sentence sounds kind of interesting to me. Can you tell me why this is interesting?

Them: Trust me it's not interesting.


It seems like you jumped into the conversation part way.

Here is my actual opening:

> Just out of curiosity, have you verified in some way that your alternative is legitimate?

The entire conversation has been me trying to get the other dude to verify (unsuccessfully) a claim he made in the opening comment.

A more accurate recap:

------

Them: Nock is pointless, you can just use X

Me: How do you know X would work for all Urbit's requirements?

Them: Of course it would! Nock is pointless. Read this article.

Me: I couldn't find anything in the article to support your claim.

Them: You didn't read the article correctly.


Wow...we have very different views of this conversation :)


From an outside perspective, that's really not how that conversation came across


It's not that the author is lying per se about the claims, it's that the claims are somewhere between meaningless and pointless. What is the difference between "code as data" and "data as code" and why does it matter? What is the use of a not-directly-usable programming language which can define higher-level languages? In which meaningful ways does, say, LLVM IR not treat "code as data" and not count as a not-directly-usable programming language on which other languages can be built?

I am not saying, nor is 'xkapastel, that there is no difference between Nock and LLVM IR or any combinatory logic or whatever. We're asking what the point of the differences is. Anyone can make up a language (or any other object or idea) and define claims about that language that no other language satisfies. The question is whether those claims actually represent something meaningful and worthwhile ("interesting properties," as 'xkapastel put it) or just unique. That's a matter for the judgment of the reader, and we cannot just defer to the judgment of the claim-maker that the claims are worthwhile.


LLVM IR treats code as data, like Nock, because LLVM IR, like Nock, is systems software, not metamathematics. Combinatory logic, by contrast, is metamathematics, and in combinatory logic, if you want numbers or lists, much less byte strings, you have to build them out of code, using Church numerals and the like. So, in this way, Nock is similar to LLVM IR and different from combinatory logic. There are a variety of practical benefits to representing code as data in this way, which I do not feel the desire to explain in detail at the moment, but suffice it to say that all practical programming languages and virtual machines share this property with Nock.
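
To illustrate the difference (a sketch of my own, in Python rather than combinator notation): in the pure-code world, even the number two is a function—a Church numeral—and you only get an ordinary integer back by running it against real operations:

    zero = lambda f: lambda x: x                      # apply f zero times
    succ = lambda n: lambda f: lambda x: f(n(f)(x))   # one more application
    two = succ(succ(zero))
    print(two(lambda k: k + 1)(0))                    # => 2

Nock, like LLVM IR and every practical VM, instead ships numbers as plain data.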

So, you might ask, do there exist purposes for which Nock is preferable to LLVM IR? Well, I don't know, but Moldbug's intent was for the answer to be "yes"; there are problems he wanted to solve which LLVM IR definitely does not solve. Foremost among these is getting people to impugn one another's competence on the orange website. No, wait, foremost among these is determinacy and permanence: it is the intent that the result of any Nock computation be deterministic and unchanging through all time. The meaning of LLVM IR, by contrast, changes every time someone fixes a bug in an LLVM backend, or introduces a new one.

You will note that this way that Nock differs from LLVM IR is also a way that SK-combinators differ from LLVM IR. SK-combinators also provide a deterministic computational system that is simple enough that we can imagine not changing it. That is why it was necessary to explain how Nock differs from SK-combinators.

So Nock shares some crucial attributes with combinatory logic and others with LLVM IR, but neither system has the combination of characteristics that Nock does.

Now, why would anyone want a deterministic, permanent virtual machine definition? I don't know why Moldbug wanted it. There are some reasons that I want it; I think reproducible computational experiments are an important way to communicate human knowledge, a way that could perhaps drastically reduce the loss of knowledge from one generation to the next, accelerating the ratchet of philosophical progress which made the modern world possible.

However, I don't want to have anything to do with Urbit! I don't want to use a virtual machine that people credibly say is named after a famous anti-Semite. I also think some of the technical choices in Nock are questionable and may turn out to be fatally flawed.

I despair somewhat in writing this comment because I feel that everything I'm saying is painfully obvious, but obviously there is something about the people I am talking to that makes it not obvious to them. But since I do not know what that is, it seems likely that my own words will be just as incomprehensible to them as Moldbug's voluminous, and to me perfectly clear, explanations. And the level of viciousness in the air makes me think that probably those people will respond to their own incomprehension by launching personal attacks on me, as they have been on each other. It's pathetic.


I think you won't be surprised that a) I don't disagree with any of what you wrote, b) I don't think it's super on topic, and c) I think claim b isn't obvious.

For the sake of argument, let's say "a deterministic, permanent virtual machine definition" is a desirable quality. LLVM IR at the latest version is a poor tool for this, yes. But LLVM IR at some fixed version of LLVM works fine. It's weird because now you can't use up-to-date versions of LLVM with it, but it's doable: pNaCl did exactly this, and dealt with forking the LLVM codebase in order to freeze the IR definition. If you don't like that, Java bytecode at some fixed version of the JVM probably works, .NET CIL probably works, WASM works (although it didn't exist when Nock was designed), etc. In fact, Urbit's web page compares itself to the JVM and WASM. Perhaps you could use Infocom's Z-machine, or BPF, or something. Alternatively, maybe you just want i386 assembly or ARM2 assembly or MIPS II or whatever. With a validator to make sure you're not using undefined/newer operations, and with some restrictions on threading and input (including input randomness), they're deterministic and permanent, and you've got a wide array of software tools for generating it.

But I think "why would anyone want a deterministic, permanent virtual machine definition" needs to be answered, if we're going to have any idea of what the topic of this comment chain is. We're talking about whether Nock and Hoon are intentionally obscure, which means talking about whether there exist less-obscure things that solve the same problem.

Moldbug's claim - which, even if he doesn't say it in as many words, seems obvious from the history of this conversation and how participants have read his work - is that a) there exists a set of constraints that describe a thing worth wanting for a real-world application, and one of those constraints is that it be a deterministic, permanent virtual machine, and b) his invention, Nock, is a novel solution that satisfies those constraints. If you don't have that, then Nock is, as 'xkapastel puts it, "somewhere in between an art project and a pyramid scheme."

So, for your use case, I might ask you whether the JVM or i386 assembly or whatever answers it, but for the use case of Urbit, we need to figure out what exactly those constraints are. And then we can evaluate whether Nock and Hoon are novel because they solve problems that haven't been solved, or whether indeed there exists a less-obscurantist solution that satisfies the constraints, and thus Nock and Hoon are simply exercises in "look how smart I am" / "you'd never have gotten this from mainstream software engineering, aren't you glad you listened to me" / whatever.

In other words, I think we are off-topic by discussing the merits of Nock in the abstract, and then constructing scenarios in which we find Nock valuable. (And, to the extent that Nock and Urbit as a whole are potentially exercises in crypto-fascist-advocacy, this would be playing into Moldbug's plan - Moldbug's philosophical work takes much the same approach, of making an interesting-sounding claim and then trying to convince you it has real-world relevance because it's just so interesting-sounding and just so different from mainstream opinions that it must have value, and it too can be defeated by keeping "Okay, what problem are you trying to solve" front and center, and therefore Moldbug wants you to forget about that. If we say "Nock is cool, but I wish it weren't named after an anti-Semite," we make people wonder about the anti-Semite; if we say "Nock is, actually, an unnecessarily-complicated rehash of things that already existed, named after an anti-Semite and dressed up to sound cool," then people understand what's going on.) We should first figure out what the problem Urbit needs Nock for, and then we can have a fair discussion about whether Nock is a meaningful invention.

Incidentally, https://ngnghm.github.io/blog/2016/06/11/chapter-10-houyhnhn... is an argument that Urbit, as a project, doesn't need a deterministic, permanent virtual machine definition at all - that there isn't a need for future-proof reproducible behavior in the system, that the constraint on using the Nock VM makes Urbit perform poorly, that the escape hatches to help Urbit's performance undermine the whole project (apparently the Urbit runtime statically recognizes a particular Markdown implementation in Nock and replaces it with a call to a different Markdown implementation in C), that the "simplicity" of Nock makes it harder to understand the high-level purpose of programs and defeats Urbit's claim that programs are understandable, that the Nock VM prevents an Urbit user from doing effective runtime resource control which would be a useful feature in practice, etc.


"LLVM IR at some fixed version" seems on the face of it like a reasonable attack on the problem, yes, and you can get close to solving the reproducibility problem that way. But you can't actually solve it, because LLVM IR is two to four orders of magnitude more complicated than Nock, so it's full of definitional ambiguities, which mean that in practice it is defined by the behavior of its implementation, which is full of bugs, some of which must be fixed because they are security holes. Even without fixing the bugs, its behavior will vary depending on what compiler it's built with and what CPU and OS it runs on. Operations as commonplace as sequences of two floating-point additions can give different results, sometimes radically different, for example in the presence or absence of guard digits, to say nothing of non-IEEE-754-compliant hardware like Intel Gen. In this particular case GCC has a flag, -fstore-float, to get deterministic behavior on some hardware, so you can read about the issue.

The same flaw (for this purpose) is present in JVM bytecode, i386 machine code, ARM2 machine code, CIL, and wasm. (e)BPF and the Z-machine might be plausible bases for a solution on those grounds, but you'd still need a lot of work to be sure you'd rooted out the sources of irreproducibility.

Now, as I said, I'm not convinced that Nock in practice solves the problem it sets out to, and Faré's criticisms that you have helpfully linked to are a major reason for that. But nothing that already existed set out to solve that problem, much less succeeded. It's true that some other computational basis would also have worked, given enough effort; but I think you are seriously underestimating the amount of effort required.

I do think my previous comment already explained why I think rigorous computational reproducibility is valuable.


I mean, I'm happy to say that neither Nock nor anything else solves the problem, and I think that gets to the same conclusion: Nock is not a useful/worthwhile invention, it is an obscurantist rehashing of old ideas with more emphasis on looking elegant than being elegant.

That's what I'm pushing back on. There's an idea that because Nock exists and looks cool, it must mean something. And just like the idea that because the rest of the blog where Nock was introduced exists and looks cool, it must be a meaningful philosophical contribution, it's not true.

For instance - Nock, as both its advocates and detractors readily point out, is too low-level to be a language to actually write in. So that means you're writing in some other language which has an implementation in Nock (or perhaps in Nock-and-hopefully-equivalent-C, how's that for security risk surface). And that other language is rich enough to have security issues just like LLVM, and it's either frozen and you accept the potential insecurity, or it's patchable and you accept the potential nonreproducibility. You haven't solved the problem, you've just shoved it up one level so you can get a cool blog post about your elegant lower level.

I agree, I'd love to have a language that solves the problem you describe. I don't think Nock is tangibly closer to solving it than other things are, nor do I think it is tangibly closer to solving the Urbit system's needs than other things are, nor do I think it has some other unspecified merit. And I certainly feel no obligation to figure out some use case which would make Nock look meritorious.

(And, again, Google shipped pNaCl as a way to run untrusted native code. They had to do something to deal with the very concerns you raise. And unlike Urbit, the stakes for getting it wrong were very real.)


Well, it's good that we've gotten to the point of us both agreeing that what Nock is designed to do is novel and useful. It's certainly up in the air whether Nock will succeed in doing it, but I think it has a better chance of doing it than systems that don't even try.

I want to point out an error in your reasoning here, though, which results in you seriously underestimating the importance of Nock's goal:

> you're writing in some other language which… [i]s either frozen and you accept the potential insecurity, or it's patchable and you accept the potential nonreproducibility

Suppose I gather some data, do some statistical predictions based on it, plot the results, write them up, and publish the whole bundle as a set of programs for some reproducible computing system, including a build script that rebuilds the article from the source text, the observation data, the statistical code, and so on, with a secure hash of the whole thing. I don't even need to include a PNG of the plot in the bundle — anyone in the future, whether in 02021, 02030, 03020, or 12020, can recompute that bitwise-identical plot from the source data. Moreover, if I made an error in my analysis — either due to a software bug or for any other reason — they can reproduce that error.
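
(Concretely, the kind of pinning I have in mind—a hypothetical sketch, not a description of any existing tool—is hashing the bundle's file names and bytes in a fixed order, so any future recomputation can be checked against one digest:

    import hashlib, pathlib

    def bundle_hash(root):
        h = hashlib.sha256()
        for p in sorted(pathlib.Path(root).rglob("*")):   # fixed order
            if p.is_file():
                h.update(str(p.relative_to(root)).encode())  # the name...
                h.update(p.read_bytes())                     # ...and the bytes
        return h.hexdigest()

None of this is interesting unless the computation underneath is bitwise reproducible, which is the hard part.)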

Suppose the statistical code is compiled from a Julia-like language to bytecode for the reproducible VM. To achieve reproducibility, I could include the compiled bytecode in the bundle, but a better solution is to include the version of the compiler that I was using. That way, someone who wants to criticize my conclusions in the future can recompile my statistical source code with a new version of the compiler to see if my results were due to a compiler bug. Lacking that, they can reproduce my incorrect results, and they can recompile my source with a new compiler and get a different executable that produces different results, but it will be harder to tell why they were different.

In neither case, though, does fixing compiler bugs destroy the reproducibility of the computation, as you say it does. The executable compiled by the buggy compiler will continue to produce unchanged results on future versions of the VM, as long as they are not buggy. Shoving the problem up one level makes it into a completely different kind of problem with completely different implications.

I don't think there's any other system out there that attempts to deliver this level of reproducibility today, although some video game emulators come close. I'm probably going to have to write one myself, because I'm not convinced Nock is going to achieve it. Substantial parts of Dercuano and BubbleOS are steps toward this goal.


>> Unlike the lambda calculus, Nock represents code as data, not data as code. This is because it is system software, not metamathematics. Its loyalties in this matter are not even slightly divided.

Isn't that pseudo-profound (i.e. sounds grand but says nothing)? It seems like you could say the same thing about assembly or Java.


> it does sound to me like someone a little too fascinated by their own ideas

This is perhaps the most simultaneously concise and accurate description of Curtis Yarvin, or at least of that of himself which he chooses to display in his writing, that I can imagine ever encountering.



