Hacker News

DNA "generates" the body, which generates behaviour, which affects gene survival, closing the loop.

<rant> It's a syntactic process with the ability to update syntax based on outcomes in the environment. I think this proves that syntax is sufficient for semantics, given the environment.

I wonder why Searle affirmed the opposite. Didn't he know about compilers, functional programming, lambda calculus, homoiconicity? Syntax can operate on syntax; it can modify or update it. Rules can create rules because they have a dual status: behaviour and data. They can be both "verbs" and "objects". Gödel's incompleteness theorems use arithmetization to encode mathematical statements as data, making math available to itself as an object of study.
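The "rules as both verbs and objects" point can be shown in a few lines. A minimal sketch (all names here are illustrative, not from any particular library): a rule is stored as data, executed as behaviour, and rewritten by another rule in response to feedback.

```python
# Rules stored as data in a table; each rule is behaviour (a callable "verb")
# and simultaneously an entry that other rules can inspect and replace.
rules = {
    "double": lambda x: 2 * x,  # a rule acting as behaviour
}

def adapt(rule_table, name, feedback):
    """A rule that treats another rule as data ("object") and rewrites it."""
    old = rule_table[name]
    # 'feedback' stands in for an outcome from the environment
    rule_table[name] = lambda x: old(x) + feedback

print(rules["double"](3))   # behaviour: prints 6
adapt(rules, "double", 1)   # the rule is data: it gets rewritten
print(rules["double"](3))   # updated behaviour: prints 7
```

The same duality underlies homoiconic languages, where programs are literally lists the program itself can build and evaluate.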

So syntax is not fixed; it has unappreciated depth and adaptive capability. In neural nets, both the forward and backward passes are purely syntactic, yet they change the behaviour/rules/syntax of the model. Can we say AlphaZero and AlphaProof don't really understand, even when they outperform most of us in non-parroting situations? </>
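To make the neural-net point concrete, here is a one-weight "network" trained by gradient descent (a toy sketch, no real framework): the forward and backward passes are nothing but arithmetic, i.e. pure syntax, yet they end up rewriting the rule itself.

```python
# Fit the target behaviour y = 2 * x with a single weight.
w = 0.0                            # the model: simultaneously rule and data
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
lr = 0.05

for _ in range(200):
    for x, y in data:
        pred = w * x               # forward pass: apply the rule
        grad = 2 * (pred - y) * x  # backward pass: the rule treated as data
        w -= lr * grad             # update: syntax modifying syntax

print(round(w, 3))                 # converges near 2.0
```

Nothing in the loop "knows" what 2 means; the weight drifts toward it purely because the mechanical updates are coupled to outcomes.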




But in the first instance of syntax, what does that bare syntax mean? 1+1=2 as a physical instantiation (say, written on paper) only has any relevant causal powers because of humans. DNA is a physical, causally interacting thing with or without anything else built on top of it or from it. Absent a consciousness, a mathematical sentence sits on a piece of paper just like any random scribbling of pencil lead. DNA at its barest is much, much more causally interactive. And syntax is always like this. DNA, however, morphs itself.

Remember, they are just symbols, whereas DNA is chemically highly interactive. We could all change conventions and obsolete "+" back to nothingness. We can't do that for a chemical in DNA.


I think of syntax as applying a rule. A program is just syntactic operations. The idea is that it has two aspects: "program in execution" and "program as data". Rules expressing behaviour, and rules expressed as data, in which case they can be processed by other rules.

One concrete example is a bootstrapped compiler: it is both data and execution, able to build itself by feeding its own output back in as input. Another example is in math: Gödel's arithmetization, which encodes mathematical statements as numbers so that math syntax can be processed with math operations. And of course neural nets: you can describe them as purely syntactic (mechanical) operations, yet they update their rules and learn. In the backward pass, the model becomes input to the gradient update, so it is both rule and data. DNA too.

These systems express rules, or syntax, that is adaptive, and I think they make the leap to semantics by grounding in the outside environment. The idea that syntax is shallow and fixed is wrong; in fact, syntax can be deep and self-generative. Syntax is just a compressed model of the environment, and that is how it comes to reflect semantics.

This was an argument against the Stochastic Parrots and Chinese Room ("syntax is not sufficient for semantics") maxims. I aimed to show that purely mechanical or syntactic operations carry more depth than originally thought.


I thought syntax was finite... wasn't that the Turing thing discussed here the other day? Also, if your idea is true, you'd be limited to the recursively enumerable.




