Hacker News

This explanation is trivially incorrect if you actually read the quote from Wittgenstein:

>My propositions serve as elucidations in the following way: anyone who understands me eventually recognizes them as nonsensical, when he has used them—as steps—to climb beyond them. (He must, so to speak, throw away the ladder after he has climbed up it.)

It is his propositions (the propositions he lays out in the Tractatus - the book is structured as a list of propositions) that form the "ladder", not language itself.

I'm really not looking forward to people having dozens of hours of hallucination-filled conversations with ChatGPT and then thinking that they've learned something. (I've already seen it tell someone that "C++ is typically compiled to intermediate code and executed on a virtual machine, making it less performant than C".)




This! I read ChatGPT's explanation and it immediately mentioned language and I'm like, "huh! no! that's late Wittgenstein!" Early Wittgenstein of the Tractatus is very different; the ladder is not about language at all. As usual ChatGPT is showing it's good at globbing together a bunch of related concepts and sounding very convincing about it, while also being quite wrong.

(Just read the Wikipedia page, c’mon.)


They've learned something after those hours. Just like I learned something from this hallucination. I never really considered that using your fingers to count is a kind of language, and that you often toss it away once you're done. That really resonates with me. I'm going to be thinking about it.

But to your point: it may not be at all accurate when you ask it for a specific definition of something.

Is AI going to screw up semantics for us, muddying the meaning of things the way it did above?



