
Zero? Aren't you a little overconfident about that?

Transformer LLMs have already given us by far the most general AI to date, and they keep improving, with a number of recent breakthroughs and milestones.

No. The fundamental encoding unit of an LLM is semantic: a finite vocabulary of tokens. Mapping reality into semantic space is a form of lossy compression. There are entire categories of experience that can't be properly modeled in semantic space.
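
To make the lossy-compression point concrete, here's a toy sketch in Python (the codebook size, dimensions, and names are illustrative, not any real model's tokenizer): nearest-neighbor quantization of a continuous signal against a small fixed codebook, the way any finite vocabulary collapses many distinct inputs onto the same symbol.

    # Toy illustration: encoding a continuous signal through a small
    # discrete "vocabulary" is lossy. All sizes here are made up.
    import numpy as np

    rng = np.random.default_rng(0)

    # "Reality": a continuous 2-D signal with far more states than the codebook.
    signal = rng.normal(size=(10_000, 2))

    # "Vocabulary": a fixed codebook of 16 discrete symbols.
    codebook = rng.normal(size=(16, 2))

    # Encode: snap every point to its nearest code (the only representable states).
    dists = np.linalg.norm(signal[:, None, :] - codebook[None, :, :], axis=-1)
    codes = dists.argmin(axis=1)

    # Decode: reconstruct from the codes alone.
    reconstruction = codebook[codes]

    # Nonzero error: distinct inputs that map to the same code are
    # indistinguishable after encoding, i.e. information is lost.
    mse = np.mean((signal - reconstruction) ** 2)
    print(f"codes used: {len(set(codes))}/16, reconstruction MSE: {mse:.3f}")

Distinct points that land on the same code become indistinguishable after encoding; no decoder can recover the difference, no matter how clever.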

Even in "multimodal" models, text is still the fundamental unit of storage and the medium of translation between modalities. That's not how your brain works: you don't see a pigeon, label it "pigeon," and then consult your knowledge about "pigeons". You just experience the pigeon.

We have 100K years of Homo sapiens thriving without language. "General Intelligence" occurs at a level above semantics.



