
I buy this. Cf. the “lossy JPEG of the web” analogy; sure there’s structure: we put it there!



> lossy jpeg of web

Knowledge is compression https://en.wikipedia.org/wiki/Hutter_Prize
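
To make the link concrete, here’s a minimal sketch (my own toy example, standard library only; the text and variable names are illustrative, not from the prize): a compressor is exactly as good as its predictions, and the Shannon code length -log2 p(x) is what the Hutter Prize effectively scores.

    # Toy demo: better prediction = better compression.
    import gzip
    import math
    from collections import Counter

    text = ("the quick brown fox jumps over the lazy dog " * 200).encode()

    # General-purpose compressor: exploits the repetition it finds.
    gzip_bits = len(gzip.compress(text)) * 8

    # Zeroth-order model: per-byte frequencies, blind to repetition.
    # Optimal code length under this model is -log2 p(byte) per byte.
    counts = Counter(text)
    total = len(text)
    model_bits = sum(-math.log2(counts[b] / total) for b in text)

    print(f"gzip:       {gzip_bits} bits")
    print(f"freq model: {model_bits:.0f} bits")

gzip wins by a mile here because it “knows” the text repeats; the gap between the two numbers is exactly the structure one model captured and the other didn’t.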


“If you can compress it this much, it’s gotta be AGI!” “No, we just made gzip better. Again.”


My comment wasn’t about AGI, but out of curiosity, what definition are you using for it?

To be fair, I don’t see a categorical difference between human intelligence and compression either.


Then why wouldn't any LLM trained on real data have structure?

I mean, we don't live in the chaos realm: atoms make molecules, molecules make structures, structures make cells, organs, lifeforms.


Setting aside the degrees to which the (non)deterministic and (non)atomistic models of the universe have predictive power:

yes, sure: you should expect a well-trained NN to have an internal structure that approximates the statistical distribution of the training data’s features as encoded into the model’s latent space.

That’s very nearly the definition of “trained”.
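
A toy illustration of that point (my own sketch; the two-state chain and every name in it are made up for the example): “train” a bigram model on text sampled from a known Markov chain, and the learned transition table converges to the true one. The structure in the model is the structure that was in the data.

    # Fit a bigram model to samples from a known two-state chain.
    import random
    from collections import Counter, defaultdict

    random.seed(0)
    true_p = {"a": {"a": 0.9, "b": 0.1},   # ground-truth transitions
              "b": {"a": 0.4, "b": 0.6}}

    # Sample a long sequence from the true distribution.
    seq, state = [], "a"
    for _ in range(100_000):
        seq.append(state)
        state = "a" if random.random() < true_p[state]["a"] else "b"

    # "Training": count bigrams and normalize.
    counts = defaultdict(Counter)
    for prev, nxt in zip(seq, seq[1:]):
        counts[prev][nxt] += 1

    for s in "ab":
        total = sum(counts[s].values())
        learned = {t: round(counts[s][t] / total, 3) for t in "ab"}
        print(s, "learned:", learned, "true:", true_p[s])

The learned table mirrors the true one; scale the same idea up and you get the “latent space approximates the data distribution” claim above.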




