
I like to think of the non-verbal portions as the biological equivalents of ASICs. Even skills like riding a bicycle might start out as conscious effort (a vision model, a verbal intention to ride, and a reinforcement-learning teacher) but are then replaced by a trained model that does the job without needing careful intentional planning. Some of the skills in the bag of tricks are fine-tuned by evolution.

Ultimately, there's no reason a general algorithm couldn't do the job of a specific one, just less efficiently.




I mean, the QKV part of transformers is like an "ASIC"... well, an ASIC for an (approximate) lookup table.
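
To make the "approximate lookup table" reading concrete, here's a minimal numpy sketch (the keys, values, and dimensions are made up for illustration): with near-orthogonal keys, softmax over query-key similarity acts like a soft dictionary lookup that returns roughly the value stored under the best-matching key.

    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max(axis=-1, keepdims=True))
        return e / e.sum(axis=-1, keepdims=True)

    def attention(q, k, v):
        # query-key similarity -> softmax weights -> weighted mix of values,
        # i.e. a "soft", approximate dictionary lookup
        scores = q @ k.T / np.sqrt(k.shape[-1])
        return softmax(scores) @ v

    d = 4
    keys = np.eye(d)                           # one key per "slot"
    values = np.arange(d * 2.0).reshape(d, 2)  # the value stored under each key
    query = np.array([[20.0, 0.0, 0.0, 0.0]])  # a query pointing at slot 0
    print(attention(query, keys, values))      # approximately values[0] = [0., 1.]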

(Also important to note that NNs/LLMs operate on abstract vectors, not "language" -- not that it's relevant as a response to your post, though.)
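
For the "abstract vectors, not language" point, a toy sketch of the embedding step (vocabulary and sizes invented for illustration): text is mapped to integer ids and then to rows of an embedding matrix, and everything downstream operates only on those vectors.

    import numpy as np

    vocab = {"ride": 0, "a": 1, "bicycle": 2}  # toy vocabulary, made up
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(len(vocab), 8))     # one 8-d vector per token id

    ids = [vocab[w] for w in ["ride", "a", "bicycle"]]
    x = emb[ids]       # shape (3, 8): what the transformer layers actually see
    print(x.shape)     # the model only manipulates these vectors; words
                       # reappear only at the final decoding step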


Actually, I think you're on to something: abstract vectors are the tokens of thought, "mentalese" if you've read any Dennett.



