
> It's like most new technologies. In the beginning there are only a few instances that really stand out, and many with issues.

Except this isn't new. This is where we are after throwing massive amounts of resources at it, multiple decades after its arrival.




What are you taking "it" to be here?

The transformer architecture on which (I think) all recent LLMs are based dates from 2017. That's only "multiple decades after arrival" if you count 0.6 of a decade as "multiple".

Neural networks are a lot older than that, of course, but to me "these things are made out of neural networks, and neural networks have been around for ages" feels like "these things are made out of steel, and steel has been around for ages".




