
I first read this as an April Fools' joke, and honestly it got me good that it wasn't.

As someone who works in AI and multi-modal models: our modern "AI" systems are just tools. Yes, models can be designed by other models, and yes, some of these models (BERT and co.) have been getting more general in recent years. But saying we are close to AGI is like saying you are close to a moon landing when you've only just started jumping - it's ridiculous.

We'd all be better off if we spent less time hyping it up and theorising about what it could mean for human society. And yes, I do understand the paperclip argument, but I don't buy it.

Sure, transformer models may not be intelligence in the way people often claim, but ML has undeniably been conquering the sensory modalities one after another. And the human brain itself is mostly devoted to sensory processing. It is conceivable that the remaining piece of the puzzle of intelligence is not far out of reach. While intelligence may seem undefined and elusive, it will probably turn out to be less complex than we think (if we are to take a hint from biology).
