
We are incredibly far away from AGI, and we'll only get there with wetware.

LLMs and GenAI are clever parlor tricks compared to the science needed for AGI to actually arrive.




What makes you so confident that your own mind isn't a "clever parlor trick"?

Considering that it required no scientific understanding at all, just random chance, a very simple selection mechanism, and enough iterations (I'm talking about evolution)?
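
To make that concrete with a toy sketch (purely illustrative, roughly Dawkins' "weasel" example; the target string, mutation rate, and population size are arbitrary choices): random variation plus a very simple keep-the-best selection rule, iterated enough times, reliably climbs toward a target with no understanding involved.

    import random

    TARGET = "methinks it is like a weasel"   # arbitrary illustrative target
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def fitness(candidate):
        # Count positions that already match the target.
        return sum(c == t for c, t in zip(candidate, TARGET))

    def mutate(candidate, rate=0.02):
        # "Random chance": each character occasionally flips to a random one.
        return "".join(
            random.choice(ALPHABET) if random.random() < rate else c
            for c in candidate
        )

    # Start from pure noise.
    best = "".join(random.choice(ALPHABET) for _ in TARGET)
    generation = 0

    while fitness(best) < len(TARGET):
        # "Very simple selection mechanism": keep whichever of the parent's
        # mutated offspring (or the parent itself) scores highest.
        offspring = [mutate(best) for _ in range(200)] + [best]
        best = max(offspring, key=fitness)
        generation += 1

    print(f"matched the target after {generation} generations")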


My layperson impression is that biological brains do online retraining in real time, which is not done with the current crop of models. Given that even this much required months of GPU time, I'm not optimistic we'll match the functionality (let alone the end result) anytime soon.
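
For what it's worth, "online retraining" usually means something like the toy loop below: parameters get nudged by every new observation as it arrives, rather than being frozen after one big offline run. This is just a minimal sketch with a linear model and made-up dimensions, not how any production LLM is trained.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical streaming setup: a tiny linear model whose weights are
    # updated by each incoming example. Dimensions and learning rate are
    # arbitrary.
    dim, lr = 8, 0.01
    weights = np.zeros(dim)
    true_weights = rng.normal(size=dim)   # the drifting "world" being tracked

    for step in range(10_000):
        x = rng.normal(size=dim)            # a new observation arrives
        y = true_weights @ x                # ground-truth signal
        error = weights @ x - y
        weights -= lr * error * x           # immediate per-sample update

        if step % 1000 == 0:
            # Slowly change the environment so the model has to keep adapting.
            true_weights += 0.05 * rng.normal(size=dim)
            print(f"step {step:5d}  squared error {error**2:.4f}")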


I'm actually playing with this idea: I've created a model from scratch and have it running occasionally on my Discord. https://ftp.bytebreeze.dev is where I throw up models and code. I'll be releasing more soon.


Trillions of random chances over the course of billions of years.


Why do you think we'll only get there with wetware? I guess you're in the "consciousness is uniquely biological" camp?

It's my belief that we're not special; we humans are just meat bags, and our brains just perform incredibly complex functions with incredibly complex behaviours and parameters.

Of course we can replicate what our brains do in silicon (or whatever we've moved to at the time). Humans aren't special; there's no magic human juice in our brains, just a currently inconceivable blob of prewired, evolved logic and a blank (some might say plastic) space to be filled with stuff we learn from our environs.



