
> We don't know if full AGI can be built using just current technology (like transformers) given enough scale,

We absolutely do, and the answer is such a resounding no that it's not even funny.




Actually, we really don't. When GPT-3.5 was released, it was a massive surprise to many, precisely because they didn't believe that simply scaling up transformers would end up with something like that.

Now, using transformers doesn't mean they have to be assembled like LLMs. There are other ways to stitch them together to solve a lot of other problems.
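For instance (a minimal sketch, assuming PyTorch; the model and sizes are just illustrative): the same transformer blocks can be wired as an encoder feeding a classification head instead of an autoregressive next-token predictor.

  # Illustrative only: transformer layers reused as an encoder + classifier,
  # rather than a decoder-only language model.
  import torch
  import torch.nn as nn

  class TinyClassifier(nn.Module):
      def __init__(self, vocab=10000, d_model=128, n_classes=2):
          super().__init__()
          self.embed = nn.Embedding(vocab, d_model)
          layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
          self.encoder = nn.TransformerEncoder(layer, num_layers=2)
          self.head = nn.Linear(d_model, n_classes)   # pooled features -> class logits

      def forward(self, tokens):                       # tokens: (batch, seq_len)
          h = self.encoder(self.embed(tokens))         # (batch, seq_len, d_model)
          return self.head(h.mean(dim=1))              # mean-pool, no generation loop

  logits = TinyClassifier()(torch.randint(0, 10000, (4, 16)))

Same attention machinery, completely different objective and I/O shape.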

We may very well already have the basic kinds of Lego pieces needed to build AGI. We won't know until we try to build all of the brain's capacities into a model on the order of a few hundred trillion parameters.

And if we actually lack some types of pieces, they may even be available by then.



