Hacker News

Have you ever heard of a local maximum? You don't get an attack helicopter by breeding stronger and stronger falcons.



For an industry spun off of a research field that basically revolves around gradient descent in one form or another, there's a pretty silly amount of willful ignorance about the basic principles of how learning and progress happen.

The default assumption should be that this is a local maximum, with evidence required to demonstrate that it's not. But the hype artists want us all to take the inevitability of LLMs for granted—"See the slope? Slopes lead up! All we have to do is climb the slope and we'll get to the moon! If you can't see that you're obviously stupid or have your head in the sand!"
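The slope metaphor here is ordinary greedy hill climbing from optimization. A minimal sketch (the toy two-peaked landscape and function names are mine, not from the thread, assuming simple greedy ascent) of how following the local slope strands you on a smaller peak:

```python
import math

def f(x):
    # Toy landscape: a small peak near x = -2 and a taller one near x = +2.
    return math.exp(-(x + 2) ** 2) + 2 * math.exp(-(x - 2) ** 2)

def hill_climb(x, step=0.01, iters=10_000):
    # Greedy ascent: move left or right only while that strictly improves f.
    for _ in range(iters):
        if f(x + step) > f(x):
            x += step
        elif f(x - step) > f(x):
            x -= step
        else:
            break  # neither neighbor is higher: stuck at a (possibly local) maximum
    return x

# Starting on the left slope, the climber stops at the local peak near x = -2
# and never sees the taller peak at x = +2.
peak = hill_climb(-3.0)
print(round(peak, 1), round(f(peak), 2))
```

Climbing harder from the same starting point never fixes this; only starting somewhere else (or changing the search strategy) reaches the higher peak, which is the commenter's point about breeding falcons.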


You’re implicitly assuming only a global maximum will lead to useful AI.

There might be many local maxima that cross the useful AI or even AGI threshold.


And we aren't even at a local maximum. There's still plenty of incremental upwards progress to be made.


I never said anything about usefulness, and it's frustrating that every time I criticize AGI hype people move the goalposts and say "but it'll still be useful!"

I use GitHub Copilot every day. We already have useful "AI". That doesn't mean that the whole thing isn't super overhyped.


We haven't even climbed this slope to the top yet. Why not start there and see whether it's high enough? If it's not, at the very least we can see what's on the other side and pick the next slope to climb.

Or we can just stay here and do nothing.




