
I don’t think GPT-4 is AGI, but that definition seems foolish. An AGI doesn’t need to be hyperproficient at everything, or even anything. Ask a team of engineers from any non-aeronautical field to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?


Outperforming humans does not mean outperforming an average untrained human


Why does AGI imply outperforming any humans? Half the humans on the planet perform worse than the average human. Does that make them non-intelligent?


That's the definition given by Sam Altman above


> Ask a team of any non-aeronautical engineers to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?

I suspect you'd have at least one person on the team who would say "perhaps you'd be better off choosing a team that knows what it's doing"

Meanwhile, GPT-4 would happily accept the task and emit BS.


Have you used GPT-4? I'd criticize it in the opposite direction. It routinely defers to experts on even the simplest questions. If you ask it to tell you how to launch a satellite into orbit, it leads with:

>Launching a satellite into orbit is a complex and challenging process that requires extensive knowledge in aerospace engineering, physics, and regulatory compliance. It's a task typically undertaken by governments or large corporations due to the technical and financial resources required. However, I can give you a high-level overview of the steps involved:



