
Foom specifically refers to a very fast takeoff to ASI, usually over hours to days, or at the outside weeks. It's a term of art, and a well-defined one.

https://www.lesswrong.com/posts/LF3DDZ67knxuyadbm/contra-yud...

It does not refer to a fast takeoff where an AGI self-improves to ASI over multiple years.

It's also not referring to the AGI threshold. If foom happens, it's probably unambiguous: there is now an entity that is far more intelligent than humans.

I think the foom scenarios are fairly unrealistic, for thermodynamic reasons. But I think it’s perfectly plausible that an ASI could persuade the company that built it to keep it secret while it acquired more resources and wealth over the course of years.
