
Sub-par intelligence crashes some cars, causes some wars. Superintelligence turns our whole forward light-cone into paperclips; it's a bit worse.



It doesn’t need to be super to do that; even bacteria are a kind of von Neumann machine that can turn us into more of themselves.

An ASI could convince you that turning yourself into a paperclip is a fun and exciting new opportunity to liberate you from $PERSONAL_PAIN and finally free you to engage in $PERSONAL_FANTASY.


Bacteria are self-limiting. Their offspring compete with each other, they don't reshape their environment to be friendlier to their own existence, and they don't plan for space colonization either. They're a toy model of a maximizer, not the real threat of an intelligent, self-modifying one.


The only argument that seems to be brought forward every time is "if a superintelligent being wants to do stuff, we can't do anything about it because the axiom is that superintelligence means omnipotence". This is not so different from any other religious argument since it's impossible to falsify and therefore meaningless in any scientific sense.


Superintelligence doesn't mean omnipotence. The thing is bound to physics. It's just smarter than you, to some greater or lesser (usually imagined as greater) degree.


That’s the point. They don’t need to be anything like as potent as an AGI or ASI to be an existential threat.



