
> If a corporation can't re-structure itself into being much smarter than the smartest human, its intelligence is fundamentally limited, and therefore so is the risk.

The assumption here is that risk correlates with intelligence. That doesn't seem to be borne out by history. Risk (the likelihood of bad outcomes) can be emergent, arising from well-meaning, reasonably (but not super-) intelligent people operating within a simple framework.



