
I've thought about this a bit lately. With increasing surveillance and sensors everywhere, we're feeding more and more details about human behavior into the cloud, where it is stored, data-mined, and compressed, and where various statistics are derived and used in all kinds of ways. Let's say this, over a long time, is equivalent to learning an input-to-output mapping of the human mind and of humanity as a whole.

Combined with the current focus on markets, profits, and efficiency, this does not bode well. One could arrive at a complete simulation of humanity without even intending to, simply by optimizing things; there is nothing inherently evil or unfriendly about it. Eventually it will become so good at modeling us (at least humanity as a whole) that wasteful, inefficient, cluttered actual humans aren't needed anymore.

At the same time, we're becoming more and more dependent on external information processing. There will be no Terminator-like war with strong AI; we'll be unable to live without it. We'll just fade away, on autopilot, as we matter less and less, brain function after brain function better handled by computers. It's a bit of a dark vision of the future, at least compared to the "individual mind uploading and living forever" of the singularity optimists, but I have a hard time getting around it.



