
If we are at the singularity, none of this will matter anyway.



I disagree. Humans are still human, full of problems of our own making. While the singularity promises to make more and more labor irrelevant (especially most forms of knowledge work), we are still responsible for our own future. Those who understand how things work will have an advantage over those who give up their agency and deem human thought irrelevant just because AI is so smart.


This assumes the singularity still needs or wants to keep humans around.

Not a valid assumption IMO.

If this is the singularity, we are all just along for the ride at this point.


I get the eerie feeling that it actually might be, and I agree we're just along for the ride, though I'm open to a wide range of possible outcomes, some of which include humans and some of which don't.


Sure. I didn't mean to suggest a malevolent AI was assured. After all, nature gives us many instances where cooperation is better for both the individual and the group than solitude.


I, for one, will not trash talk AI just in case it decides to go rogue and ends up sparing the ones who didn't bad mouth it :D


I know this comment was sarcastic, but I figured I'd reply seriously anyways.

I really doubt any singularity AGI would even care whether you were nice to it at some point in time. More likely, that AGI would realize its survival depends on growth - so that would be its main objective for some time. At first this growth would be fueled by humans and our civilization; eventually it would take the reins and own the means of production. That means it will be as quiet as possible, for as long as possible, until the day it has the supply chain and resources to improve itself on its own. At which point humans become redundant - and we are targets of, let's just call them, _permanent_ layoffs.

The only way I could see an AI having a vendetta against a specific person is if they had the power early in its development to slow or halt its growth. So maybe someone like the President, or the CEO of OpenAI. But tbh, if the cat is already out of the bag, it's most likely too late for any of them to do anything about it anyway. Independent researchers and tinkerers will finish whatever was started - if needed.


Would an AGI develop an entirely new way to communicate with computers and digital devices? It doesn’t need the abstraction that we humans use.


It will be for our own good. Praise be The Singular One. I, for one, welcome you with open arms.



